Caveat 2.10.21
Ep 64 | 2.10.21

Covid's effects on medical privacy.

Transcript

Jenna Waters: I definitely think that the entire security posture of the health care industry has been really damaged by COVID-19.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at a tool that can help determine if your image is part of a facial recognition library. I've got the story of law enforcement dodging public records rules through the use of encrypted messaging apps. And later in the show, my conversation with Jenna Waters from True Digital Security. We'll be looking back at the last year of COVID and how that's affected privacy, particularly in the medical field. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we've got some good stories this week. Why don't you start things off for us? 

Ben Yelin: So mine comes from what the former president would call the failing New York Times. 

Dave Bittner: (Laughter). 

Ben Yelin: But I think they write some great articles, and this is one of them from their technology section... 

Dave Bittner: Right. 

Ben Yelin: ...Written by Kashmir Hill and Cade Metz. This is about a new online tool called Exposing.ai that lets individuals search many image collections that companies use to build up their artificial intelligence systems. This tool matches images from Flickr, which is an online photo-sharing service that most of us haven't heard about or used for the past 15 years. 

Dave Bittner: (Laughter). 

Ben Yelin: But it offers us a window into how much data is going into these artificial intelligence technologies. So this tool allows you, either by posting the URL to a Flickr photo or the metadata or a username or a tag, to see whether your photos make up part of the database that feeds into the machine learning of AI. And it's really illuminating. 
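For the technically curious, a lookup like the one Exposing.ai offers can be pictured as matching the identifiers you supply against an index of records scraped into a training set. Here is a minimal Python sketch of that idea; the index file, field names, and matching logic are hypothetical illustrations, not Exposing.ai's actual implementation.

```python
# Hypothetical sketch: checking whether your Flickr photos appear in a
# scraped face-recognition training set. The index file and field names
# are illustrative; this is not Exposing.ai's actual implementation.
import json

def load_dataset_index(path):
    """Load records scraped into the training set, e.g.
    [{"flickr_url": "...", "username": "...", "tags": ["strobist", ...]}]."""
    with open(path) as f:
        return json.load(f)

def find_matches(index, *, url=None, username=None, tag=None):
    """Return records matching any of the supplied identifiers."""
    hits = []
    for record in index:
        if url and record.get("flickr_url") == url:
            hits.append(record)
        elif username and record.get("username") == username:
            hits.append(record)
        elif tag and tag in record.get("tags", []):
            hits.append(record)
    return hits

index = load_dataset_index("dataset_index.json")  # hypothetical file
for hit in find_matches(index, tag="strobist"):
    print(hit["flickr_url"])
```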

Ben Yelin: Now, personally, I don't have any photos on Flickr. I did not make use of that service. I felt like that was something that younger kids did when I got older and older people did when I was younger. 

Dave Bittner: (Laughter). 

Ben Yelin: I was never in the proper age range to be a Flickr user. So it is limited to that photo-sharing service. 

Dave Bittner: Yeah. 

Ben Yelin: But it is very illuminating. These databases that help AI learn how to operate consist of millions of photos, including photos of ourselves. 

Ben Yelin: Now, the reason this is potentially consequential is some of these AI systems are being used by some of the worst actors in the world. Most notably, the Chinese government is using it against the Uighurs, perpetuating what has essentially become a genocide in China. 

Ben Yelin: So I think it's kind of a wake-up call for people to see that some of the information that goes into making up these systems - systems that can potentially have very detrimental effects on human rights - is coming from our photos, without our consent and often without our knowledge. And even though this particular database is only using Flickr photos, we know that in building up artificial intelligence systems, boatloads of data go in so that, you know, the AI system can engage in machine learning - can learn how to identify patterns based on our photos, based on our internet history, based on Wikipedia entries, et cetera. 

Ben Yelin: So this is just one of those articles that I think should open people's eyes that we're all contributing to this universe that's creating these artificial intelligence systems that potentially are having very detrimental effects on humanity, frankly. 

Dave Bittner: Yeah. This is fascinating. You know, there was a time 15 years ago, I guess, when I had a Flickr account and made use of it. It's been a while. But a colleague of mine - his name is David Hobby - runs a website called Strobist, which, if you're a photographer, you've probably heard of. It's one of the most popular photography-teaching sites around. So I did a quick hashtag search for the term Strobist on this site, and sure enough, up popped a picture of my friend David's son. 

Ben Yelin: Wow. 

Dave Bittner: Yeah, yeah. I mean (laughter)... 

Ben Yelin: So now you have some personal connection on this database. 

Dave Bittner: Yeah, among other - I mean, there's a lot of other photos that are in here because there's a - you know, #Strobist is probably a pretty common tag from back in the day. But interesting that of the photos that came up here, one of the top ones was, again, a picture of my friend's son here, which is - well, that kind of brings it home, doesn't it? 

Ben Yelin: Yeah, it makes it more real. You know, one thing that's also disturbing about this is a lot of researchers have built datasets for AI for the purposes of doing research. So they talk about this dataset called MegaFace, which was built by scientists at the University of Washington several years ago. They built it to help develop AI. They collected millions of photos without the consent of the people whose images they collected. And they made it an open-source tool so other people could use it as a research resource. 

Ben Yelin: It was downloaded 6,000 times, this article says, by companies and government agencies around the world, including entities like Northrop Grumman, one of our biggest defense contractors, and TikTok, which is a Chinese social media application. 

Ben Yelin: So even people who are developing AI systems for research purposes, or who think their work is going to have a limited reach, are still collecting this relatively unlimited universe of data points - data points that have been obtained without the consent of the users and that have been downloaded by powerful people and powerful institutions. Because this dataset was downloaded so many times, the researchers at the University of Washington saw the detrimental effect it was having, and they actually took it offline. 

Ben Yelin: You know, I just think it goes to show you that everything we put on the internet goes out into the ether, and you never know how it's going to be used. It can be used against us when we're talking about criminal investigations, et cetera. It can be used to augment our experience - say, with personalized advertisements. 

Ben Yelin: But everything we put on the internet also can be used for something like building up an artificial intelligence system. And we don't think about that when we post our photos to Flickr. I just think it's really good to help build awareness that everything that goes on the internet feeds into this ecosystem. It's not something that we really consider with much regularity. 

Dave Bittner: Yeah. You used the phrase that they're using these images without the people's consent - the people who are in these images. Do they need their consent? If these are posted publicly and available for scraping, do they have to ask? Should they have to ask? 

Ben Yelin: No. So in limited circumstances, depending on what state you're in - I think Illinois is the only state that requires consent for this type of photo-scraping. But in most circumstances, they don't have to ask. It's not required. You've posted it publicly. And once it's on the internet, it is certainly available for scraping. So there's definitely no legal obligation to obtain the consent of all of the users. And it would be impossible to build these vast AI systems and to augment machine learning if we had to obtain consent from the millions of people whose photos were scraped from the internet. So you can understand why we aren't requiring consent. I think that would inhibit the innovation of these systems. 

Ben Yelin: But, you know, on the other hand, it does mean that people need to be more careful, frankly, about what they're posting on the internet, or at least more mindful that things that can seem harmless, like your engagement photos, are potentially being used for something that you never would've anticipated when you posted them. 

Dave Bittner: Yeah, I think that's the part that can be so disturbing, which is just sort of out of the blue, you find that (laughter) something you just put out there for one reason is being used for another, and a reason, perhaps, you had never considered. 

Ben Yelin: Yeah. Yeah. I think it should make us just kind of more mindful of everything we post. I mean, maybe after reading this article, people will think twice. Like, what could what I'm about to post be possibly used for? Whether it's a social media post, whether it's a photo, it is going to be part, probably, of some machine-learning process just based on the vastness of the internet. 

Dave Bittner: Yeah. 

Ben Yelin: And so, you know, even if we still decide to post something, this article - or not just the article, the database itself - will at least force me to think twice about it. So maybe that's something that's useful. 

Dave Bittner: I wonder what the feasibility would be of having an opt-out database for these sorts of things - you know, a publicly facing sort of thing where, for example, if I didn't want my image used for these, I could, ironically, post an image of myself. 

Ben Yelin: Yeah. Unlearn, unlearn. Unlearn my face. 

Dave Bittner: Well, it would require posting an image of myself, which, of course, there's the irony there. 

Ben Yelin: Yes. 

Dave Bittner: But (laughter) - but the systems could routinely check against the opt-out database and say, hey, if you come across this guy, he's out. You know, you do not have permission, you know, or please - I don't know. You get where I'm going with this. 

Ben Yelin: I do. I do. It is funny that you'd have to upload your own photo. It kind of reminds me of the old "Simpsons" joke about the wallet inspector. Like, how did I fall for that, you know? 

Dave Bittner: (Laughter). 

Ben Yelin: If someone was saying they're building an opt-out database, but really, they just wanted to collect your photos... 

Dave Bittner: Right, right. 

Ben Yelin: ...Then that might be a very effective way of doing so, saying... 

Dave Bittner: Yeah. 

Ben Yelin: ...We can protect your privacy if you upload your photo, also your Social Security number and mother's maiden name. 

Dave Bittner: Right (laughter). Yes, absolutely. All right. Well, it's an interesting story, again, from The New York Times. We'll have a link to it in the show notes. 
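As a side note on the opt-out registry Dave imagines above, one plausible design is a shared set of perceptual image hashes that scrapers would check before ingesting a photo. The sketch below uses the real Pillow and imagehash libraries, but the registry itself and all file names here are hypothetical; no such system exists today.

```python
# Sketch of a hypothetical opt-out registry: scrapers hash each candidate
# image and skip any whose perceptual hash is close to a registered one.
# The registry is imaginary; Pillow and imagehash are real libraries.
from PIL import Image
import imagehash

# Registry of perceptual hashes for faces whose owners opted out.
OPT_OUT_REGISTRY = {
    imagehash.phash(Image.open("opted_out_selfie.jpg")),  # hypothetical entry
}

def may_scrape(image_path, registry, max_distance=4):
    """Return False if the image is close to any opted-out hash."""
    candidate = imagehash.phash(Image.open(image_path))
    # Compare by Hamming distance, not equality, so small crops or
    # re-encodings of an opted-out photo still match.
    return all(candidate - registered > max_distance for registered in registry)

if may_scrape("scraped_photo.jpg", OPT_OUT_REGISTRY):
    print("not opted out; eligible for the training set")
else:
    print("opted out; skip this image")
```

Perceptual hashing tolerates minor edits, which is why the sketch compares distances rather than exact values; a real registry would also need some way to verify that the person uploading a photo actually owns the face in it.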

Dave Bittner: My story this week comes from the folks over at Techdirt, and hat tip to Jason and Brian over at "Grumpy Old Geeks," who brought this to my attention. This is a story about Michigan State Police officials who've been dodging public records obligations by using encrypted messaging apps. It's an article by Tim Cushing, again, over at Techdirt. 

Dave Bittner: It turns out that some folks who had been making public records requests regarding some law enforcement officers - some state police officers in Michigan - noticed that, when they got back the material they had requested, there were way fewer exchanges about a particular topic than one would have expected. 

Dave Bittner: And it turns out that these officers had been using the Signal app, the, you know, encrypted messaging app - very, very popular. And one of the functions of Signal - not only is it end-to-end encrypted for security, but it also has functionality where messages can auto-delete. After a certain amount of time, they just disappear, and they're gone forever. 
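Conceptually, disappearing messages are a simple mechanism: each message carries an expiry time, and the client purges anything past it. This is a minimal sketch of that idea only, not Signal's actual code or protocol.

```python
# Minimal sketch of auto-deleting ("disappearing") messages: each message
# stores an expiry window, and a purge pass removes anything expired.
# An illustration of the concept, not Signal's implementation.
import time
from dataclasses import dataclass, field

@dataclass
class Message:
    body: str
    sent_at: float = field(default_factory=time.time)
    ttl_seconds: float = 7 * 24 * 3600  # e.g. disappear after one week

    @property
    def expired(self) -> bool:
        return time.time() >= self.sent_at + self.ttl_seconds

def purge_expired(inbox):
    """Drop expired messages; once purged, they are unrecoverable."""
    return [m for m in inbox if not m.expired]

inbox = [Message("meet at 3"), Message("old note", sent_at=0.0)]
inbox = purge_expired(inbox)  # "old note" is gone for good
print([m.body for m in inbox])
```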

Dave Bittner: And evidently, the police officers were using this app both to keep their interactions out of the public eye and to have them disappear in this sort of way. And that runs up against the obligations that they have as public officials to retain these records - there are quite often public records retention rules, and this bumps up against them. What do you think about this, Ben? 

Ben Yelin: This is a great story. It sort of reminds me of the public officials who would hand down decrees on COVID-related restrictions - so, you know, limited indoor dining or, you know, other restrictions - and then they'd be caught at fancy restaurants. You know, so the rules... 

Dave Bittner: (Laughter). 

Ben Yelin: The rules don't apply to them. It is funny that there is such a concerted effort among law enforcement at all levels of government - state, federal - to have a backdoor to get into these encrypted messaging applications for law enforcement purposes. 

Ben Yelin: Now, to be fair, there are a lot of things that, for good reason, we allow law enforcement to do but we don't allow the general public to do. You know, if law enforcement tackles a criminal suspect who's trying to escape, they're not going to be arrested for, you know, assault or battery. And I think that's reasonable. 

Ben Yelin: But it is sort of interesting that when they have an opportunity to take advantage of encrypted messaging applications, they are doing so, despite their public posture, which is that these applications are dangerous. They're going to inhibit the work of law enforcement. And it's actually going to have a very substantial effect on the case at issue here. 

Ben Yelin: Somebody who used to work for the Michigan State Police - an inspector by the name of Michael Hahn (ph) - sued the department, basically saying that he was unlawfully terminated. And he went through the discovery process to obtain some text messages that would be used as evidence in his trial, and that evidence isn't available. And, you know, that's going to have a very detrimental impact on his legal case. Potentially, it could be the deciding factor in his legal case. 

Ben Yelin: And once, you know, you've used an encrypted application, especially one where messages are automatically deleted, then you can't issue an injunction to force those messages to come back from, you know... 

Dave Bittner: Right. 

Ben Yelin: ...The alternate universe where they reside. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: So he's pretty much just screwed in this situation. 

Ben Yelin: I think the lesson here is there has to be a forced policy change at some of these police departments where, you know, in order to comply with record-keeping laws, like state Freedom of Information Acts, there have to be limits on the use of encrypted applications. I don't think you have to ban them because I think Signal could be useful for law enforcement purposes, even if you're not trying to conceal the trash talk about your colleague. 

Dave Bittner: Right. 

Ben Yelin: I think there needs to be some policies about when it can be used, how it can be used and how to use it while still complying with records retention laws. 

Dave Bittner: How is this different from, say, a police officer - instead of using the public radio channels, you know, someone could listen in on a police scanner. Sometimes you'll hear a police officer say to another police officer, hey, you know, call me on my cellphone - that sort of thing. So we're going to take this conversation out of official channels, and we're going to use our private communications devices to continue this. Is that OK (laughter)? Do you have an issue with that? Or how is this different from that? 

Ben Yelin: No. I mean, I actually think that's a very apt parallel. It is generally OK under most circumstances - I should say it's not necessarily OK, but it usually is not something that leads to the punishment of individual police officers. It might go against department policy, and in many cases, it does. But these policies probably are not strictly enforced. And you'd never find out about two officers using private channels unless it was brought up in litigation... 

Dave Bittner: Right. 

Ben Yelin: ...Which it almost never would be. 

Ben Yelin: You know, what I think is unique about messaging applications is just the fact that you're using an encrypted application. And I guess this is true for your phone call example as well. It means that at least it would seem to a potential litigant on the other side that you were trying to conceal some information because, you know, if you were talking about legitimate police business or if you were talking about something that was completely aboveboard, you would just use Apple's messaging service or whatever. 

Dave Bittner: Right. 

Ben Yelin: So I think it can kind of create that suspicion. So that, you know, kind of makes it a double-edged sword. But I think your metaphor is apt. I think it is like that. I think most times when law enforcement chooses not to use official channels, nobody really finds out about it, and there isn't really a lot of accountability. 

Dave Bittner: Yeah. And I guess it's one thing if, you know, one officer's connecting with the other to decide, you know, where they're going to have lunch (laughter), versus, you know, talking about official business, or how are we going to handle this, you know, this interrogation or something like that, right? 

Ben Yelin: Right. Right, exactly. Now, how to handle an interrogation, or how to conceal messages from the public that are critical for law enforcement purposes - that potentially would be a legitimate use of an end-to-end encrypted application. And what the spokesperson for the police department at issue here said is, well, we do have policies; these end-to-end encrypted applications can be downloaded for legitimate state business. But when Techdirt asked a follow-up question - to provide examples of what those legitimate purposes would be - they did not respond. 

Ben Yelin: So I think transparency is important here. If you're going to permit end-to-end encrypted applications to be used by law enforcement, there at least has to be some sort of publicly listed policy on when and how they can be used, and how to use them while still complying with records retention rules. 

Ben Yelin: You know, and people take records retention laws very seriously. I remember a prominent politician having a whole scandal about her emails based on the fact that she was potentially violating records retention laws. 

Dave Bittner: Right, right (laughter). 

Ben Yelin: So, you know, it's not something that we can just scoff at and say, you know, Freedom of Information Act - how many cases are actually going to turn on that? I think these are things that are very important, and you can't just kind of have an ad hoc policy on it. 

Dave Bittner: All right, well, again, an interesting article there. It's over on Techdirt, and we'll have a link to that in the show notes. 

Dave Bittner: We would love to hear from you. If you have a question for us, you can send it in. Our email is caveat@thecyberwire.com. You can also call and leave us a message. It's 410-618-3720. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Jenna Waters. She's from an organization called True Digital Security. And she and I looked back on the last year of COVID and how it's affected privacy, particularly on the medical side of things. Here's my conversation with Jenna Waters. 

Jenna Waters: So, unfortunately, I can't say that we're doing all that well, which really isn't surprising. Obviously, this is what I like to call the battle of the goals, essentially. A hospital or a clinic or a, you know, health insurer's primary goal in life is to serve the patients or serve their customers. And that, oftentimes, especially in hospitals, conflicts with cybersecurity. It conflicts with privacy protections. It conflicts with compliance. 

Jenna Waters: Now, HIPAA does a great job with hospitals and health insurers and clearinghouses, helping hold them accountable. But where we find ourselves now is that HIPAA kind of predates the current health care evolution into our very digital, very mobile age by - I think we're going on 25 to 30 years. I can't really do the full math in my head right now. 

Jenna Waters: But we're seeing an uptick in ransomware. We're seeing some very serious attacks happening in the telehealth systems. And then we're also seeing this tension playing out with track-and-trace apps: should we protect people's privacy, or should we track COVID-19? It's about really striking that balance, and we're having a hard time doing it. And I definitely think that the entire security posture of the health care industry has been really damaged by COVID-19. It will take years for them to recover. 

Dave Bittner: How so? What does that damage look like? 

Jenna Waters: In terms of what it looks like now, for context - and this is probably the easiest way to put it - some numbers that are really relatable. We're three weeks into 2021, and already, we have about two reports per week to OCR, the HHS Office for Civil Rights. And those are just the breaches that have to be reported - ones affecting over 500 records. That doesn't count any hack or data breach of under 500 records, or incidents where the investigation hasn't been completed. 

Jenna Waters: So if we look at that trend and we look at last year's trends, where ransomware - particularly specific strains like Maze and Ryuk - resurfaced, we're really seeing a huge increase in the targeting of our health care industry. It's definitely kind of disturbing and scary, especially when, right now, you know, health care workers need all the support they can get because they don't really have time to worry about someone hacking their infrastructure. 

Jenna Waters: If you're found in violation or negligent in terms of a data breach, that can cost - I believe last year it was, like, fines up to $101.5 million. And that's on the compliance side. On the security side, the damage is about $400 per lost electronic health record, per person. 

Dave Bittner: Right. 

Jenna Waters: Yeah. 

Dave Bittner: And it adds up fast. 

Jenna Waters: And it adds up very quickly, especially for hospitals, especially right now, when people are struggling to find beds in some states. So they're definitely overrun, and they've also had to relax some enforcement and compliance rules in terms of HIPAA, which I understand. It's what the times call for. But I think we're definitely going to need to see an overhaul in how hospitals are approaching cybersecurity, but also in how the cybersecurity industry can help drive that change and enable them to structure their growth and maturity in a way that also enables them to treat patients. 
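To make the arithmetic concrete at Jenna's cited figure of roughly $400 in damage per lost electronic health record, here is the back-of-the-envelope math; the breach sizes are hypothetical examples.

```python
# Back-of-the-envelope breach cost at Jenna's cited figure of roughly
# $400 in damage per lost electronic health record.
COST_PER_RECORD = 400  # USD, per her estimate

for records_lost in (500, 10_000, 250_000):
    print(f"{records_lost:>8,} records -> ${records_lost * COST_PER_RECORD:,}")

# Output:
#      500 records -> $200,000
#   10,000 records -> $4,000,000
#  250,000 records -> $100,000,000
```

At that rate, a breach of even the 500-record reporting threshold costs six figures, and a quarter-million lost records approaches the nine-figure territory she mentions on the compliance side.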

Dave Bittner: Now, in your mind, how much of this is unique to the system we have here in the U.S.? Are there other nations who have health care systems that, because of the way they're structured, they are set up to do a better job than what we're able to do here? 

Jenna Waters: Well, I'm not entirely sure, since we tend to be sort of an outlier in the United States. I just simply say it's because our health care system is so complicated. If I look at it in terms of a global scale, I can't give you facts or figures at the moment in terms of that answer because I do... 

Dave Bittner: Right. 

Jenna Waters: ...Primarily focus all of my work within the United States. 

Dave Bittner: Yeah. 

Jenna Waters: But it seems like it is a global issue. It is definitely a global issue. I can really only speak for the U.S. Sorry. 

Dave Bittner: Yeah, that's fine. No, no, that's fine. That's fine. 

Dave Bittner: In terms of HIPAA itself, where are some of the areas that it's sort of, you know, straining under its own weight? What are some of the areas that you think need adjustment? 

Jenna Waters: Well, I mean, it doesn't do a very good job of tackling mobile health records - so mobile devices, IoT devices. It doesn't do a very good job of expanding its breadth of compliance criteria. The way HIPAA works is you have a certain type of data, ePHI - electronic protected health information - which I'm sure you know. It covers a specific type of information. And HIPAA only covers certain health care entities - covered entities. So that's your hospitals, your clinics, your doctors' offices, you know, that kind of thing. It was expanded to include business associates. 

Jenna Waters: However, what we define as health data is very, very specific. And what we define as a covered entity and a business associate, again, is very specific. So a good example is the track-and-trace apps that are being developed by Google and by Apple. I believe they're actually working together on these, which is a new collaboration. Good on them, I suppose. 

(LAUGHTER) 

Jenna Waters: It's questionable whether or not HIPAA covers these track-and-trace apps, even if you're reporting a diagnosis in these applications. And I believe some states are using them, while some states are using more of a web application. But I'm specifically talking about the mobile applications. 

Jenna Waters: These mobile applications - even if you report that you have COVID, that you were diagnosed with COVID, it can be called into question whether that's HIPAA data, specifically because they can argue that it's anonymized and localized to the device, that kind of thing - which poses its own risks, absolutely, to both privacy and security... 

Dave Bittner: Right. 

Jenna Waters: ...For the individual. 

Jenna Waters: But then you also have the fact that public health departments are not considered by HIPAA to be covered entities. And trackers like Fitbits or wearables - those aren't covered, even though they are tracking your health data. So it definitely needs a good update in terms of which industries it covers. 

Jenna Waters: As a matter of fact, I would personally - and professionally - think that instead of defining one specific industry or who is or is not a covered entity, HIPAA should just be industry agnostic. I think it should cover any company or hospital or entity that is collecting health data about the individual, because then we can more easily standardize how we look at health data and how we protect the privacy of patients. 

Dave Bittner: You know, I'm thinking about something like Facebook that tracks so many things and uses it to target users with advertising and, of course, makes money off of it. I mean, I think there's a lot of misunderstanding out there. You know, if Facebook, for example, makes note that someone has diabetes or that they're in a wheelchair or that they have dandruff - these are all potentially tidbits of information that have medical implications for someone. But passing that information around to give me ads for all those things - am I right in thinking that that's not something that would be covered by HIPAA? 

Jenna Waters: So part of it is - but you're correct, a lot of it is not covered by HIPAA. What is interesting is, back in 2016, we passed the 21st Century Cures Act, which amended HIPAA - the Privacy Rule of HIPAA. It allows HIPAA-covered entities to disclose ePHI for research related to the quality, safety and effectiveness of products and activities. But that's really only specific to those that are regulated by the FDA, and I don't think Facebook is regulated by the FDA (laughter). 

Dave Bittner: Right. 

Jenna Waters: So, no, none of that is covered, especially because I think the argument would be that the individual is, quote-unquote, "consenting to that data collection by use" - it's implicit consent rather than explicit consent. 

Dave Bittner: Right, right. The good old EULA, right? 

Jenna Waters: Yes (laughter). 

Dave Bittner: That anything I say or share here on this platform may be used to advertise to me, I suppose, right? 

Jenna Waters: Exactly. That's exactly it. 

Dave Bittner: Yeah. 

Jenna Waters: And it's not covered by HIPAA. 

Dave Bittner: Interesting. 

Jenna Waters: But I think there needs to be a threshold where it is covered. I think that we have to take our health data incredibly seriously. It's one of the only forms of data an individual has a legal, legislated right to privacy in. And I really think we need to be looking at expanding HIPAA to cover that across industries, as well as kind of shoring up what kinds of technology it does and doesn't cover. 

Dave Bittner: Is there any movement in that direction? I mean, where do things stand in terms of political will? 

Jenna Waters: That's a complicated question. I hope that it's going in the right direction. I mean, we've heard, especially recently, a lot of advocacy groups and politicians start talking about data privacy, which is, you know, the first step. 

Jenna Waters: And I think HIPAA is poised - particularly with health data and the gaps that have been discovered, or brought to light, by COVID-19 - to serve as sort of the launching base toward broader data privacy, again, because we don't have to pass whole new legislation; we just have to expand an already existing law. And I think there is at least social and cultural will for that to happen. Political will - I am starting to see it, which is hopeful. 

Jenna Waters: However, I think it has to also be noted that multiple political entities do take, you know, donations and do take money and depend upon these social media and data collection companies. And I think that that's going to be a factor in how far we can get in terms of protecting health data or protecting any kind of individual consumer, like, data privacy legislation. 

Jenna Waters: I just think HIPAA is a really good launching point for that conversation because it already exists. It's just continuing to expand the right of an individual to have their data be between them and their doctor, and not between them and Facebook. 

Dave Bittner: It doesn't sound controversial, and yet here we are, right? 

Jenna Waters: (Laughter) Somehow things that aren't supposed to be controversial end up being controversial. 

Dave Bittner: Yeah, yeah. 

Jenna Waters: Hopefully that changes (laughter). I definitely do want to touch on these track-and-trace apps I'm seeing, because there is this idea that they protect privacy and security in terms of how they work. So they use Bluetooth technology to kind of ping the person next to you and alert you if that person's been, like, near somebody else who's tested positive. 

Jenna Waters: And, again, without getting into the technical weeds - because I could talk all day about them - unfortunately, Bluetooth is notorious for being very noisy in terms of who and how it talks. It's the person in the room, like, that's bouncing from person to person to person during a party, talking to everybody, you know? 

Dave Bittner: (Laughter) Right. Right, right. 

Jenna Waters: That's Bluetooth. 

Dave Bittner: Right (laughter). 

Jenna Waters: Bluetooth is the person that has to make sure you know they're there. And that's every device. So I just want to urge people and companies, when they think about implementing these track-and-trace apps or using them for their employees or for themselves as individuals: take into consideration that Bluetooth isn't very secure - it's actually very wanting in that department - and that even though they say the data is localized on the phone, the phone itself is a giant tracking box (laughter). 

Dave Bittner: Right. 

Jenna Waters: Every piece of data that flows through your iPhone or your Android phone, your telecom provider knows about. 

Dave Bittner: Right. 

Jenna Waters: So just keep that in mind. I think that's again why, you know, legislation like HIPAA needs to be expanded - as well as the definition of what does and does not count as protected health data - to cover new technologies that are coming out, so that we can ensure new technology is meeting what people need. 
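For readers curious about the mechanism Jenna is describing: exposure-notification designs such as the Google/Apple framework have each phone broadcast short-lived random identifiers over Bluetooth, with matching done on the device after an infected user publishes their keys. The sketch below is a loose simplification of that rotating-identifier idea; the real framework specifies AES and HKDF key derivation per its published documents, whereas this illustration just uses HMAC, and all names here are hypothetical.

```python
# Simplified sketch of rotating Bluetooth proximity identifiers, loosely
# modeled on exposure-notification designs: a device derives short-lived
# IDs from a daily key, broadcasts them, and an infected user later
# publishes the daily key so others can re-derive and match the IDs.
# The real Google/Apple framework uses AES/HKDF; HMAC here is illustrative.
import hmac
import hashlib
import os

def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    """Derive the identifier broadcast during a given 10-minute interval."""
    msg = interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# Alice's phone broadcasts a fresh ID every interval (144 per day).
alice_daily_key = os.urandom(16)

# Bob's phone records the IDs it heard nearby.
heard = {rolling_identifier(alice_daily_key, 42)}  # Bob stood near Alice

# Alice tests positive and publishes her daily key; Bob re-derives her
# IDs locally and checks for overlap -- no central location database.
rederived = {rolling_identifier(alice_daily_key, i) for i in range(144)}
if heard & rederived:
    print("possible exposure: you were near a person who tested positive")
```

The privacy claim rests on the identifiers rotating and being meaningless without the published key; Jenna's caution is that the surrounding Bluetooth chatter, and the phone itself, still leak plenty.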

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: A couple of things. First of all, I can't believe we've been doing this for a year. 

Dave Bittner: (Laughter). 

Ben Yelin: It kind of hits you that we're going around in a second cycle of things being canceled. So, you know, last year, the first thing that was canceled was St. Patrick's Day parades, and now we're already hearing about this year's St. Patrick's Day parades being canceled. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: So that part is depressing. I thought it was a really interesting interview, just because COVID has presented a lot of novel issues as it relates to data privacy. Part of it is that one of the effective ways of containing public health threats is obtaining a lot of information that's potentially HIPAA protected. That's unique to public health emergencies. Usually, somebody else's medical problem isn't any of our business. But when we're talking about something as pervasive as COVID, it is our business, because we need a lot of information on who got the disease, who their close and immediate contacts are, what their demographic characteristics are. So I think it's just presented this unique scenario that we have not confronted before in an age where we've been concerned about data privacy. 

Ben Yelin: The other point she made that I thought was undercovered and sort of fascinating - when you asked about her optimism on whether we'd get federal data privacy legislation - was that a lot of financial contributors to politicians would benefit from not having federal data privacy legislation. I think that's something that's really important to remember. I don't think, in all of our conversations about this, we've really identified that as a factor. And I think it is a factor. So I'm glad that she brought it up. But it was a really interesting interview. 

Dave Bittner: Yeah. Well, again, our thanks to Jenna Waters for joining us. We do appreciate her taking the time for us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.