Caveat 4.15.20
Ep 24 | 4.15.20

Where there's a will, there's a loophole.

Transcript

David Holtzman: Consumers are really challenged with understanding where HIPAA protects their information and where it does not. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I share the details of New York's new data breach notification law and how that may affect businesses. Ben examines the Electronic Frontier Foundation's approach to evaluating government demands for new surveillance powers. And later in the show, my conversation with David Holtzman from CynergisTek. We'll be looking at how HIPAA privacy and security standards may have been impacted by the federal response to the COVID-19 pandemic. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few words from our sponsors at KnowBe4. You know, compliance isn't the same thing as security, right? How many times have we all heard that? It's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They are out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk. And trust us, friend, they're not pretty. So, again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question either. 

Dave Bittner: And we are back. Ben, I'm going to kick things off for us this week. I have an article here from The National Law Review, and it's titled "New York's New Data Breach Notification Law: What Businesses Should Know." New York has a new data breach notification law. It's called the SHIELD Act, which, of course, has to stand for something (laughter). 

Ben Yelin: It's an incredible acronym, by the way. I hope you share the full acronym here. 

Dave Bittner: Yes. The acronym is the Stop Hacks and Improve Electronic Data Security Act, which, of course, spells out SHIELD. 

Ben Yelin: Some intern is very, very proud of him or herself for coming up with that one. 

Dave Bittner: So this went into effect on March 21, and it has some changes to New York's data breach notification law. They already had one, but this expands some things. It expands the reach of what's covered. It expands the scope of the previous data breach laws. And it expands it to anyone who owns or licenses the private information of a New York resident. Private information is now defined as including biometric information and username or email addresses in combination with a password or security question and answer that would permit access to an online account. They've redefined the word breach. It's no longer confined to acquisition of information by an unauthorized party, and it now extends to cases where there are indications that the information was viewed, communicated with, used or altered by a person without valid authorization or by an unauthorized person. And then, finally, the law imposes data security requirements but doesn't prescribe specific safeguards for protecting information. Instead, the law describes what businesses must do to be deemed in compliance. This means that a business has to have a data security program with administrative, technical and physical safeguards, all of which are assessed against a standard of - I love this word - reasonableness (laughter). 

Ben Yelin: Yes, a lawyer's favorite vague term. 

Dave Bittner: (Laughter) Yeah, exactly. I can - I'm just warming up my truck to drive through the loopholes that that word will allow. 

Ben Yelin: Oh, yes. 

Dave Bittner: Interesting note here - a difference between the SHIELD Act and the California Consumer Privacy Act. The New York act does not create a right of private action. So let's go through this, Ben. 

Ben Yelin: Sure. 

Dave Bittner: What is your take here? What do we need to know about this? 

Ben Yelin: So a couple unique things here. The lack of a right of private action is very important. If you live in the state of California, under the CCPA, if a company mishandles your data or otherwise does not comply with that law, you can sue them directly. That is not the case for this New York law. The enforcement is done by the attorney general of the state of New York. So it is a civil enforcement action, but it's not something where it can be initiated by a private party. Another thing that's very interesting about this new statute is that there doesn't actually have to be a breach that has occurred for the state to take administrative action against a business. It's just some indication that they're not complying with best practices. 

Ben Yelin: One thing that sort of triggered in my mind is how in criminal law, we punish people not just for committing crimes but for what are called inchoate crimes, actions that could lead to a crime that don't necessarily lead to a crime. So in other words, we want to sort of punish people, so to speak, for creating the conditions that would lead to a data breach, not just for the data breach itself. 

Ben Yelin: And then, you know, the expansion of the definition of private information I think is very timely. Biometric data really has to be included in these new data privacy laws just because it's become more prevalent. It has the potential to be abused, and it is certainly private information. And then, you know, obviously, the inclusion of passwords and security questions, those are best practices as well. 

Ben Yelin: And, you know, it's interesting that you don't necessarily have to, quote, "do business" in New York. I think that was sort of a narrow requirement of their previous statute. Now it's sufficient if somebody in New York uses your services. And that's going to be true for every single organization or company that does business on the internet basically. 

Dave Bittner: Yeah. 

Ben Yelin: So it just sort of expands that universe, which I think is one of the reasons that this law, along with the California law, will have such an important impact on the entire country. Now all of these companies who have already had to come into compliance with the CCPA, because they're certainly involved with individuals living in New York state, they're going to have to come into compliance with the SHIELD Act as well. 

Dave Bittner: Is this an interstate commerce kind of thing where these states can sort of reach beyond their borders and say, you don't have to be here, but if you're working with people who are here, this applies to you? 

Ben Yelin: Yeah. I mean, so far, courts have not really stepped in on interstate commerce grounds here. If Congress came in and passed a national federal data breach notification law or a data privacy law, Congress could either explicitly preempt state law or implicitly preempt state law. But, of course, Congress hasn't done that. So it really has been left up to the states using what - the powers that they have, their police powers. They're trying to protect the health, safety and welfare of their citizens, and that includes the private information of their citizens. As long as states aren't unduly burdening commerce or as long as they are not unfairly favoring in-state interests over, you know, out-of-state companies, then courts will be reluctant to step in on commerce clause grounds or those of its cousin, the dormant commerce clause. And I know that's a lot of legalese. 

Dave Bittner: (Laughter). 

Ben Yelin: But basically, states really do have a lot of latitude here to use the long arm of the law to extend over businesses that may not explicitly want to do business in New York. But New York's a very large state. If, you know, Microsoft decided this law is not something you want to comply with, it's too burdensome, let's stop selling our products to anybody who lives in New York state, that's going to be a very large market share. That's, you know, a big difference between California and New York versus, you know, if North Dakota came up with a data privacy notification law. It might be worth it for a company to be like, you know what? We're only losing 3,000 customers here. Maybe it's worth it. Can't do that with New York. 

Dave Bittner: Yeah. 

Ben Yelin: You're talking about millions of customers there. 

Dave Bittner: Yeah, that's interesting. As we've spoken here before, this patchwork of laws certainly provides a regulatory burden on these businesses who are doing business nationwide and globally. 

Ben Yelin: Yeah. Obviously, they would prefer to have the fewest regulations possible just because it would make their life easier from a compliance perspective. Probably most of them prefer to have one federal standard, as we've talked about, because complying with one standard versus 50 separate standards would be much easier. That would involve the federal government taking action that may put some additional burdens on these companies beyond what California and New York have put on them so far. So that's sort of a dilemma for these companies. I think overall, it would be better for them, even if it's a stringent standard, to just have that one federal standard. And this is sort of - to me, it seems like a pretty obvious use of federal power because we are talking about commerce interactions that span state lines. We're not trying to regulate what goes on within an individual state. These are online transactions of sorts between customers in all 50 states and companies. So I think it's certainly an appropriate place for federal action. 

Dave Bittner: Yeah, it's really interesting insights. Well, that's my story this week. What do you have for us this week, Ben? 

Ben Yelin: My story comes from the Electronic Frontier Foundation. And it's more of an editorial than a story. Of course, the Electronic Frontier Foundation is dedicated to digital privacy. They are some of the foremost experts in digital privacy policy nationwide. And they wrote a piece this past week on how they evaluate government demands for new surveillance powers during the COVID-19 crisis. So throughout this crisis, we've not only heard the private sector getting involved in surveillance but the government being curious about how surveillance can help us get out of this very difficult pandemic. Much of that surveillance is going to be very intrusive. 

Ben Yelin: When we start talking about relaxing some of these mitigation strategies, like school closures, business closures, we need to talk about contact tracing and testing people, finding people who are positive and making sure the people who are positive don't go back out into the general population. And that could involve some pretty stringent surveillance. We could make sure that people are staying in their homes subject to some sort of civil or criminal penalty using location tracking, for example. Now, from the Electronic Frontier's perspective, and I think from a lot of our perspective, any increase in surveillance powers from the government has to be very closely considered. If surveillance goes too far, it not only invades our privacy from their perspective. It can deter free speech. It can unfairly burden vulnerable groups. And I think we've established that in talking about these issues repeatedly. And something that we talked about on last week's podcast, these new policies might seem temporary, but, really, they last forever. A lot of the surveillance apparatus we put into place after 9/11 still exists 19 years later. 

Ben Yelin: So what to do about it? How do we evaluate these government proposals? They've really come up with a three-pronged approach. The first is, has the government shown its surveillance would be effective at solving the problem? This is sort of an underrated aspect to increasing surveillance powers. It really should work. If we're going to invade people's privacy and their potential free speech rights, we better be doing it for a good reason. So if we come up with a testing and tracing scheme that lowers transmission, brings that transmission rate to something that's manageable where we can all go back to work, I can watch sports again, I can finally send my kids back to school... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...And get a little time to myself, then, you know, I'm going to feel a lot more favorably about that surveillance policy. So really, it has to work. That's first and foremost. If we're doing things that aren't actually addressing the problem, that's counterproductive. If the government is able to show that that surveillance method works, then they have to consider whether that surveillance would do too much harm to our freedoms. So for example, if we take on some sort of authoritarian policy where you have to get special government permission to leave your house, you have police officers spying on you via binoculars 24/7, everybody has to wear ankle bracelets, maybe that would work in slowing transmission. But that would harm our freedoms. That would be against our values. 

Ben Yelin: The third - and this is sort of the last prong of the test - is if the new type of surveillance works and if the harm to our freedoms is not excessive, are there sufficient guardrails around the surveillance? One of the most important guardrails is some sort of expiration date. So, you know, we're going to be doing this testing and tracing for a period of three months. It can be extended if the emergency extends. But there is some sort of sunset period. There are a lot of other measures they want to put in place to sort of mitigate the worst effects of surveillance. So consent - people should have the power to decide whether or not to participate in these systems. If we're talking about an app that tracks virus infections, that should be something where users would have to opt in. Minimization - so we've talked about that in other surveillance programs, but making sure that the only data collected is the data that's absolutely essential to effectuate the program. Transparency - the government showing exactly what they're collecting and why they're collecting it. It must not have a disparate impact on certain demographic groups. It shouldn't be biased against racial groups, ethnic groups. It shouldn't be biased on the basis of sexual orientation or identity. 

Dave Bittner: Right. 

Ben Yelin: So those are just some of the safeguards that they want to bring into place if we can come up with a surveillance program that's effective and doesn't infringe on our freedom. I realize that was a very, very long answer there. 

Dave Bittner: (Laughter). 

Ben Yelin: But I'd love to hear your thoughts on it. 

Dave Bittner: Well, I guess, you know, part of this is it is necessary to have temporary emergency measures in place when you're dealing with something like this global pandemic. We're all staying home. I mean, we've got businesses closed. There are no sporting events. You know, all these sorts of things are - only a few months ago, all of us would probably - if you suggested some of these things, all of us would roll our eyes and say, well, that could never happen. 

Ben Yelin: Yeah. 

Dave Bittner: I mean, how could that possibly happen? And here we are. 

Ben Yelin: How could I possibly be home with my kids for 10 weeks, you know. 

Dave Bittner: (Laughter) Right. Exactly. And yet here we are. 

Ben Yelin: Right. 

Dave Bittner: You know, one thing I was thinking about, actually, this morning, when I was out walking the dog and I was pondering this, about - in my mind, the point where we come up with a vaccine, that is going to be a turning point where I think most of us will feel, OK, we can come out of our hiding, and we can go back into public. But how are you going to know who's been vaccinated? Are you going to wear a hat or a patch or - you know, how are you going to verify - if I go to a sporting event, am I going to have to show a card that proves that I've been vaccinated? I can certainly imagine requiring vaccination for kids to go back to school. 

Ben Yelin: Now, we already do that with schools. So in order to enroll your kid in school, you have to make - you know, you have to prove that they've had a measles vaccine, hepatitis vaccine, tetanus, et cetera, et cetera. 

Dave Bittner: Right. 

Ben Yelin: But what you talk about in terms of public gatherings, conferences, sporting events, I mean, that's going to be really difficult to enforce. I think if we've gotten to the point where we have a vaccine, it's going to be like most other communicable diseases, where a lot of it is just going to depend on public trust. The vaccine hopefully will be widely available. If it's like the regular influenza, it might be an annual vaccine that changes as the virus mutates. But we're going to have to rely on other people taking the vaccine. 

Ben Yelin: It's a little less of the collective action problem that we have now because if I vaccinate myself and my kids and, you know, my family, then at least I'm protected. But whereas right now, if I choose not to do that, I'm potentially infecting other people. But, yeah, I mean, I don't know if we're ever really going to get to that point where you have to, you know, show a card on your forehead to enter any sort of conference... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Or sporting event. I'm not sure we're going to get there. 

Dave Bittner: Yeah. 

Ben Yelin: But I think the point you make is well taken. 

Dave Bittner: Yeah. And this notion that the EFF brings up about guardrails and things expiring - you know, as you mentioned, there's still things in place from 9/11. It's hard to claw these things back. I'm reminded of a - related to this, you know, when NSA was saying, hey, we've got these surveillance powers. We don't want them anymore. They don't work. 

Ben Yelin: Right. 

Dave Bittner: It's more trouble than it's worth. Can we please not have these anymore? And you still had folks in Congress saying, no, you know, we really think you should still have those. You know, we know best. 

Ben Yelin: Power is a drug. I mean, you give somebody power, you give somebody surveillance powers, and the government determines that it's an effective tool, they are not going to want to give it up, you know... 

Dave Bittner: Yeah. 

Ben Yelin: ...Especially as we worry about future pandemic events. I mean, we've seen the impact that this has had on not only our public health but on our economy. You know, so if you're a government official - maybe you're not the president, but you're somebody who works at a lesser position in the federal government - you're thinking, well, maybe I don't want this - these surveillance tools to expire because, in three years, we could have a similar pandemic, and I don't want to go back to negative 30% GDP growth and 20% unemployment. 

Dave Bittner: Right. 

Ben Yelin: So I'd rather have this surveillance tool. And that's the same justification we see with counterterrorism. Yeah, maybe the NSA doesn't want this, but I don't want to see another 9/11... 

Dave Bittner: Yeah. 

Ben Yelin: ...So we need to make sure that these powers are still in place. You know, that's why I think what EFF is doing is very productive because you can have that conversation now before these practices are put into place. You know, you can really set your own framework on how we evaluate new surveillance policies. And there are going to be new policies that come out. I mean, I think the only way we're going to get back to some sort of semblance of normal is by having invasive testing and contact tracing, and that is going to involve surveillance, electronic or otherwise. So it's going to happen. But it's good to have a framework where we can evaluate the pluses and minuses of whatever that surveillance method is. 

Dave Bittner: Yeah. 

Ben Yelin: And so I think this piece here was very useful for that purpose. 

Dave Bittner: All right. Well, it's an interesting story. Of course, we'll have links to all of our stories, as we always do, in our show notes. It is time to move on to our Listener on the Line. 

(SOUNDBITE OF DIALING PHONE) 

Dave Bittner: Our Listener on the Line wrote in their question. This is a listener named John (ph). And he says, here's a question for Ben. Why would a CCPA suit be brought in a federal court as opposed to a California court? And this listener also sent in an article. This is from JD Supra, and it's titled "The First Ever CCPA Cause of Action is Filed in a Federal Court, But is This Class Claim Short-Lived?" Without getting too much into the weeds here, Ben - if possible (laughter) - what's going on here? 

Ben Yelin: I'll take a breath so I can stop myself from boring you with civil procedure. 

Dave Bittner: (Laughter). 

Ben Yelin: But I will explain this as briefly as I possibly can 'cause it really is boring. 

Dave Bittner: OK. Oh, good. Way to sell it, Ben. Way to sell it (laughter). 

Ben Yelin: I know. I know. I really - there's some things I can sell; civil procedure is not one of them. There are only a few ways a case can make it into federal court. One of the ways is if it's a case that arises out of either a federal law or the U.S. Constitution. So, you know, if I sue somebody else for violating my Fourth Amendment rights, that could make it into federal court because it's a federal claim. Then there are certain areas that the federal government has decided that it has full dominion over in the legal realm. So things like patents, that's a federal power. All patent cases are federal cases. 

Ben Yelin: The third category are what are called diversity cases. So if a person or an entity from one state sues a person or an entity from another state and the amount in controversy is above a certain threshold - I believe right now it's $75,000 - then that case can be heard in federal court. The original rationale behind that is you didn't want to give one of the parties sort of a home court advantage, if you will, to use a terrible pun. So, you know, I think that meant more in the 19th century when states were more, like, self-contained entities and cared about protecting their own interests versus the interests of people in other states. So you wouldn't want somebody to sue you and be able to have the case heard in their home state court. 

Ben Yelin: So as a result, we have this diversity jurisdiction rule, and that's exactly what happened here. The plaintiff is from a different state than the defendant. The amount in controversy or the amount that's being pleaded by the plaintiff against the defendant meets that threshold, and therefore, the case can be heard in federal court, even though the case concerns a state statute. I could get into some other really complicated stuff. For any of you lawyers out there who know what the Erie doctrine is, I will just give you a wink and a nod there and say that's something I would talk about if we had more time. 

Dave Bittner: All right. Fair enough. Well, moving on. Thanks to our listener, John, for sending in that question. And, of course, we would love to hear from you. We have a call-in number. It's 410-618-3720. That's 410-618-3720. You can call and leave your question. You can also send us an audio file to caveat@thecyberwire.com. And you can also email us your question as well. Coming up next, my conversation with David Holtzman from CynergisTek. We'll be looking at how HIPAA privacy and security standards may have been impacted by the federal response to the COVID-19 pandemic. 

Dave Bittner: But first, a word from our sponsors. And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this. We've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of no matter how small or how large your organization is. And if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks. Neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses in a nicely automated way the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourselves to the wolves and don't be nibbled to death by ducks. Check out KnowBe4's KCM platform. Go to kb4.com/kcm. Check it out. That's kb4.com/kcm. And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with David Holtzman. He works for a company called CynergisTek. Prior to that, he was a senior adviser for HIT in the HHS Office for Civil Rights. He served eight years as a leader in the development and enforcement of HIPAA Privacy, Security and Breach Notification Rules. So he's a guy who's been in the thick of all this stuff. And we talk about HIPAA privacy, their security standards there and how things may have been impacted by the federal response to COVID-19. Here's my conversation with David Holtzman. 

David Holtzman: I've been very fortunate to have been introduced to the HIPAA Privacy Rules back in their infancy working in the private sector and implementing them and then was able to move over to the Office for Civil Rights and worked on the very small team that was responsible for developing and implementing the HIPAA Privacy, Security and then the Breach Notification Rule. And I was very fortunate to have been leading the effort to integrate the HIPAA Security Rule from its infancy when it was developed by what is now called CMS and brought it over and consolidated it into the Office for Civil Rights so that there would be some coordinated development of policy and enforcement along with the HIPAA Privacy Rule. 

Dave Bittner: Now, it's my understanding or my perception that, in general, HIPAA has been considered to be a real success when it comes to protecting people's privacy. First of all, is that an accurate perception, and where do things stand today when it comes to HIPAA? 

David Holtzman: It's important to remember that the approach that we have taken in the U.S. is a sector-based approach. In other words, unlike in Europe and many other first-world countries, individuals here in the United States don't enjoy a guaranteed or principled right to privacy. The way it's structured here in the U.S. is it's - individuals have privacy where the federal government and the states have created a right to privacy. HIPAA has been very effective in providing Americans with some basic level of the ability to have confidence that their health information only be used and disclosed for certain purposes and that they have the ability to have access to that information and to correct it and to have some level of control over how that information is shared. But the challenge is that Congress, in allowing HHS to develop the HIPAA Privacy, Security and Breach Notification Rules, strictly limited the scope of whom the protections would apply to. So as we have witnessed the revolution in technology, many of the tools and applications and consumer-level advances that we have seen are outside of HIPAA, and consumers are really challenged with understanding where HIPAA protects their information and where it does not. So for health insurers, health care providers and organizations that are contractors or vendors to these organizations that participate in the health care industry, HIPAA has been very effective in setting ground rules as to how the information may be used or disclosed and giving individuals rights to both know what information is collected about them and how it's being used. 

Dave Bittner: As you and I speak here, you are in the midst of the COVID-19 pandemic. How does HIPAA intersect with the reality of that situation? 

David Holtzman: Well, I think it's important to remember that HIPAA has always permitted great latitude and flexibility to allow for information to be shared for the purposes of someone's health care treatment and how an individual's health care status may impact a third party. So for example, the prime directive of the HIPAA Privacy Rule is that it will not interfere or stand in the way of an individual getting treatment or stand in the way of health care providers sharing information about an individual for their treatment, even if the individual isn't aware of an indirect contribution by a health care provider. 

David Holtzman: In addition, HIPAA allows great flexibility for the sharing of information with public health authorities. And in times of crisis, when there is a public health emergency, a natural disaster or another extreme emergency that requires the sharing of information, or when the health care professional treating the individual believes it's in that person's best interest, information may be shared with friends, family and others who may be able to help or assist in the treatment of that individual. 

Dave Bittner: You know, I'm seeing a lot of doctors switching over to using online systems for health care - telehealth, things like that - remotely working with their patients using video conferences, those sorts of things. Are there any specific challenges or opportunities that come with the shift to those systems? 

David Holtzman: Well, first of all, I think it's important to recognize that these types of commonly available telecommunication services, whether they be video conferencing or text messaging, they have not met the HIPAA guidelines that had been in place. And so to ease the regulatory burden in this time of extreme emergency, the federal government has relaxed standards - or actually, the way that OCR has put it - the Office for Civil Rights has put it - they are using their discretion to not enforce any of the HIPAA Privacy, Security or Breach Notification standards against any health care provider who uses some type of telecommunications technology in order to provide treatment services to a patient. No longer do health care providers have to check to see whether or not the technology that they want to use meets the privacy and security standards that were set out in the HIPAA Security Rule, nor do health care providers have to be concerned that they would be subject to OCR levying a fine or penalty if, for example, there was a cybersecurity incident which interrupted or interfered with the transmission or that the confidentiality of the information was compromised in the performance of using these technologies. 

David Holtzman: OCR made clear that while many of the commonly used technologies can be applied in this space - so, for example, technologies like WhatsApp video, Apple FaceTime or Google Hangouts video - the one thing in common is that they are nonpublic-facing. In other words, OCR seeks to maintain a modicum of privacy and security so that the treatment session between the health care provider and the individual is easily accessible and keeps the patient out of the physician's or other health care provider's office but also provides some level of privacy because the session, the telehealth session, is only between the health care provider and the patient. What OCR has designated as not acceptable, or has recognized as public-facing types of technologies, are applications like TikTok, Facebook Live or Slack for messaging because those technologies produce sessions that are publicly accessible and don't preserve the privacy of the individual while providing the telehealth service. 

Dave Bittner: I see. What sort of changes do you think we might see on the other side of this? Is it reasonable to expect that the things that we'll learn going through this pandemic - that those lessons will be applied to how we approach privacy after it's settled down? 

David Holtzman: I think it's fascinating that, in the months leading up to the pandemic, regulatory agencies were already providing flexibility in recognition that it's good for society and good for individual health that there be a more accessible and flexible approach to using these technologies. I think only time will tell where we are after the emergency has passed. Will regulators and consumers accept that the current HIPAA security standards are outmoded and really don't meet the needs of consumers and how health care is delivered here in the 21st century? Or will there be concerns that, because there is just this general lack of oversight and regulation over these technologies, there need to be additional protections and requirements over all information that's collected from consumers, instead of the sectoral approach that we have been living under for the past 20, 25 years? So I think time will tell, and I think the question has yet to be settled. 

David Holtzman: I will point out, however, that the federal government is not the only player in this space. Almost half the states have some type of data protection or information security requirements that either are layered on top of the HIPAA requirements or that exist independently and are not preempted by the changes in federal policy. In addition, under the HITECH Act that was passed during our last economic collapse, the state attorneys general were also given authority to levy fines and penalties for failing to comply with the HIPAA Privacy, Security and Breach Notification Rules. So it's going to be an interesting dynamic to see how the states also respond to these changes, and to any changes in attitude about individuals' privacy and the ability to assure the confidentiality, integrity and availability of the information that is created and maintained through the use of these common technologies that are available both to the consumer and to the health care provider. 

Dave Bittner: All right. Interesting conversation. Ben, what do you think? 

Ben Yelin: Yeah, very interesting. I think all of us have sort of put data privacy, personal health information privacy on the back burner because we've been justifiably desperate to get people medical attention. You know, especially since nonessential businesses are closed, a lot of people can't see their health care provider. If it's an emergency, sure. But in other circumstances, we are resorting to things like telehealth. And it's just very interesting to know that both the federal government, through this Office for Civil Rights, which he discussed, and most state governments have really waived some of the privacy regulations that come with telehealth appointments. 

Ben Yelin: Again, this goes back to the theme of what we previously talked about on this podcast. I think all of us would take that trade-off right now, sacrificing some of the more stringent privacy measures so that we can have these basic appointments via telehealth. But when this emergency is over, we want to see those regulations put back into place robustly. And, you know, the one concern is that we're opening Pandora's box by relaxing these regulations during an emergency and we're not going to be able to restuff the box once the emergency is over. 

Dave Bittner: Yeah. I think it's interesting, too, sort of coming at it from the other direction, that this will be an interesting test case when we get on the other side of this to see if this relaxation of the rules really had a negative effect on privacy - being able to use telehealth, being able to email things to doctors, you know, loosening up some of these things. I suppose we'll have a good amount of data to be able to look at and say, is it on balance a good idea to be able to still do some of these things? As people have gotten used to it, are they perhaps going to push back and not want to go back to the old way? 

Ben Yelin: Yeah. I mean, that's a really good point. If we find out that every doctor's appointment is getting Zoombombed, then, you know, maybe we'd say, we can't do this. We... 

Dave Bittner: Right. 

Ben Yelin: ...Need to - even during an emergency, the risk to privacy is too severe. We need to keep these stringent protections in place. If that doesn't happen and we don't see a lot of data breaches and this does not become problematic from a privacy perspective, telehealth and being able to email directly with one's doctor - those are major increases in convenience for the average health consumer that, you know, consumers might not want to give up after this emergency is done for. So, as you say, this could be sort of an interesting experiment - you know, an inadvertent experiment in how we deal with HIPAA in the context of these online interactions. 

Dave Bittner: Yeah. Well, our thanks to David Holtzman from CynergisTek for joining us. Certainly an interesting conversation. We appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: And, of course, we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC Platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.