HITECH Act: Pushing the healthcare industry toward electronic medical records.
Donna Grindle: You're really not even 10 years down the path of the majority of health care information being electronic.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, Ben takes a closer look at video surveillance and privacy issues being argued in front of the First Circuit Court. I look at Facebook's spotty record when it comes to reining in their algorithms. And later in the show, the return of Donna Grindle, CEO at Kardon and co-host of the "Help Me With HIPAA" podcast.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, we've got some really interesting stories to share this week. Why don't you start things off for us?
Ben Yelin: I have a fascinating case, comes to me via a friend of the pod in my mind, Orin Kerr - Professor Orin Kerr.
Dave Bittner: (Laughter).
Ben Yelin: We'd love to make him an actual friend of the pod, but...
Dave Bittner: In your fantasies, he's a regular listener and (laughter)...
Ben Yelin: My unrequited love.
Dave Bittner: Yeah, OK. Very good.
Ben Yelin: But he alerted me to a case in the First Circuit, which is the Court of Appeals in the northeast United States. And this is about the installation of a secret video camera outside somebody's home. So it was posted on a utility pole unbeknownst to the people who lived in that house.
Ben Yelin: So there is one former official magistrate - somebody who had been in the judicial system. She and her two children were basically running a drug trafficking operation and had taken trips back and forth between Massachusetts, where they lived, and Vermont to exchange money for drugs.
Ben Yelin: And so the police became suspicious. They installed this pole camera. These individuals were caught, and they were prosecuted. And they are challenging their prosecution on Fourth Amendment grounds, basically saying this is an unconstitutional search because they have a reasonable expectation of privacy in their own home.
Ben Yelin: So this raises a bunch of really complicated issues that don't really lend themselves to having an easy answer. First things first, when we talk about any Fourth Amendment case, is this a search? We know it's a search if the person had a reasonable expectation of privacy.
Ben Yelin: So you do have a reasonable expectation of privacy inside your home. And certainly, to some extent, you know, around the curtilage of your house, you have a reasonable expectation of privacy. But when you're outside, you really don't.
Ben Yelin: And this camera wasn't peering into the windows. You know, it wasn't infrared technology looking at whether they're using heating lamps inside the house. It really was just looking at the outside of the house the way any snooping neighbor could. So seemingly, that doesn't really count as a search.
Ben Yelin: You know, if I went out on my front lawn and did a bunch of illegal things and my neighbor saw me, I'm not really exhibiting much of a subjective expectation of privacy, let alone one that's reasonable. And so that seems to be what the government is arguing here.
Dave Bittner: Let me pause you for a second here. Did they have to get a warrant to put that camera on the pole?
Ben Yelin: They did not. So that's the issue here.
Dave Bittner: OK. OK.
Ben Yelin: If they had obtained a warrant, this would be reasonable - they would be admitting that it was a search, but a search can be reasonable if a warrant has been obtained.
Dave Bittner: I see.
Ben Yelin: So no warrant was issued in this case.
Dave Bittner: OK.
Ben Yelin: The defense tried to invoke Carpenter v. United States, which I know we've talked about many times on this podcast. And that case stands for the proposition that you have a reasonable expectation of privacy in the whole of your movements. So the reason that cell site location information is protected under the Fourth Amendment and qualifies as a search under the Fourth Amendment is that it reveals somebody's movements over a long period of time.
Ben Yelin: I think what the prosecution is saying here is that's not the issue in this case. We're not following this individual from location to location. We're not tracking their location. This is a fixed camera that is perched outside their home. So that doesn't really qualify. It doesn't really relate to the Carpenter case.
Dave Bittner: But it tracks their comings and goings from that place - right? - cars in the driveway leaving and arriving.
Ben Yelin: It does, but, you know, that doesn't relate to the whole of their movements. They don't know where the person is going.
Dave Bittner: OK.
Ben Yelin: They don't know whether they're going to sell drugs or, you know, going to day care or the grocery store.
Dave Bittner: Right.
Ben Yelin: So it just doesn't quite invoke that same suspicion that we have in the Carpenter case, where they were collecting somebody's historical cell site information over a period of weeks and months.
Ben Yelin: So the other argument here relates to what we call the mosaic theory, which basically means maybe it's not a search technically every single time that camera catches something. You know, if you were to hang it up for one hour on one day, maybe that does not qualify as a search. But if it's constantly monitoring your house and monitors you for an extended period of time, perhaps law enforcement could start to put together a mosaic of your life, put those pieces together. And that mosaic in and of itself would qualify as a Fourth Amendment search.
Ben Yelin: The mosaic theory is not exactly favored among many jurists. It is a theory. It's not one that's been widely adopted, although Supreme Court justices have hinted at thinking favorably about it. But it's never been kind of formally adopted.
Ben Yelin: But to me, that seems like a bit of a stretch in this case because we don't really know where to draw the line. At what point do we have a mosaic? Is it reviewing this data over three days, over seven days, over a month? If you're going to use that theory, it just becomes really hard to adjudicate because we don't know where those dividing lines are.
Ben Yelin: So basically, we've gone through a lot of constitutional theories of what amounts to a Fourth Amendment search, and we're still kind of left without a satisfying answer, which is why I'm just very curious about what the court is going to do here, whether they're going to kind of formulate a new rule that relates very narrowly to the circumstances in this case. And I think they might.
Ben Yelin: Some analysts have said this is somewhat analogous to blue light cameras in neighborhoods or any sort of CCTV where you're monitoring activity on the street. I think this is a little bit different because this is perched directly outside somebody's house and is laser-focused on this house. So maybe the court will decide because of that, because of the fact that it's aimed at one individual house, maybe that qualifies as a search because somebody should have a reasonable expectation of privacy, at least over the long run, while they are outside of their own home. But I don't know.
Ben Yelin: I mean, I just think it's a really tough case. And I'm very curious to see how they wrestle with these very difficult issues.
Dave Bittner: One of the things that caught my eye here in this description of the case, the back-and-forth between the judges and the lawyers, was they asked about a fence. You know, if you have a fence around your property, that may change what is a reasonable expectation of privacy. In other words, if I build a fence for privacy - if I put an 8-foot fence around my home - now someone walking by on the street can't see inside my front yard. But, you know, they could get a ladder, right?
Ben Yelin: The lawyer here did his or her research. I can't remember what the name of the lawyer was, but the lawyer was like, well, technically it's against, like, local code to build fences that are that high...
Dave Bittner: Yeah.
Ben Yelin: ...That would reach as high as that camera. So that's not really a possibility in this case.
Dave Bittner: Right. But that's where I'm going here - because I think it was one of the judges who said, like, this camera is 20 feet up on a pole. You know, how high a fence would you have to build? It seemed to me like they're asking, are we taking away the individual's ability to assert privacy over their property, over their lawn, you know (laughter), rather than inside the building itself? You know, is it possible? And as someone else pointed out in the back-and-forth here, I suppose it's been found legal that they could hover a helicopter above your house and look down on you that way, and that's fair game.
Ben Yelin: Yeah. I mean, there are cases that indicate that if a small plane or a helicopter is flying at a relatively low altitude that's accessible by the public, then that does not qualify as a Fourth Amendment search, which might seem kind of counterintuitive to most people who think that they have privacy protection in their own backyard, say. Part of it is, as you say, how do you exhibit your subjective expectation of privacy in this scenario?
Ben Yelin: They brought up a bunch of hypotheticals here. What about a no trespassing sign? Well, that doesn't really do anything because I don't know about you, but when I see a no trespassing sign, I can still look on that property. I probably am more likely to look as a neighbor into that property if they have a no trespassing sign.
Ben Yelin: And you can't build a fence, as you say, so what are the options if a person really does want to conceal themselves on their own property? And so without giving these potential defendants the right to exhibit that subjective expectation of privacy, it seems unfair that they've kind of automatically forfeited that expectation.
Dave Bittner: Yeah. I wonder, too, in this case - like, suppose the camera was looking at the street at the end of the driveway, you know, in other words, just seeing what was coming in and out of that driveway, not looking at the property itself - the public space outside the driveway. Would that be a distinction without a difference where I can still see the vehicles that are coming and going, the people that are coming and going? I can read license plates, you know, that sort of thing. I suppose I'm not going to be able to see what people are taking in and out of their vehicles or in their trunks or things like that.
Dave Bittner: But, you know, how does that move the bar if we are in agreed-upon - a place that we all agree is a public place, the street in front of the house, versus the driveway that is someone's private property? Does it matter?
Ben Yelin: It seems to me that that would matter. In fact, one of the judges here, you know, at least mentioned a possible distinction - is that when you're talking about any sort of CCTV or cameras posted in more public places, maybe on a streetlight that's not directly, you know, peering into somebody's house, maybe that's a distinction from what we have here, which is targeted surveillance. It's looking specifically at one house, one family, one doorway.
Ben Yelin: So that might actually be a worthy Fourth Amendment distinction and sort of the difference of being the target of the search and being caught up in sort of a dragnet. And for better or worse, when you are caught in a dragnet, you are less likely to be able to assert your Fourth Amendment rights than if you are specifically targeted for a search. And that might be the distinction that the First Circuit draws here.
Ben Yelin: I mean, to give you an idea as to how complicated this case is, the district court ruled one way, and then there was a three-judge panel on this appeals court that ruled the opposite way. And it was controversial enough that the full First Circuit is hearing this case en banc, meaning sitting as one court. And that's relatively unusual. I mean, you get several en banc cases per year, but it means that there's really a difficult dispute here, and these questions are not easily answered.
Dave Bittner: Is this the kind of thing that could find its way to the Supreme Court?
Ben Yelin: I really think so. I think this is the type of thing that could have a glide path to the Supreme Court. Usually before we get a case like this coming to the Supreme Court, you might have to see some disagreements among circuits.
Ben Yelin: I could anticipate, if the First Circuit kind of developed its own rule here that's very limited to the specific circumstance of a camera affixed to a pole and targeted at one individual house, you know, maybe the Supreme Court would want to wait and see if there were any other cases across the country that address that very narrow question before they decided to rule on it. But I think it's certainly the type of case that we could possibly see in front of the Supreme Court.
Dave Bittner: All right. Well, time will tell. It's the kind of case you and I love to hash over, isn't it (laughter)?
Ben Yelin: It really is. I mean, I know we like to focus on modern technology, and this isn't quite as modern. But it's one of those things where today, it's video cameras. Then they're going to figure out some other, more sophisticated, modern way to peer into somebody's house...
Dave Bittner: Yeah.
Ben Yelin: ...Or to violate a person's expectation of privacy. I think it just kind of behooves us to follow where - you know, where these cases go and where these lines are drawn.
Dave Bittner: Well, and video cameras have greater capabilities than they used to, thanks to facial recognition and being able to read license plates and all those sorts of things. They've cranked up their own capabilities.
Ben Yelin: Absolutely.
Dave Bittner: So I think that's an issue, as well.
Ben Yelin: Absolutely.
Dave Bittner: All right. Well, that's a good story. My story this week comes from the MIT Technology Review, a story by Karen Hao. And it's titled "How Facebook Got Addicted to Spreading Misinformation." A bit of a long read here, but I highly recommend this article - really worth your time if you're interested in this broad issue we've been talking about with Facebook and engagement, certainly after the last election cycle and the riot at the Capitol Building in Washington, D.C. All of that plays into this story here.
Dave Bittner: And really, what this story does is it follows a bit of the history of how Facebook has approached its algorithms, how it developed its algorithms, how they started out really on the advertising side of the house and then moved over to engagement. And the bottom line is that it seems as though Facebook and CEO Mark Zuckerberg in particular - they prioritize growth over everything else. And so if the folks within Facebook who are working on artificial intelligence - if they come up with something that, you know, may do a better job at not spreading misinformation or amplifying divisions in our society - if that slows down growth, chances are it's going to meet resistance from the highest levels of the company. And it seems to have happened time and time again throughout the company's history here.
Dave Bittner: What caught your eye with this article, Ben?
Ben Yelin: First of all, it is a long read, but I highly recommend it. It goes into the history through the eyes of one individual who, I think, has a real sense of, if not guilt or regret, just kind of unease about developing this monster. I don't know if you were forced to read "Frankenstein" - you know, the original Mary Shelley "Frankenstein" - in high school or college.
Dave Bittner: Yes (laughter).
Ben Yelin: But that's really what I think of here. And we have our Dr. Frankenstein, who designed this monster that's really out of control because the algorithm has allowed Facebook to be extraordinarily profitable. They dominate the market for social media. They have billions of users. It's one of the most successful companies in the world.
Ben Yelin: So when you have that financial advantage, why would you voluntarily peel back, you know, this formula that's given you so much financial success? And I think it's kind of difficult for any of us to weigh those ethical concerns against, you know, what is a very successful business model. But I think all corporations to some extent have a kind of a public responsibility, even if it's not explicit. If you have the capability of building something that's dangerous to the public, even if it would be profitable and even if it is technically legal, I think you have a moral or ethical obligation to try and rein in its excesses.
Ben Yelin: And I think there are no excuses for Facebook anymore because now we know, based on Cambridge Analytica and what's happened in the past two election cycles, that their formula does encourage the sharing of false information or information that might incite violence. So at this point, they've sort of lost the excuse they might have had eight to 10 years ago where there wasn't really any evidence that it was the algorithm that was leading to these bad outcomes. Now I think we have enough evidence to show that it does.
Ben Yelin: And so I think Facebook, whether they have a legal responsibility or not - and they're being sued now but largely for anti-competitive practices and not as much for this. But you just get that question of at what point do you have an ethical or moral responsibility? And those of us who are Facebook's consumers - you know, I'm like the rest of human beings; I want to see what my high school friends are up to - do we have a responsibility to stop feeding into something that can be incredibly destructive?
Dave Bittner: Yeah.
Ben Yelin: So is it incumbent upon us to deactivate our accounts until Facebook starts to make changes to the algorithm? It's good that they've developed somewhat of a conscience. This article talks about cutting against algorithmic discrimination. That's a narrower issue. It's good that they're addressing that. But I think Facebook will sort of have this moral scourge until they are able to stop the relentless spread of misinformation.
Dave Bittner: Yeah. And I want to dig into that. I mean, people within the organization, within Facebook's organization, their bonuses are based on growth. And, you know, if you want to tell someone what's important to you, if you want to tell an employee what your priorities are, doesn't matter what you say to them. How you pay them is (laughter) - you know, that's going to tell them the real story, right?
Ben Yelin: Right.
Dave Bittner: And in this case, growth and engagement, again, it seems to be above all else. And this article talks about back in August of 2018, which was the ramp-up to the U.S. midterms...
Ben Yelin: Seems so long ago, doesn't it?
Dave Bittner: (Laughter) And yet so close.
Ben Yelin: Yes.
Dave Bittner: And President Trump and many other leaders in the Republican Party were accusing social media giants, including Facebook, of anti-conservative bias. And they said that Facebook's moderators were applying community standards which were suppressing conservative voices more than liberal ones.
Dave Bittner: Now, those charges were debunked, but the platforms and Facebook were affected by this. They did not want to be seen as being biased against one political side, especially in such a divided nation as we are right now, and with a powerful president like President Trump, who wasn't afraid of going out in public and expressing his opinions with his great influence...
Ben Yelin: Yes.
Dave Bittner: ...With his large audience.
Dave Bittner: So here's the thing that really struck me in light of all that is this story points out about how Facebook approached fairness, this notion of fairness, fairness between political sides. And I was trying to think of a good analogy here. So let's say, for example, Ben, that you are a really good driver, right?
Ben Yelin: I'm there. Yup.
Dave Bittner: You're an excellent driver. You're courteous. You always use your signals. You know, you let the other person go first at a four-way stop, all of those good things.
Ben Yelin: My wife might argue otherwise, but sure.
Dave Bittner: (Laughter) And let's say that I am a terrible driver, right? I run red lights all the time. I've never seen a stop sign that I didn't just smile and wave at as I drove right through it, right? I drive too fast, and I take chances, OK?
Dave Bittner: So when we approach how to treat the two of us fairly, one approach would be that we come up with a set of rules, right? - speed limits, stop signs, stoplights, all those sorts of things, reckless driving - and we apply those rules to each of us fairly. And we would say - so in that case, chances are I would receive a lot more attention than you would because I'm breaking the rules a lot more, right?
Ben Yelin: That is correct.
Dave Bittner: Well...
Ben Yelin: Yes.
Dave Bittner: That's one way to interpret the word "fairly." Another way would be that it would be fair if you and I received the same number of summonses, the same number of tickets. That would be fair, right?
Dave Bittner: That is how Facebook approached the bias argument. Rather than establishing a set of standards and using that set of standards consistently, they were very deliberate about not applying more pressure to one side or the other even if one side was much more responsible for disinformation or misinformation. And that, to me, is fascinating.
Ben Yelin: Yeah. It is fascinating, to put it mildly. It's an interesting definition of fairness. And when...
Dave Bittner: Right (laughter).
Ben Yelin: ...You use that metaphor, you can kind of see how it starts to get a little bit ridiculous. I think Facebook and other tech platforms initially tried to do what you said, which is come up with universally applicable rules that would apply to all users.
Dave Bittner: Right.
Ben Yelin: It just turns out that when you apply those rules, certain political actors are punished more than others. I think it was Twitter that realized that when it started to use an algorithm to filter out hateful content, it started to filter out, like, some mainstream conservative activists, commentators, et cetera. And as a result, they had to retract that formula and not use it.
Ben Yelin: So, yeah, I mean, I think it is kind of a backward definition of fairness. You know, I think most of us would agree that you should set standards. Those standards should be very clear and transparent. You should prepare for a bunch of different scenarios that might emerge from those standards and apply the rules accordingly.
Ben Yelin: And when you have things like Facebook content leading to potential genocide overseas or, you know, leading to an insurrection at the Capitol, you should be thinking less about how many actors on a certain political side are getting silenced, and you should think more about whether your platform's rules are being violated in a way that's causing harm to a significant number of people.
Ben Yelin: That would be the ethical consideration for me. Now, I understand why it might not be the ethical consideration for Facebook. Their formula, as I said, makes them a lot of money. They also are very sensitive, I think, to conservative criticism that tech companies are biased against conservatives. And, you know, I think they're going to make decisions accordingly because, as you said, in a divided country, you don't want 50% of the people pissed off at you...
Dave Bittner: Right.
Ben Yelin: ...A hundred percent of the time.
Dave Bittner: Yeah. All right. Well, again, a highly recommended read from both of us. This is over at the MIT Technology Review. It's titled "How Facebook Got Addicted to Spreading Misinformation." We'll have a link to that in the show notes.
Dave Bittner: We would love to hear from you. If you have a question for us, you can call in. Our number is 410-618-3720. Or you can email us. It's email@example.com.
Dave Bittner: Ben, I don't know about you, but I am excited, pleased as punch to have Donna Grindle back on the show. She is, of course, one of our favorite guests. I am renewing my membership in the Donna Grindle fan club.
Ben Yelin: Yeah, I think I'm going to order a new card as well. I'm a card-carrying member.
Dave Bittner: (Laughter) Right. Pretty soon, she's going to have to start selling T-shirts.
Ben Yelin: Yeah.
Dave Bittner: She is co-host of the "Help Me With HIPAA" podcast. Here's my conversation with Donna Grindle.
Donna Grindle: So we're waiting to see there's an NPRM, or notice of proposed rule-making, for changes to the privacy rule that's sitting out there right now for comment. There was already implementation of interoperability and information-blocking rules from the 21st Century Cures Act. And then January 5, there was an amendment signed into law for the HITECH Act dealing with recognized security practices. All of that's happening right now.
Dave Bittner: Well, let's dig into the HITECH Act. First of all, can you just give us an overview? What is the HITECH Act?
Donna Grindle: So the HITECH Act was signed as part of what we know as the stimulus bill, the ARRA in 2009. And so it was the health care part of that huge stimulus bill. It included several different things, but the one big thing was funding to help push the health care industry towards electronic medical records because it was lagging behind on technology. And it became known at that time as the Meaningful Use program.
Donna Grindle: And if you were a certified EHR - so all of these vendors jumped into the market to become a certified EHR because if a hospital or doctor's office implemented one and then proved they were meaningfully using it, then they got funding to help pay for the cost of installing and securing and all of those things. So we're talking thousands and thousands of dollars that were rolling into health care to put these things in.
Dave Bittner: Is that why my kid's pediatrician and my primary care physician started using tablets all of a sudden?
Donna Grindle: Yeah.
Dave Bittner: (Laughter).
Donna Grindle: Really, a lot of that goes back to that, yeah.
Dave Bittner: (Laughter).
Donna Grindle: You know, it just - the whole industry started moving. Whether they were - the Meaningful Use program applied to them or not, now the industry standard was electronic medical records.
Donna Grindle: Once that kicked in, another part of it was saying, OK, we're going to stiffen up the rules for privacy and security. We're going to add enforcement, which was never really part of HIPAA. I mean, technically there was, but it was voluntary compliance - we kind of call it, like, a speed limit. It's a really strong suggestion. And so they had changed that. That's where HITECH added the enforcement. Everybody yells about the $1.5 million today - that's where it came from, as part of the HITECH Act. And that actual enforcement piece is what got the amendment in January 2021.
Dave Bittner: Well, take us through the amendment. What's in there?
Donna Grindle: It's pretty simple. It just says if you can prove reasonably - everything's reasonable and appropriate under HIPAA and HITECH. But if you can prove that you have been following what they define as recognized security practices for the previous 12 months, then OCR must take that under consideration. When they're doing an audit on your compliance program, they should consider that a reason to give you an early favorable end to your audit. They should consider it if there is a violation and they're looking at the penalties that would apply. And they should consider it if they are going to do a settlement with you as far as the settlement terms.
Donna Grindle: So basically, it's a carrot instead of a stick, that if you'll do these things, we're supposed to take it into consideration. Downside is there's no guarantees, and there's no definition of what consideration means. But, you know, first, can you show you've been doing this for 12 months before that conversation even comes into play?
Dave Bittner: So the clock starts whenever you implement the things that meet these requirements.
Donna Grindle: Right. And so they define the recognized security practices. They specifically mention the NIST Cybersecurity Framework, which, you know, covers critical infrastructure - health care technically falls across two or sometimes three of those sectors, depending on how you count. And that's been out there since 2014, and it got updates in 2018. So it's been out there a while, and there have been a lot of discussions.
Donna Grindle: I frankly have always kind of expected that to be where HIPAA goes - in that direction, towards the Cybersecurity Framework. 'Cause when you think about it, the security rule hasn't been updated since 2005 - zero changes. The good news is that it's been that flexible. The bad news is that was before we even had iPhones, you know, if you think about how different things are. So that's where these things help.
Donna Grindle: So the NIST Cybersecurity Framework, and then the other thing they specifically mentioned was the 405(d) program, which is the Health Industry Cybersecurity Practices that were produced out of the Cybersecurity Act of 2015. And I don't know how familiar you are - most of that act was governmental kinds of things, you know, making sure we had a trained workforce, making sure we had funding to secure national security. But clearly, we've still got issues there - SolarWinds.
Donna Grindle: But there was this other section. And in one section, the only industry singled out was health care. That's quite telling - that we needed to improve cybersecurity so much that health care was singled out in the national Cybersecurity Act. And that final piece - there were a lot of pieces, but 405(d) is the piece that developed a guide designed to help health care entities implement cybersecurity. And it focused on five threats and said, we can give you 10 - you know, we can't call them best practices 'cause lawyers.
Dave Bittner: (Laughter).
Donna Grindle: And you can't call them recommendations. We have to call them considerations, all of those kinds of things. But if you do these 10 practices with their associated subpractices, it would then help you mitigate these five overarching threats, which are phishing and social engineering, ransomware, loss or theft of devices, insider issues - both accidental and malicious - and then connected medical devices.
Donna Grindle: So it's really great. Because it cross-references to the NIST Cybersecurity Framework, I kind of see it as letting you have a health care guide to implementing NIST. And that's the way we look at it. And, in fact, I joined the 405(d) committee - they're continuing to update and develop things through these volunteer task forces. And so I joined it in 2019, very excited about the things that, you know, were going to come out in 2020, but we all know how that went.
Dave Bittner: Yeah.
Donna Grindle: But the 405(d) lets you pick - it functions a lot more like the CIS20, where you - are you a small organization, a medium organization, a large organization? And then this is the way you should approach it based on the size and complexity of your organization. So we're big fans of that, you know, being on the committee and all. It does make me spend a lot more time with it. But we're recommending that people use that. If you're not already down the path with the NIST Cybersecurity Framework, get the HICP - H-I-C-P 'cause nerds.
Dave Bittner: (Laughter).
Donna Grindle: And there's several guides. And a lot of things are going to be added in there now that it's part of these recommendations. You know, now we're a recognized security practice. Everybody's like, we better sit taller in the chair. But that's a huge thing, you know, but they have to, of course, define how they're going to implement it, define how they're going to do these things.
Donna Grindle: But still, HIPAA for all these years has been like the menu of security things you need to address. You need to address all of these things, passwords and network security and inventory and all of that. But it's just the menu. It doesn't tell you how to do it. And we look at the cybersecurity, the 405(d) and NIST as, here are the ingredients and some selected recipes to create the things you need to have on your menu.
Dave Bittner: Now, there's some stuff going on when it comes to patients' rights to access their own medical records.
Donna Grindle: Oh, yeah.
Dave Bittner: Can you bring us up to date on the activities there?
Donna Grindle: A bunch going on. I've never seen them on an enforcement tear like they've been with this. So they announced in 2019 they were going to start an enforcement initiative to ensure patients' right to access their records was being met. And that was defined in HITECH with very specific rules. And they were getting so many complaints. And they tried outreach, and they tried all this. So they finally said, we're bringing down the hammer, and they've had 16 cases that they've resolved - that's more than they normally do in a single year across all cases.
Donna Grindle: So they did the first one in 2019, and then when they were finally able to open things back up and start their enforcement actions again, you know, after the big COVID disruption, in September they announced five in one day. Never seen them do that. Then October, November, December, January, February - we've had cases announced, enforcement actions, every month. Never seen them do that.
Donna Grindle: So here's the big deal. In most of these cases, there are really two areas where there are failures. There's a reason that this is a patient right that they really want to push. If we look at the whole rebuilding of the U.S. health care system, it involves more patient engagement. And if patients can't get access to their records, it really becomes hard for them to be engaged in the process, right? Everybody's like, what are you writing down over there? What are you writing down?
Dave Bittner: Right.
Donna Grindle: So now they're making it clear that everybody needs to follow the rules. And right now, as the rules are, it starts from the time a patient makes a request - that shouldn't be super hard to do. It's not like I should have to go out and get a notary and all this other stuff to prove I'm me.
Dave Bittner: (Laughter).
Donna Grindle: I shouldn't have to jump through hoops. But - yeah. But at the same time, I should be able to prove I'm me and it's not just, you know, some random person calling up and wanting it - so, you know, a balance, you know, but I still don't need to have character references and stuff. So, one, it shouldn't be hard for me to request them. When I do request them, then you should send them to me in the form and format I request if you can create them.
Donna Grindle: There's been this habit for years that, even though records were now stored electronically, I'm still getting a big stack of paper when I want my records. And everybody's like, yeah, no. So the industry is trying to figure this out. You know, hey, you can't just keep doing what you've always done when you have new things. New toys mean new things. And I should be able to request an electronic copy of my records - and generally, it's going to be a PDF, but it's not going to be me getting a stack of paper that I need to scan to get it to somebody else.
Dave Bittner: Help me understand here, and forgive how naive this question is. But do I have a master medical record? Is there one record, or are my records scattered about? And if so, why don't I have a master medical record?
Donna Grindle: No, you do not. They are scattered about, scattered to the wind.
Dave Bittner: (Laughter) OK.
Donna Grindle: And that's why we always say, you can cancel a credit card; you can't cancel your medical record. So medical identity theft is a real problem. People don't understand it until it happens to them. But if I were to, you know, get your information and go and file your insurance and say that I'm you at a hospital in another state and all of my records get in there, and then you end up, say, in a car accident in that state at that hospital, they'll say, yeah, we've had them here before. And they're going to use my blood type, my - you know, if you're not awake enough to know it. So it can be quite dangerous. Generally, it's financially devastating because people ditch on these - there are horror stories, let me just say.
Dave Bittner: Sure.
Donna Grindle: But that's why you can't cancel them - because there's not one main one. The reason there's not one main one is that we don't have a main health care system. Once you get to Medicare, now, you know, the Medicare systems know, but it's still spread out. Yeah. So we don't have a main system. And there are all of these HIEs - health information exchanges - popping up, and my privacy radar goes off, and I freak out over them.
Dave Bittner: You know, for me, it's frustrating even if I have to fill out a medical history for something because...
Donna Grindle: Yeah.
Dave Bittner: I'll say to myself, well, you know, I - gosh, you know, I had that minor surgery 20 years ago, but I don't remember the name of the doctor. And I don't - you know, if only there were a place I could look all this stuff up. But it just doesn't - it just seems like in this modern age in which we live, we should be farther along than we are.
Donna Grindle: Well, that's because, I mean, honestly, health care - until, like, 2013, most people didn't have electronic medical records, so they're still evolving. So you've got to keep that in mind - you know, you're really not even 10 years down the path of the majority of health care information being electronic. So with that in mind, now we also have - remember all those vendors I told you about that rushed to make them?
Dave Bittner: Right.
Donna Grindle: They have all these requirements to be certified health IT, but then they essentially created silos because there was no original thought of, how are we going to exchange this information? And that's where the 21st Century Cures Act comes in - its information-blocking rule defines what information-blocking practices are and says you shouldn't do them.
Donna Grindle: So it creates a little confusion. You know, we're going to start information-blocking. It becomes effective April 5. So you want me to start blocking then? Or, you know, it's confusing. But what it says is there are all these things that, again, allow patients better access and better control over their records. And that's what's intended in there - is interoperability to be able to say, hey, you know, I want a central record, or, I have my primary care, and everything's there. So you just shoot it over here, will you? You know, just (vocalizing). I should be able to do that.
Dave Bittner: Right.
Donna Grindle: That's what the intent of the 21st Century Cures Act is. You know, and we know about those things. There's intent.
Donna Grindle: We don't know where we're going just yet. But, I mean, it's promising.
Dave Bittner: Well, that was my next question. I mean, are you optimistic? As - you know, as you look towards the horizon, do you feel like people in good faith are trying to send us in the right direction here?
Donna Grindle: In a lot of ways, yes. I mean, certainly, COVID has created a lot of barriers and confusion. I mean, think what the health care industry has been through - either they have been overwhelmed, or shut down and then overwhelmed, or, you know, asked to limit the number of patients they can see, but, oh, by the way, you don't get to control your pricing. Insurance companies do, really and truly. You know what I mean? That's why I always love that - it's a capitalist - no, it's not. The insurance companies are deciding, not the providers of care.
Donna Grindle: But I see that there are some attempts to limit - you know, just by the recognized security practices. OK, now, I'm not just going to say you have to do this or I'm going to charge you a lot of money. It's, these are really the things you should be doing, and we'll reward you for doing them.
Donna Grindle: And same thing with - it's unfortunate that they're having to do so much of the enforcement on patient records. They're supposed to get them to you within 30 days of your request, and people are waiting a year in some cases. And sometimes it's just, like, one lab report that they really need isn't included.
Donna Grindle: The bad news for people that aren't meeting that 30 days is there are proposed changes to the privacy rule that would shorten that to 15 days. And we don't know exactly what's going to happen with those. They're out for comment right now in the Federal Register.
Donna Grindle: You know, there's a lot of debate in the industry about whether things are just going to sit for a while. Are all of these things going to get delayed, or are they going to go ahead and be implemented? Just - we're in a transition in the middle of a pandemic. And on top - you know?
Dave Bittner: I want to log on to a website, Donna. I want to log on to a website.
Donna Grindle: (Laughter).
Dave Bittner: I want to see all my medical record for my whole life. Just let me log on to a website. Why is that so hard?
Donna Grindle: 'Cause I don't know where your data is. I probably can't find it.
Dave Bittner: (Laughter) Oh, man.
Donna Grindle: (Laughter) So, yeah, there's a lot really up in the air. And I'm anticipating between now and June, a lot's either going to get pushed out - I mean, look at it: the comment period for the proposed rule-making on privacy changes ends March 22, and the information-blocking rule is supposed to become effective - it was November, and they put it off to April 5. And then we've got the newly signed recognized security practices sitting out there - it's part of the law.
Donna Grindle: So there's a lot that either needs to be pushed out or is just going to start happening because of the, you know, time frames that are built into the law with this. So it's going to be - it's really interesting to see, you know, 'cause there's just so much to overcome. I mean, the most important thing that health care's dealing with right now is getting vaccinations.
Dave Bittner: Right.
Donna Grindle: You know, both within the industry - and believe me, there's a lot of people that are mistrustful of the vaccinations, even working in health care.
Dave Bittner: Right.
Donna Grindle: And I'll stick my arm out there right away. Go ahead. Poke me. I don't care.
Dave Bittner: Yeah. Me, too.
Donna Grindle: I'll try it. I'll try anything (laughter).
Dave Bittner: Yeah. Yeah.
Donna Grindle: So I'm a fan of patience with all of this stuff, and understanding, instead of just kind of bringing the hammer down, because these are real fears that people have, real concerns. You know, it's not trying to buck the system with most people.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: Always good to hear from Donna and just to hear her perspective on the evolution of HIPAA, how far we've come...
Dave Bittner: Yeah.
Ben Yelin: ...Since the 2009 ARRA, when there was this whole question of meaningful use for electronic health records and how that's evolved with the HITECH Act and new statutes. I just always love hearing Donna's voice and how she can explain things so clearly and concisely.
Dave Bittner: Yeah. I have to say I'm really glad that we have folks like Donna out there who are willing to take the time to dig into this stuff and then translate it, you know, for mere mortals like us - well, you're a lawyer, so you're not a mere mortal, but I am (laughter) - so we can better understand.
Ben Yelin: Yeah, she does it so we don't have to. Yeah.
Dave Bittner: (Laughter) There you go. There you go. So again...
Ben Yelin: Even lawyers don't like reading HIPAA regulations, except, perhaps, Donna Grindle.
Dave Bittner: OK, sure (laughter). All right. Well, again, our thanks to Donna Grindle from Kardon for joining us. We do appreciate her taking the time.
Dave Bittner: That is our show. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.