Helping us understand HIPAA.
Donna Grindle: There is a perception that everybody worries about HIPAA. No, they don't. What they worry about is patient confidentiality.
Dave Bittner: Hello, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hi, Dave.
Dave Bittner: On this week's show, Ben shares a story about surveillance in public schools. I describe a case going before the U.S. Supreme Court on whether or not a state can put its laws behind a paywall. And later in the show, I speak with Donna Grindle. She's the founder and CEO of a firm called Kardon, and she's one of the hosts of the "Help Me With HIPAA" podcast. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors.
Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? And it's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They're out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk. And trust us, friend, they're not pretty. So again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question, either.
Dave Bittner: And we are back. Ben, you want to get things started for us this week?
Ben Yelin: Sure. So my article this week comes from The Guardian, a British publication, but do not hold it against them. It is a...
Dave Bittner: (Laughter).
Ben Yelin: ...Very interesting article about surveillance at U.S. public schools.
Dave Bittner: OK.
Ben Yelin: So part of this article doesn't relate to digital surveillance. It's about how some schools are forcing students to carry clear backpacks. And obviously, this comes out of concern for school safety based on incidents of gun violence in many schools across the country.
Dave Bittner: Right.
Ben Yelin: But a huge portion of this article is about digital surveillance at public schools and how incredibly pervasive it is. There is real-time monitoring on school-issued laptops and other school-issued devices. And oftentimes, students don't have any meaningful choice in whether to use those devices. Even if they're not using school-issued devices, sometimes they have to use school software, so there might be an online assignment. They have to log into some sort of Google portal. They can be monitored from inside that portal. And then I think most strikingly to me is that public schools now are using technology to monitor students' social media accounts - so Facebook, TikTok, whatever the students are using these days...
Dave Bittner: Yeah (laughter).
Ben Yelin: ...Probably not Facebook. That's more of a dad and grandma thing...
Dave Bittner: Yes.
Ben Yelin: ...To basically do a threat assessment on students at the school. So this is something that parents and students are getting used to. It's a relatively new phenomenon. We are in the infancy of an age where a lot of student work is being done on digital platforms. But I think there are major potential privacy concerns here, even though we're talking about kids - people who are under 18, who we generally don't think of, you know, as having the same sort of privacy interests that adults have.
Dave Bittner: Right, and also within their schools because I think that's sort of a separate category also. In other words, at your school, you know, the principal always had the right to search your locker.
Ben Yelin: Absolutely. And there's actually a Supreme Court case that governs just this issue. So it was a 1980s case, New Jersey v. T.L.O. This was about confiscation of cigarettes and illicit drugs in a student's backpack. And it set the standard for searches of public school students. So the holding that was beneficial to public school students is that the Fourth Amendment does apply to kids in a public school, which is really interesting.
Dave Bittner: How would that play out?
Ben Yelin: The test is diminished, so it doesn't play out the way, you know, the test plays out in a non-public school setting where...
Dave Bittner: OK.
Ben Yelin: ...Generally, for a search to be reasonable, law enforcement has to get a traditional warrant to conduct that search.
Dave Bittner: OK.
Ben Yelin: What the Supreme Court has said is that wouldn't be reasonable in a public school. We're talking about kids here. Schools have their own policies. So, you know, they have an interest in maintaining safety and maintaining enforcement of those policies. So they don't have to really abide by those strict standards.
Dave Bittner: OK.
Ben Yelin: So they set sort of a lessened standard, and it's a two-part test. Any search of any student property has to be, A, justified at its inception - so based on something, can't be suspicionless - and, B, reasonably related in scope to the original purpose of the search. So what that means is teachers, principals, whatever, cannot go on fishing expeditions and just kind of scrounge through all the students' belongings.
Dave Bittner: Right. I can't walk into a classroom and say, everybody, empty your backpacks out on your desks.
Ben Yelin: Right. Exactly.
Dave Bittner: But if a kid comes to me and says, hey, I think Bobby's got a knife in his backpack, now I've got a reasonable reason to...
Ben Yelin: Yeah. I mean, that kid might get a swirly later in the day...
Dave Bittner: (Laughter).
Ben Yelin: ...But for the time being, yes. And that's not something I'm encouraging. No bullying. But for the time being, yes. As long as there's any small level of suspicion, the school would be justified in conducting that search.
Dave Bittner: OK.
Ben Yelin: Now, this gets very tricky when we're talking about searches of digital devices. When we're talking about school-issued laptops or school-issued devices, there's really a greatly lessened expectation of privacy. Students probably should know that since it's a school-issued device, it's being constantly monitored. That's just sort of the nature of the beast.
Dave Bittner: Yeah.
Ben Yelin: But when we talk about social media monitoring, that's interesting to me because, to me, this is being done on a pretty broad scale, and it's largely suspicionless. I mean, I don't know exactly how it works, but - and it's not like they're always receiving an indication that some particular student has written something problematic on social media. It seems more generalized, like they are searching social media accounts to get a better assessment on what the threats are to that particular school. And that's where it could get problematic and could potentially run afoul of some of the case law. There has been very limited case law in this area as it relates to digital devices. So far, everything that's out there relates to individual searches.
Ben Yelin: So, like, can a teacher - if he or she sees a student reading their email in class or checking their Facebook in class - can they be like, hey, Timmy, hand over your phone, I'm going to look at it? Some courts have said that that's generally acceptable, especially if a school has a policy banning cellphones. Those cases are one thing, but when we're talking about something pervasive where there's sort of a generalized policy of surveillance, that's when I think things get more complicated and potentially problematic.
Dave Bittner: Yeah, that's interesting, especially surveillance outside the classroom...
Ben Yelin: Right.
Dave Bittner: ...In the public square. Is it fair to say that these social media platforms where these students are posting things publicly, is that the public square? And what I think of is, you know, would a school have an interest if one of the students went to the true public square and started yelling out threats to other students or to the principal?
Ben Yelin: Right, if it was on private property or something. You know, it's really hard to say. Would they have constitutional First Amendment rights in that speech? If it's off school property, probably, unless there was some sort of direct threat that would cause imminent lawless action.
Dave Bittner: But we're in the real world (laughter). You know, like, if I'm the principal of that school and I get a notice or, you know, somebody says that, you know, little Bobby or Timmy or Jane were yelling out threats to me in the public square, I'm sure going to have a...
Ben Yelin: Somebody's going to the principal's office.
Dave Bittner: ...I'm going to have a conversation with that kid's parents, right?
Ben Yelin: Right. And that's probably what happens in the real world.
Dave Bittner: Yeah.
Ben Yelin: What concerns me about social media is perhaps, especially when we're talking about high school students, there might be a desire to be interesting, provocative. It is a venue to express thoughts.
Dave Bittner: Right.
Ben Yelin: And even if those thoughts are controversial, they are being done in a non-school setting. And so would that potentially have a chilling effect on an entire generation of children? If it's not just those school-issued laptop devices, and there's a sense amongst students that they're being monitored everywhere, does that change some of the marketplace of ideas that exists among public school students? You know, this article talked about how the school discovered that one of its students - I think it was a 10-year-old - had been searching online about gun violence in schools. School discovered it through this use of surveillance and brought that issue to the principal. And that student's parents said, look, he has an academic interest in school shootings. He was doing legitimate research. Are students going to have second thoughts about conducting meaningful research just because it's on a potentially controversial topic? It's hard to tell.
Dave Bittner: And I think about this a lot. (Laughter) It makes me anxious. The anxiety of living in the panopticon...
Ben Yelin: Right.
Dave Bittner: ...Right? - of knowing that there's someone always looking over your shoulder, to wonder, am I ever really in a situation where my privacy is assured? If we're raising children with that expectation rather than the expectation that if I need to take a break, if I need to take a walk, if I need to take, you know, a lap around the school hallways to clear my head...
Ben Yelin: Right. You can have some solitude...
Dave Bittner: Right.
Ben Yelin: ...Privacy.
Dave Bittner: But that doesn't exist. The other side of that is I totally understand, given where we are today, particularly in this nation with gun violence, that parents, teachers and students want to make use of every available tool to try to protect themselves and their children.
Ben Yelin: Absolutely.
Dave Bittner: That impulse is crystal clear.
Ben Yelin: Yeah. And that's the context of this entire story is that...
Dave Bittner: Yeah.
Ben Yelin: ...This is not happening in isolation. It's responding to a very real threat. This is your classic balancing task. You really have to balance ameliorating this very serious security threat to our students and giving students a little bit of autonomy, a little bit of space to be creative and to have private conversations in a venue where they don't feel like they're constantly being monitored.
Dave Bittner: Right.
Ben Yelin: And instead, it's just becoming sort of a fact of life. You know, one of the striking things to me about this article is that so many students seem to just have an accepting attitude towards it - that I understand that this is the way it has to be. I have to wear a clear backpack. I know my electronic communications are being monitored. This is the age of Parkland. This is the age of Columbine, you know, and I completely understand that impulse. And courts have as well. Public schools carry a special place in our constitutional jurisprudence just because we are talking about kids. Their brains are smaller. They do stupid things. So we need to check on them more.
Dave Bittner: Right, right. They're still figuring it all out.
Ben Yelin: Exactly.
Dave Bittner: Yeah, and they're impulsive.
Ben Yelin: They're very impulsive.
Dave Bittner: Yeah.
Ben Yelin: So, you know, you understand having a lessened standard in order to conduct these searches. But when it becomes so broad that it could potentially be having an effect on an entire generation, that's where I would start to get worried.
Dave Bittner: I have seen stories that we've covered over on the CyberWire about some of these systems that try to use artificial intelligence to monitor audio. So in other words, there's a computer that's always listening. There are microphones all over the school - with the cameras come microphones - and they're analyzing for potential disturbances - for raised voices, for yelling, things like gunshots.
Ben Yelin: Right.
Dave Bittner: But even hot arguments and looking for things that could potentially be getting out of control. And you and I certainly have talked about the reliability of things like facial recognition and artificial intelligence.
Ben Yelin: Right. They are not as reliable as they should be and are probably far less reliable than straight-up, scary principal roaming the hallway, making sure kids aren't up to anything.
Dave Bittner: I also wonder, does this push those conversations farther into the shadows? And do we want that? Is it good that sometimes kids work things out in front of each other, you know, in front of us as adults? You know what I'm saying? (Laughter).
Ben Yelin: Yeah. And where - you know, where are they going to go in the absence of being in a school hallway - which is probably the safest place for some of these things to develop - as long as we're not talking about actual violence. But, like, conflict between students, you'd rather have that happen in a place where there are teachers, and there are counselors, and there are principals.
Dave Bittner: Right. There are people who can help them work through these things.
Ben Yelin: I mean, I sort of think about there are rules in a lot of our local malls that students under 18 are not permitted to be there after 5 p.m., you know, on weekends. And it's sort of like, I understand the impulse there. Kids can get crazy at malls. They can be very annoying. I've been there in those situations.
Dave Bittner: I've been one of those annoying kids (laughter).
Ben Yelin: Me, too. But where are they supposed to go? And if they're not going to the mall, they're probably going to be going someplace more dangerous and someplace that's less protected, less safe, less monitored.
Dave Bittner: Yeah.
Ben Yelin: So you do have to try and strike that balance, you know? I would certainly understand - and I am a parent now, and you are a parent - when we send our kids to school, we have an expectation that they're going to be safe. And when that threat is in the back of our minds, you can almost justify anything. And I really do understand that impulse. It's just, do you take it too far when you have things like artificial intelligence with audio or constant video surveillance, constant online surveillance, where kids aren't allowed to be kids?
Dave Bittner: All right. Well, let's move on to my story this week. This is a completely different topic but one that is also quite interesting, I think. This is a story from Ars Technica written by Timothy B. Lee. And this is about a case that is going in front of the U.S. Supreme Court about the state of Georgia putting their laws behind a paywall. And basically, what it comes down to is whether or not these sorts of things are in the public domain or not. And there's some nuance here. I'm going to allow you to sort of unpack it and describe it because this is something you understand better than I do. So what's going on here, Ben?
Ben Yelin: So the state of Georgia works as part of an official agreement with LexisNexis. When they post the official state code online, that code is annotated, and it's annotated by LexisNexis. Now, LexisNexis, as a private company, wants to collect revenue for people to use its services.
Dave Bittner: Right.
Ben Yelin: So these annotations, as part of this agreement with the state of Georgia, are behind a paywall. You have to pay to read the state code for the state of Georgia if you want to have these annotations. And oftentimes, these annotations are just as relevant as the statutory language.
Dave Bittner: Break that down for me. So what's the difference between the two things?
Ben Yelin: There's the statutes themselves, which just - it's plain language. The law prevents you from doing X. This is what the penalty is.
Dave Bittner: OK.
Ben Yelin: Annotations give background, context, legislative history, references. It's sort of putting a fuller picture together about the justification of that law.
Dave Bittner: OK.
Ben Yelin: Most code that lawyers read is annotated code because that does have that added context. It's not just the plain language on the page that's seemingly handed down on tablets.
Dave Bittner: OK.
Ben Yelin: What the state of Georgia and LexisNexis are saying - and this is what they said in the oral argument to the Supreme Court - is there is a public copy of the Georgia Code. It is online. It is available to the public. However, it just does not contain those annotations.
Dave Bittner: Now, we should say that this all came to a head because there was an organization called publicresource.org, which is a nonprofit, and they publish public domain legal materials.
Ben Yelin: Right. And...
Dave Bittner: So they were putting the annotated code out, publishing it publicly. And that's where they ran afoul of LexisNexis.
Ben Yelin: Right. So that's the nature of the dispute. They obviously have an interest as a public organization in getting the full annotated Georgia Code out there available to the public. That's part of their mission. Georgia sued them because they have this agreement with LexisNexis, and that's how the dispute got up to the Supreme Court. So LexisNexis and Georgia are saying that they post a free version of the code online without the annotations. People can look at it even if you are not a subscriber to LexisNexis.
Ben Yelin: Now, in reality, most lawyers, through their firms or through whatever institutions they work at, probably have LexisNexis access anyway. So this is more - this more applies to the general public. And, you know, is the general public going to understand Georgia Code, you know, annotated or unannotated? I think that's also a reasonable question. But annotations - because they add context and interpretive language - they are part of the law of the state of Georgia, and you want the public to have complete, full notice as to the laws that they're subjected to. And because of this paywall, unless you purchase that LexisNexis subscription, you do not have access to the official law of the state of Georgia.
Ben Yelin: And, you know, I think that's something that's problematic not just to this advocacy group that brought the lawsuit but brought on some skepticism at the Supreme Court. Justice Gorsuch seemed to be sort of very skeptical of the state of Georgia's argument. He didn't think it was appropriate to hide the official law of the state behind a paywall, and I think there's a lot of merit in that argument.
Dave Bittner: Where did these annotations come from? How did they get generated? And how did they become official?
Ben Yelin: So most annotations come from these services themselves. So these legal online libraries like Westlaw and Lexis will generate these annotations based on their own research of secondary sources, attorney general interpretations of opinions, legal forms, et cetera. And they'll also comb through the statute to try and include specific references to court cases that reference that language in the statute. So if you're reading the statute, you're not only getting that language, but it'll say in the annotations, this was interpreted in So-And-So v. Georgia in 1985, and the court held X.
Dave Bittner: I see.
Ben Yelin: So in fairness to LexisNexis, they are doing the work here of annotating this code.
Dave Bittner: Right. And there's value added there.
Ben Yelin: There is absolutely value added. And certainly, I understand why they would want to be compensated for it.
Dave Bittner: Right.
Ben Yelin: But when the annotated code actually becomes the official law of the state of Georgia and you cannot understand the laws that you are subjected to as a Georgia citizen without seeing those annotations and that full context, then you're sort of depriving citizens of their rights to know exactly what laws they have to comply with. The public, per legal doctrine, is generally presumed to have knowledge of the law. In other words, if you're accused, particularly in a criminal offense, it is almost never a defense to say that you were not aware of the applicable law. And that becomes problematic when the applicable law is hidden behind a paywall. I don't think you can both have this inability to use mistake of law as a defense and laws that you'd have to pay to read or discover.
Dave Bittner: Now, does this ultimately sort of come down to commerce? I mean, it seems to me like perhaps the state of Georgia, for example, could provide access to LexisNexis through their library system. If Georgia, through the collection of taxes - right? - paid LexisNexis to give all the citizens of Georgia who have free library cards access to LexisNexis, then are we good here?
Ben Yelin: I mean, that seems to be the most obvious solution.
Dave Bittner: Yeah.
Ben Yelin: This might not be as beneficial to LexisNexis. Their entire business model is working with states to annotate the code and then forcing people to pay to access that code.
Dave Bittner: Right, which for a law professional makes total sense.
Ben Yelin: Absolutely.
Dave Bittner: Yeah.
Ben Yelin: But if there's an overarching interest in putting the annotated code in the public domain, perhaps that interest supersedes whatever business model LexisNexis has come up with.
Dave Bittner: Yeah.
Ben Yelin: And maybe there is a way where the state of Georgia and LexisNexis can sort of separate the two works, and that's something that's mentioned in this Ars Technica article. You could have Georgia publish an unannotated version of the state code. That's in the public domain. LexisNexis could independently produce and publish an annotated guide, keep the copyright, et cetera, et cetera. What the state of Georgia says to that argument is that the statute and the annotations are so closely related you'd need to have this intermingling of the statute and annotations to finance the creation of these annotations in the first place. And so is that something that LexisNexis would agree to if there was a different financing arrangement? I'm just not sure.
Dave Bittner: Yeah. All right. Well, the Supreme Court is going to have their say (laughter). We'll keep an eye on that.
Ben Yelin: Yeah. I mean, we'll see if they - they could come up with some sort of broad decision that says state-published legal documents, whether they're annotated or unannotated, are always in the public domain, meaning they cannot be copyrighted, whether or not those annotations are legally binding, as they are in this case. Based on what we've seen at oral arguments, I think it's possible that they could issue such a sweeping decision.
Dave Bittner: All right. Well, it is time to move on to our Listener on the Line.
(SOUNDBITE OF PHONE DIALING)
Dave Bittner: Our caller this week is from Atlanta. He has an interesting question, something I've wondered about myself. Let's have a listen.
Ted: Hey, Ben and Dave. This is Ted (ph) from Atlanta. I hate speed cameras. I find the whole notion of automated law enforcement creepy. My question is, how can it be that if a police officer pulls me over for speeding, it's a moving violation and I can get points on my license, but if a speed camera nabs me, it's just a civil violation? How can the same crime be different depending on how it's documented?
Dave Bittner: OK. What do you think?
Ben Yelin: It's a very good question, Ted from Atlanta. As somebody who has - I hate to admit - received several of these speed camera violations in the mail...
Dave Bittner: (Laughter) Bit of a lead-foot there, Ben?
Ben Yelin: Yes. This is of interest to me.
Dave Bittner: (Laughter) OK.
Ben Yelin: Really, this is a public policy decision. The state has decided that in order to protect public safety, it's worth it to have these speed cameras.
Dave Bittner: Right.
Ben Yelin: But these speed cameras aren't as reliable as an officer, you know, using a radar gun, pulling you over, issuing a citation, putting points on your license, et cetera. They're saying it's valuable enough for us to have these speed cameras, but in fairness to drivers, because this technology is not as reliable, because there have been false positives, the penalties are going to be less severe. The civil penalties are going to be smaller. They're generally, here in Maryland, in the range of, like, $30 to $40. And this won't go on your permanent driving record.
Dave Bittner: Right.
Ben Yelin: I think that just has to do with the state sort of admitting that these speed cameras have been problematic in the past. They've issued speeding citations where they were not merited. The process for contesting these violations is so cumbersome that your average person just does not have the time and resources to really contest it. In terms of the resources it requires the state to use for a traditional traffic stop, you can understand why the consequences would be more severe.
Ben Yelin: If you are driving sufficiently recklessly that a police officer who is observing you with his or her eyes or a radar gun is able to detect that speeding or reckless driving and pull you over, then the consequences should be more severe. But if it's this artificial technology where they're simply taking a picture if they sense that the driver is driving above the speed limit, then there should be that lessened penalty. So I think it's really just sort of a public policy decision that state and local jurisdictions have made.
Dave Bittner: Sounds like a workaround to me, Ben. I don't buy it (laughter). So let's get to the core question here, which is, how can you have two different penalties for the same crime, depending on how it is documented?
Ben Yelin: It's a great question. I think the realistic answer is speed cameras catch more people than police officers do. If, let's say, the state legislature or the local city council decided to issue criminal penalties, enhance sanctions, points on your license, et cetera, et cetera, there would be a public outcry because...
Dave Bittner: Ah (laughter). Right.
Ben Yelin: ...Very few people can be pulled over by law enforcement. Just as a matter of pure resources, it is very expensive to put cops on the beat, to put transportation police on the beat. There's a limited number of police cars that can be out on public highways. It's pretty cheap and pretty easy for a state government or a local government to simply put up a pole with a camera. You do it once. You have to maintain it. They catch tons of people.
Dave Bittner: Right.
Ben Yelin: Can you imagine the public outcry if all of us, me included, were constantly getting these severe penalties, points on our license, threats to revoke our license every time we were caught going 40 in a 35? There would be a public outcry, and we wouldn't have these speed cameras to begin with.
Dave Bittner: Yeah.
Ben Yelin: And I think legislators know that. Anecdotally, I've heard a lot of people complain about speed cameras because they're sort of too good at what they do.
Dave Bittner: (Laughter).
Ben Yelin: Like, a law enforcement officer might not be able to detect that you're going 5 mile - you know, 5 miles an hour above the speed limit.
Dave Bittner: Right.
Ben Yelin: That's hard to observe with the naked eye. These speed cameras can detect that. And, you know, I think that's very frustrating for people.
Dave Bittner: Yeah.
Ben Yelin: So we'd all get caught. All of us would get points on our license. The state of Maryland, you know, might have fewer drivers on the road, for better or worse, because they would have revoked our licenses already. So I think that's just something that state and local governments have recognized and have reckoned with. If we're going to have this separate system which is very good at detecting speeding, that issues a ton of citations, the punishment can't be that severe; otherwise, there would be this significant public backlash.
Dave Bittner: Yeah.
Ben Yelin: I will say, there are few things more frustrating in life than going through an intersection, sensing that you've seen a flashing light and then that realization comes over you that's like, shoot...
Dave Bittner: (Laughter) Yeah.
Ben Yelin: ...They caught me.
Dave Bittner: Yeah.
Ben Yelin: In three weeks, I'm going to get a letter.
Dave Bittner: Yeah.
Ben Yelin: It's going to have a picture of my license plate.
Dave Bittner: Right.
Ben Yelin: Yeah.
Dave Bittner: Right.
Ben Yelin: I sort of have an informal contest going with my wife about these speeding cameras. So we'll see it in the mail, and it'll be like, all right...
Dave Bittner: Which one of us is it?
(LAUGHTER)
Ben Yelin: Yeah, which one of us is it? Let's check the details.
Dave Bittner: Right.
Ben Yelin: It's like, yep, that was me. I was there.
Dave Bittner: Pay our speed tax.
Ben Yelin: I was going 42 in a 40, and they caught me.
Dave Bittner: All right.
Ben Yelin: Yeah.
Dave Bittner: Well, not very satisfying, I understand, but still don't like it (laughter).
Ben Yelin: Yep. And thank you for the question.
Dave Bittner: Yeah, thank you, Ted, for calling in. We'd love to hear from you. If you have a question for us, our call-in number is 410-618-3720. And you can also write us at caveat@thecyberwire.com. Send in an audio file of your question so we can use that on the air.
Dave Bittner: Coming up next, my conversation with Donna Grindle. She is the founder and CEO of Kardon, and she is one of the hosts of the "Help Me With HIPAA" podcast. But first, a word from our sponsors.
Dave Bittner: And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this - we've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is. And if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks - neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses, in a nicely automated way, the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourself to the wolves and don't be nibbled to death by ducks.
Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Donna Grindle. She is the founder and CEO of a company called Kardon, which does a lot of consulting and training and education when it comes to HIPAA. And she's also one of the hosts of the "Help Me With HIPAA" podcast. She reached out to us. She was listening to our show, and she responded - maybe a few episodes ago, you and I were talking about some HIPAA things.
Ben Yelin: Yes, we were. Yes.
Dave Bittner: And so Donna reached out. And she said she had some comments, and I said, well, come on the show because we could certainly stand to learn a little more about HIPAA from an expert. So here's my conversation with Donna Grindle.
Donna Grindle: A lot of people don't realize it, but prior to HIPAA, there really was no medical privacy. Everybody thinks doctor-patient confidentiality was a law, but it really wasn't; it was just kind of an assumed thing until HIPAA was enacted in 1996. But then - even though it was passed in '96, the privacy part of it didn't come into effect until 2003, followed by the code set standards, which are in there, as well as the security rule in 2005. So there's really a lot to the HIPAA law; people mostly just understand the privacy part. But there's a lot there. The original version was called voluntary compliance. So that's kind of what we call, like, the speed limit is voluntary.
(LAUGHTER)
Donna Grindle: So in the stimulus bills - what we most know it as, the ARRA - in 2009, they added - the HITECH Act was a tiny, little piece of that. But what its intent was, was to stimulate the economy - obviously, that's what the intent was - but by providing assistance and funding to the health care industry for implementing electronic records. And as part of that, you had to show that you were - meaningful use requirements, that you were actually using them; you weren't just buying them. And they beefed up HIPAA because they said, hey, we're going to have a whole lot more out there, and we see where we're going.
Donna Grindle: And it added enforcement. It added breach notification, along with some genetic requirements under the privacy rule and those kind of things. And that really changed the HIPAA universe because that's where it really added meat to the business associate requirements, which is what launched my original contact - podcast that talks about privacy law. I mean, that's what I do all day, every day. So I'm like, oh, nerdy stuff.
Dave Bittner: (Laughter) Well, let's dig into that some. I mean, Ben and I, on a previous show, we discussed the HIPAA business associate requirements. Can you lay that out for us? I mean, what are they? And in the real world, how do they play out?
Donna Grindle: Yeah, it's one of those things where when you tell people that you do HIPAA for a living, and their answer is, oh, that thing I signed at the doctor.
Dave Bittner: Mmm hmm.
Donna Grindle: Yeah, that's a little bit of it. But the concept is that those who provide care, provide payment for that care or process those payments are covered entities. And any company that provides a service to those covered entities where the nature of their work requires them to have access to that protected health information - they are then business associates and have to commit to providing the same types of security requirements and the privacy protections that the covered entities do. It's kind of like that chain of custody protection that a lot of people know from the legal shows. The concept is that if your job is going to require you to have this, then you have to do the same things I do to protect the privacy and security of the patient information. So the beauty of the HITECH law was it changed it to actually, for the first time, say that a business associate was separately and equally liable for the protections under HIPAA, which didn't exist before.
Dave Bittner: In terms of this actually playing out in the real world, how does that work? I mean, do organizations find loopholes around these sorts of things? What really happens?
Donna Grindle: Yeah, they try to. There's a lot of that. That's one of the reasons the Office for Civil Rights - the HIPAA police, under the Health and Human Services Department - issued some very specific guidance early on that said it isn't the business associate contract that you're obligated to sign under HIPAA that makes you a business associate. It is the work that you do that makes you one. And there are still people who believe, if I don't sign that contract, then I'm not obligated. Really, what that means is that you're in two kinds of trouble.
Dave Bittner: (Laughter) Go on.
Donna Grindle: (Laughter) Because by not signing the contract and doing the work, you are violating HIPAA right away. And that also means that your covered entity that you are contracting with - or what we call the upstream business associate, because it's a long tail; it doesn't stop just at that first level - those people who are allowing you to do the work without the contract in place, they're violating HIPAA every day. So you're in a double violation every single day. One is you're doing the work that you should have a contract for. And then the second one is, technically, that's a data breach every day they have it because they're not authorized to have it. So that's a big pile of trouble every day that you're doing it.
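[Editor's note: Donna's "two kinds of trouble" logic can be sketched as a toy rule check. The `Vendor` type, its fields, and the function below are purely illustrative - they are not part of any HIPAA tooling or legal test, just a minimal model of the reasoning she describes.]

```python
from dataclasses import dataclass

@dataclass
class Vendor:
    # Per OCR guidance, the work - not the paperwork - determines
    # business associate status.
    handles_phi: bool   # does the work require access to protected health info?
    baa_signed: bool    # is a business associate agreement in place?

def daily_violations(vendor: Vendor) -> list[str]:
    """Return the recurring problems Donna describes for each day of work."""
    problems = []
    if vendor.handles_phi and not vendor.baa_signed:
        # Trouble 1: doing business-associate work without the required contract.
        problems.append("working without a business associate agreement")
        # Trouble 2: holding PHI without authorization is effectively a breach
        # each day it continues - and the upstream covered entity shares it.
        problems.append("unauthorized possession of PHI (a breach each day)")
    return problems

print(daily_violations(Vendor(handles_phi=True, baa_signed=False)))
```

Note that signing the agreement clears the check, while a vendor whose work never touches PHI was never a business associate in the first place - which matches her point that the contract follows from the work, not the other way around.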
Dave Bittner: Wow. Help me understand. We have this flood of devices that are collecting personal data about us. You know, our watches are collecting information about our heartbeats and we're weighing ourselves. And we're - you know, women are tracking their cycles and all of these things that could be considered private medical information. Where does all that fall when it comes to that collection and HIPAA?
Donna Grindle: Nowhere (laughter). It's completely outside of HIPAA. That's one of the problems that we constantly are discussing in the industry because HIPAA, remember, only applies to those that are providing care or payment for that care, so the insurance companies or those providing care. These companies technically can do whatever they choose to put in their privacy policy with your data when you give it to them. Now, the caveat is - and this is that original discussion that you were having with Ben - was that if I as the covered entity, the - let's just - for simplicity, we'll say it's your doctor's office. The doctor wants to use a device to track your heartbeat. OK. And the doctor gives it to you. Now, it's covered under HIPAA because the doctor gave it to you. But if you go out and buy it yourself, there's no connection to HIPAA whatsoever. So that is the key piece.
Donna Grindle: And the discussion you were having was that this health record company had in their privacy policy that they were going to share for marketing purposes, and they were arguing that they weren't a business associate. You know, that's a common argument that we deal with on a daily basis almost. But that case, the particular product you were talking about in the article is owned by an electronic records company. So right away, you know, there's that piece of you already know you're a business associate. They have to be. Your IT companies are a big problem in a lot of cases. Not all of them. Many of them are very good at it. But it's particularly the smaller ones who think that HIPAA just means I got to do some security and sign some paperwork. And it's way more than that where there's that confusion that's just built in on who is one, who isn't one, even who's a covered entity. For example, if you use concierge medicine, you know, where it's pay in cash...
Dave Bittner: Right.
Donna Grindle: ...Or the very popular med spas...
Dave Bittner: Oh, yeah.
Donna Grindle: ...Where you pay in cash, if they never file an electronic claim for your care because you're paying in cash - they won't - they're not covered under HIPAA either.
Dave Bittner: Wow.
Donna Grindle: I know. It's quite tricky just figuring out who's covered and how they're covered and what role they play in the industry as a whole. And the health care industry is a behemoth. I mean, it's huge...
Dave Bittner: Yeah.
Donna Grindle: ...And quite complicated, and it is not getting any better.
Dave Bittner: As the cybersecurity industry heads down this path, the ball got rolling with GDPR as we're heading down this privacy legislation and regulation path. What does your experience with HIPAA and how that has affected a huge industry, what sort of insights or advice do you have for the folks who are at the leading edge of that journey in the cybersecurity realm?
Donna Grindle: First, I mean, I would consider it the bleeding edge because it is very hard. Even in health care today, I have a hard time. There is a perception that everybody worries about HIPAA. No, it's not. What they worry about is patient confidentiality. Yes, most people do worry about that in the health care world. But when it gets down to the intricacies of HIPAA, even teaching my clients - I say you can't look at your own records, and they have a fit. They're like, what do you mean? And I explain, you're only supposed to look at records when it's part of doing your job and only if you're involved in treating the patient, collecting payment for that treatment or it's something that specifically requires access to it to run the business, like an AR report or something like that.
Donna Grindle: If you're not doing one of those things, you shouldn't look at your own records. Well, how am I ever going to see them? Well, you're just like every other patient. You go through the same process. So when we have that at the health care level - and, you know, since - what? - 2003 that rule's been in place, yet they don't even - that didn't change in HITECH. That's the same rule that's always been there since 2003. And you look at - you've got CCPA, GDPR, Texas, Nevada, all of these other areas in the United States. Every state is enacting its own privacy rules. And some of those involve data breach notification, and they're at different timeframes and all these other things. And until there's federal action, we won't have that under control so that you can standardize it.
Donna Grindle: So as cybersecurity professionals, the most important thing you can do is understand that security doesn't make you compliant. So just assuming that if you're doing the security things, you're meeting regulations - and meeting the regulations doesn't make you secure, which is what a lot of people do; they just do a gap analysis of, you know, do I have all of the policies and procedures in place? You have to do both. Use a framework - the CIS 20, the NIST cybersecurity framework or even what health care published just in December. Ironically - this is health care, the regulated industry - it came out of the Cybersecurity Act of 2015. You familiar with the CISA?
Dave Bittner: Mmm hmm. Sure.
Donna Grindle: So in that, it covered all of the federal government, cybersecurity, education, building the workforce. The only industry singled out in the national Cybersecurity Act was health care because they needed more cybersecurity. It is a problem. And as part of that, it's known as the CSA 405(d). There's a task force that met and was involved. They completed the initial pass, December 28, 2018. So it's almost a year. It's called hiccup (ph) because nerds.
Dave Bittner: (Laughter).
Donna Grindle: But it's HICP. If you look for it, it's like protecting patients, a big, long thing and hence hiccup. There's also now HICS (unintelligible), which is a whole nother thing. But that has to do with...
Dave Bittner: They do love their acronyms, don't they (laughter)?
Donna Grindle: I know, right? I love being a nerd, you know? It lets me make up words. That's how we have "Google it."
Dave Bittner: Right.
Donna Grindle: But the HICP guide is designed for small, medium and large companies to be able to take that guide - there's a guide that gives you explanations of five threats that everybody deals with.
Dave Bittner: So is it fair to say that one of the lessons gained from what the medical industry has gone through with HIPAA is that none of this happens overnight. You know, this is a long journey.
Donna Grindle: Yes, very much so, and it's ongoing. It's a process of continuing improvement. It's not a once a year, once a week kind of thing. You need to think about it and live it all the time. So every single meeting, every decision, every thing that you discuss, somebody needs to say, does this have any privacy or security implications or problems, or do we need to do anything about it? It should be part of your discussions, no matter what you're talking about. Well, maybe not lunch.
Dave Bittner: (Laughter).
Donna Grindle: But depending on where you work, it could be lunch if you listen to the stories, you know, of what some of these pen testers are able to do. But you know what I'm saying.
Dave Bittner: So, Ben, I don't know about you, but I am definitely going to subscribe to the "Help Me With HIPAA" podcast just to get to listen to Donna.
Ben Yelin: Oh, for sure. I'm sort of jealous that you got to interview her and I didn't because it was so entertaining.
Dave Bittner: Yeah, she's great. She's great.
Ben Yelin: Donna, if you ever want us to be on your podcast, we are very willing participants.
Dave Bittner: Say the word.
Ben Yelin: We are now part of the Donna Grindle fan club, so thank you for that. I thought you brought up some very interesting points during her interview. I think she gave great clarity on the business associate relationship as it relates to HIPAA. So if the nature of your work requires you as an organization to have access to any health care information, you are a business associate. You have to apply the same privacy and security practices as if you were one of the covered entities. And if there is a breach of that information, you are jointly liable with that health care provider. It doesn't seem like there's widespread knowledge in the industry, especially among organizations that aren't fully operating in the health care realm...
Dave Bittner: Yeah.
Ben Yelin: ...That they are subject to this liability.
Dave Bittner: I wonder how much of that is willful ignorance.
Ben Yelin: I'm sure a lot of it is.
Dave Bittner: (Laughter) I bet Donna has a take on that.
Ben Yelin: Yeah. And one thing that I think she made very clear, which is also interesting, is there's a long tail. You know, these covered entities have a lot of contractors, a lot of different relationships. For various reasons, a lot of organizations, as it relates to a single medical record, are going to at one point have access to that record. And it is a joint responsibility, both in an ethical sense but also in a legal sense, to safeguard that data. Another thing that stuck out to me in hearing this conversation is how helpful it is for health care organizations, covered entities and business associates to have clear guidance. And they have clear guidance because there is this federal statute. And even though as she said that statute has been constantly evolving, it's there. There's one federal law that deals with this area of information privacy.
Ben Yelin: You'd only need one Donna to fully understand the consequences of HIPAA for your organization. When it comes to data privacy in general, as she mentions and as we've mentioned, we don't have that yet because there really isn't a federal statute. And, you know, I think HIPAA actually sets a valuable example of we could have some sort of national clarity, some uniform standards that apply at every health organization across the country. And it's portable, meaning if you, you know, get trained in HIPAA compliance in Maryland, it's still applicable in Virginia. And it just makes life easier for people who work in the field who don't have a lot of time or resources to think about their legal liability.
Dave Bittner: Yeah.
Ben Yelin: So that's something that I think would be a major advantage of federal data privacy legislation.
Dave Bittner: Yeah. Really interesting insights. So our thanks to Donna Grindle for joining us. Her podcast is the "Help Me With HIPAA" podcast. Do check it out. We want to thank all of you for listening.
Dave Bittner: We want to thank our sponsors, KnowBe4. If you go to knowbe4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: We want to thank the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.