Ray Walsh: [00:00:03] California's passed two laws, AB-730 and AB-602, and both of those laws are designed to help legislators to deal with deepfake technology.
Dave Bittner: [00:00:12] Hello, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Morning, Ben.
Ben Yelin: [00:00:22] How you doin', Dave?
Dave Bittner: [00:00:23] On this week's show, Ben wonders just how private your medical records really are. I'll share a story about a woman compelled by a judge to unlock her iPhone. And later in the show, we have my interview with Ray Walsh from proprivacy.com. We'll be discussing some new legislation in California dealing with manipulated video and deepfakes. As always, we want to remind you that while this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: [00:00:54] And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things. You can accept it. You can transfer it. Or you can reduce it. And of course, you might wind up doing some mix of the three. But consider. Risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk.
Dave Bittner: [00:01:36] And we are back. Ben, why don't you kick things off for us this week?
Ben Yelin: [00:01:40] Sure. So my article comes from the technology columnist at The Washington Post. He had one of his readers write in to talk about an issue with HIPAA, the Health Insurance Portability and Accountability Act. One of his readers was using a third-party prescription portal called Follow My Health. It's a portal that the patient can access, but it's not controlled by their medical provider; the provider contracts with a third party. And that's where that person can refill prescriptions, get records of blood tests, et cetera, et cetera.
Dave Bittner: [00:02:12] Right. And these are popular. I think my general practitioner uses one of these. And it's certainly convenient. You can log in and get your latest test results or, you know, send a message to your doctor. Things like that.
Ben Yelin: [00:02:24] Absolutely. And, you know, I go in those portals, I worry about my cholesterol reading.
Dave Bittner: [00:02:29] (Laughter).
Ben Yelin: [00:02:29] It's right there for me to see...
Dave Bittner: [00:02:30] (Laughter) Right.
Ben Yelin: [00:02:30] ...And then I can go about trying to make changes to my diet. Yeah. I think most of us, through our medical provider, have access to this.
Dave Bittner: [00:02:37] OK.
Ben Yelin: [00:02:38] So HIPAA only applies to covered entities. That's sort of the language in the statute. And obviously, a covered entity is going to be any medical provider. That also includes business associates of those medical entities. And that language is sort of vague enough that it can be interpreted multiple ways. The reason that this is a concern as it relates to this article is the reader was worried because she read the fine print - and credit to her - and Follow My Health said that they reserve the right to release information from this portal for advertising purposes. And the reader, with her eagle eye, noted that this on its face seemed like a HIPAA violation.
Ben Yelin: [00:03:19] So they interviewed an expert from the civil rights office of the U.S. Department of Health and Human Services, HHS. HHS said that because Follow My Health is a business associate of the provider, it counts as a covered entity under HIPAA; therefore, it would not be able to release this reader's information to third-party advertisers. The third-party representative disagreed and actually claimed that his site wasn't covered under HIPAA - that the business associate provision applies very narrowly, only to people who sort of set up the interface.
Ben Yelin: [00:03:55] So, like, if you're working with a provider, and you're the IT guy, and you're the one setting a user interface for that particular provider, then it would apply. When we're talking about these third-party sites, at least this individual, who works for the parent company of Follow My Health, disagrees and said HIPAA does not apply. Now, he tried to claim, we're not selling your information to advertisers. All we intend to use this for is to give you, the patient, more information about products that might help with your medical condition.
Dave Bittner: [00:04:23] (Laughter) That sounds to me like a distinction without a difference.
Ben Yelin: [00:04:26] It sure does.
Dave Bittner: [00:04:26] (Laughter).
Ben Yelin: [00:04:26] It would certainly make me wary.
Dave Bittner: [00:04:27] (Laughter) I could connect some dots here between this and the Cambridge Analytica problem that Facebook had.
Ben Yelin: [00:04:34] Absolutely. Yeah. I mean, I think it's the same sort of thing. It's sort of a, we have the purpose of the legislation, which is to protect patient privacy, and then we have this very specific language that the legislation only applies to covered entities. And so it's all about how you can define yourself away from those covered entities. 'Cause again, this is a very lucrative business opportunity for that third-party company. I can't think of anything more valuable to a lot of advertisers than personal medical data.
Dave Bittner: [00:05:04] Right.
Ben Yelin: [00:05:04] If you find the people who have heart disease - you find the people who are suffering from hair loss, you know, you send them the Propecia advertisement...
Dave Bittner: [00:05:12] Right.
Ben Yelin: [00:05:12] And Propecia's going to be thrilled about getting access to it. So I think it's almost a potential loophole in the law where clarifying legislation would be useful, something that says, the business associate relationship extends to these third-party portals where users can go in, get access to their own medical records.
Dave Bittner: [00:05:30] Just because this person from this third-party provider says that this is their interpretation of it, I mean, is it still possible that the good folks at HHS are going to get on the phone and have a friendly conversation with them?
Ben Yelin: [00:05:43] I think that's very possible. Obviously, you'd think that the HHS interpretation is more compelling. Now, nobody, prior to this article being released, had stopped Follow My Health, the provider, from having this provision in their terms of service. And the reader noted very astutely that there was no option in the interface to opt out of having your information potentially given to advertisers. Follow My Health - I think probably because of the publicity they're going to get through this article - says they're going to make a change to their privacy policies by the end of the year where you can opt out of having your information provided to these third parties.
Ben Yelin: [00:06:21] But to your point, until somebody stops them, because it's so lucrative, I think they will continue. And I think HHS could try with some enforcement action, some administrative decision, some rule or regulation, to curtail this. But HHS has its hands full. They have a lot of problems in the health system more generally that they have to deal with. And so making any sort of technical or clarifying change to a statute is always going to be sort of a big administrative burden.
Dave Bittner: [00:06:48] Now, is it possible that, for example, the person who wrote in with this concern, could they file suit against the company and say I believe you're violating HIPAA here?
Ben Yelin: [00:06:58] They could. So you'd have to have standing to file a suit. And to establish standing, you'd have to have suffered some sort of injury in fact. I'm curious as to what that potential injury would be. I mean, you'd have to use your imagination. It'd have to be something really tangible. Could an injury be that you received an advertisement? You know, I'm not sure that would qualify as an injury for the purpose of standing.
Dave Bittner: [00:07:20] Mmm hmm.
Ben Yelin: [00:07:21] If you could think of something more tangible - and perhaps there is something out there...
Dave Bittner: [00:07:25] Right.
Ben Yelin: [00:07:25] ...Then I think you could have a good case that the statute's being violated. But standing is always - it's like the ticket to get into the courtroom. And it's always tough to get ahold of that ticket, especially when you can't really prove that you've suffered, like, a particularized injury.
Dave Bittner: [00:07:39] The thing that bugs me about this is, to me, this is one of those letter-of-the-law versus spirit-of-the-law kind of things. I mean, I think we can all agree without really any ambiguity as to what HIPAA is meant to do.
Ben Yelin: [00:07:51] Absolutely.
Dave Bittner: [00:07:52] I would even bet that the doctors who are using this third-party portal probably are unaware that this is going on.
Ben Yelin: [00:07:58] Right. So the medical system quoted in this article actually didn't respond to questions about it. I would guess that most providers probably are not aware that their patient portals potentially would be selling information to third-party advertisers. And medical providers take HIPAA very, very seriously.
Dave Bittner: [00:08:15] Right.
Ben Yelin: [00:08:15] HIPAA compliance, you know, as somebody who is married to a medical professional...
Dave Bittner: [00:08:20] Ah.
Ben Yelin: [00:08:20] ...I know how cumbersome it can be to comply. I know how rigid the rules and regulations are. Sanctions on medical providers under HIPAA - that's serious business. The spirit of the law, you're right, seems very clear. We want to protect personal health information. I understand why it only applies to covered entities. Because if it didn't apply to only covered entities, it would be sort of a slippery slope. Like, what if you and I had a casual conversation, and I was like, I've been having back problems recently, and you went and told somebody else? Like, you shouldn't be liable for a lawsuit. But when we're talking about a portal that has a lot of potentially very sensitive personal data, I think we can agree that giving advertisers access to that could potentially be a violation of the spirit of that law.
Dave Bittner: [00:09:07] Yeah. That service is one of the primary interfaces I have between my doctor and myself. This seems like a no-brainer to me, but I never fail to be amazed by how folks find ways to put our information in front of advertisers online. So - (laughter).
Ben Yelin: [00:09:23] Absolutely. And like I said, I mean, you understand how valuable this could be to potential advertisers.
Dave Bittner: [00:09:28] Sure.
Ben Yelin: [00:09:28] Because the pharmaceutical industry is very lucrative. They're powerful. And the more information they can get about the specific conditions of patients, you can understand why that would be a big moneymaker for them. They're always looking to find advertising targets. It's understandable, from their perspective.
Dave Bittner: [00:09:45] Yeah. Well, the article's by Geoffrey Fowler over at The Washington Post - worth a look. We'll include a link in our show notes here. My story this week is about an Oregon judge - this is from The Oregonian - an Oregon judge who ordered a woman to type in her iPhone passcode so police could search it for evidence against her. Now, you and I have spoken on the CyberWire, over the segments we do over there, many times about all of the things about being compelled to unlock your phone. Let me let you unpack it as to what the motivations are, what the justifications are, behind this.
Ben Yelin: [00:10:19] Sure. So the Fifth Amendment to the Constitution says that a person cannot be compelled to incriminate him or herself.
Dave Bittner: [00:10:25] Right.
Ben Yelin: [00:10:26] So if you know that there's incriminating info on your iPhone - let's say you have a text message saying, I did it. (Laughter) Or, you know, a picture above the...
Dave Bittner: [00:10:33] Right - can't wait to get home and murder my wife.
Ben Yelin: [00:10:36] Exactly.
Dave Bittner: [00:10:37] (Laughter).
Ben Yelin: [00:10:37] Or, you know, pictures, videos you took of you committing a crime.
Dave Bittner: [00:10:40] Right.
Ben Yelin: [00:10:41] You're going to be very reluctant to unlock that passcode.
Dave Bittner: [00:10:44] Yeah.
Ben Yelin: [00:10:44] And thankfully for users of these devices, the passcodes are pretty difficult to crack. We all remember that dispute between Apple and the FBI. There are a limited number of attempts before the phone shuts down. And, you know, the FBI was trying to get Apple to break into their own encryption software. So you can understand how difficult it would be for your garden-variety law enforcement agency. And you can understand why this potentially is your clear-cut self-incrimination case. You type in your passcode. That phone is open. The police can go in there and find anything. And if they get evidence against you, they can use that at trial. The Fifth Amendment only applies to what's called testimonial evidence - so saying or writing something. That seemingly would apply here.
Ben Yelin: [00:11:27] Now, in this Oregon case, and in other cases across the country, the government has tried and has successfully used this justification that because you know your phone passcode and because you know that there are incriminating elements on your phone, access to that information is sort of a foregone conclusion. You're going to open your phone at some point in the future. The police and the criminal suspect know that incriminating information is on that phone. Therefore, law enforcement eventually will find a way to access it. You don't want to make them go through the arduous process of sitting with somebody 24/7, waiting for them to enter that passcode. So because we can say that it is somewhat of a foregone conclusion, that's sort of the justification that courts have used to compel people to unlock their phones.
Dave Bittner: [00:12:21] Hmm. OK.
Ben Yelin: [00:12:22] I happen to think it's a bit of a legal fiction. I don't think it really is always a foregone conclusion. Because of course, if the evidence is really incriminating, someone might be inclined to get rid of that device completely.
Dave Bittner: [00:12:34] Right. I throw it on the ground and stomp on it.
Ben Yelin: [00:12:37] Right.
Dave Bittner: [00:12:38] Problem solved. I guess the judge could take issue with that.
Ben Yelin: [00:12:41] Exactly. Now, there's been disagreement among the judicial circuits in the country about this. You know, I understand the law enforcement practicality. They want access to this information as quickly as possible. You can hardly think of any better testimonial evidence than something that's contained in an iPhone with every single application that we use on a daily basis. These are going to be treasure troves of evidence. But again, I just think it's - to sort of state the presumption that whatever incriminating information is on the device is going to eventually be accessible to law enforcement simply because the criminal suspect has the passcode and simply because law enforcement knows that something incriminating is on that device seems, to me, to be a little bit far-fetched.
Ben Yelin: [00:13:27] And you can try to analogize it to the nondigital world and say, I can compel your testimony or your admission in this police interrogation. I know that you know incriminating information in your brain. I know that I'm sitting here in the room with you. So it's a foregone conclusion that you're eventually going to tell me. That seems to be your clear-cut violation of the Fifth Amendment, right, against self-incrimination. That's why we have Miranda warnings so that people know they have that right against self-incrimination. But how is that necessarily practically different than what we're talking about here, where if you reveal that passcode, you are potentially opening yourself up to criminal charges?
Dave Bittner: [00:14:09] Is this something that is on an inevitable course toward the Supreme Court?
Ben Yelin: [00:14:15] I tend to think so just because we have that split among judicial circuits. This is an Oregon Supreme Court case. So it was a claim based on both the federal Constitution - so the Fifth Amendment - and also the Oregon Constitution, which has a separate, similar provision protecting individuals against self-incrimination. So it's less likely that the case would come from a state court. Usually, you get cases that go through the federal circuits. And because this is potentially a claim under federal law, you've seen a bunch of cases - I believe we talked about one in Florida - where these cases have made it into federal court. And just to sort of reiterate why the Fifth Amendment's right against self-incrimination is so important, you can sort of conceptualize what would happen if we didn't have that right.
Ben Yelin: [00:15:01] A criminal suspect would be in an impossible position because you either are forced to testify, in which case you're probably admitting that you committed a crime; you stay silent, in which case you would be held in contempt of court; or you lie, in which case you could be convicted of perjury. So it's putting a criminal defendant in an impossible position. And if law enforcement and the individual are so certain that there's going to be incriminating information on that device, then to me it doesn't seem any different than your standard testimonial evidence cases. But that's what makes, you know, the digital world so interesting, is you often see these slight deviations on long-standing judicial precedents.
Dave Bittner: [00:15:43] And in this case, the circuit judge ordered the defendant to enter her password. And she put in the wrong password, and the judge found her in contempt of court and sentenced her to 30 days in jail.
Ben Yelin: [00:15:57] Yeah. So obviously, that pales in comparison to the 11 years she was eventually sentenced for the underlying crime. She tried to punch in one, two, three, four, five, six, which...
Dave Bittner: [00:16:07] As I was reading this, there was part of me that was hoping that was actually her password.
Ben Yelin: [00:16:10] I know.
Dave Bittner: [00:16:11] (Laughter) It was that easy all along.
Ben Yelin: [00:16:13] I know.
Dave Bittner: [00:16:15] That's not how it played out, though.
Ben Yelin: [00:16:16] No.
Dave Bittner: [00:16:16] (Laughter).
Ben Yelin: [00:16:16] No. No. I feel like maybe she could be more creative in her fake passwords. 'Cause it's so obvious when you type in one, two, three, four, five, six that you're obstructing justice. So yeah. I mean, because the court ruled the way it did, she was compelled to unlock that device. And once you're compelled, you either do it, in which case, you're incriminating yourself, or you can be held in contempt of court and can go to prison. And that also can be used as evidence against you in future proceedings, that you were trying to conceal incriminating information. And so, you know, the court really is putting that criminal defendant in an impossible position. I guess the one solution for somebody who's thinking of committing crimes is to not hide any evidence on your personal device because it seems like more and more courts are willing to use this inevitability doctrine to compel disclosure.
Dave Bittner: [00:17:10] All right. Well, it's an interesting case. I'm really curious to see what happens if this does make its way to the Supreme Court, one that leaves me scratching my head many times as we see it swing in different directions depending on which state is taking a look at it.
Ben Yelin: [00:17:26] I agree. And I think it's definitely the type of issue, like I said, because we have that split among circuits, that you can really see making it up to the Supreme Court. And it would be a groundbreaking Fifth Amendment case.
Dave Bittner: [00:17:37] All right. Well, it's time to move on to our Listener on the Line.
[00:17:40] (SOUNDBITE OF PHONE DIALING)
Dave Bittner: [00:17:45] This week, we've got a call from a gentleman named Kevin (ph), and he has a question about two-party consent. Let's listen to the call.
Kevin: [00:17:53] Hey, David. Kevin. I was just calling in regards to the two-party consent laws, things like my Alexas that I have in each of my rooms. If Alexa's always listening, would I need to ask everyone who comes to my house if I can have their permission to keep the Alexas plugged in?
Dave Bittner: [00:18:12] So this is an interesting one because you and I both live in Maryland, which is one of these two-party consent states. What do you make of this?
Ben Yelin: [00:18:20] So it's a fascinating question. I have an Alexa device myself, and it's not something I've ever really considered. There are a couple of interesting points here. One is it counts as informing the other party if there is some sort of implied consent. So if you were to give, like, a recorded warning at the beginning of a phone conversation that this conversation may be recorded for business purposes or whatever, that would satisfy the requirement of informing the other party.
Dave Bittner: [00:18:47] OK.
Ben Yelin: [00:18:48] My read of this is, if you go into somebody's house and see the Alexa device, that is going to be the implied warning. Or if you don't see the device and a person activates that Alexa device by being like, hey, Alexa, play me some smooth jazz...
Dave Bittner: [00:19:03] Right.
Ben Yelin: [00:19:03] ...Then that also would be the implied warning to the other person that there is an Alexa device. And in that case, therefore, you wouldn't be required to give your, hi, welcome to my house. Before you say anything, I'd like to warn you that my Alexa is potentially listening.
Dave Bittner: [00:19:19] (Laughter) Right. Right. Like the old legend about vampires - that they won't come in your house unless you invite them in. You have to invite them over the threshold. Like, before you have any guests come in your house, you have to give them a disclaimer that...
Ben Yelin: [00:19:29] Exactly. Before they say hi, excuse me - yeah.
Dave Bittner: [00:19:31] (Laughter) Right. Before you enter my house, I just need to say that the implied consent is that there are many, many devices that could be listening to and recording you. By crossing this threshold, you grant permission to me and the members of my family to do so (laughter).
Ben Yelin: [00:19:42] Exactly. Now...
Dave Bittner: [00:19:44] That's a great way to welcome someone to a dinner party (laughter).
Ben Yelin: [00:19:47] It is. If you want to lose friends, I think the best way to do it is to say that, my Alexa device might be listening to you. There are a couple of other interesting elements here. One is that, under the Maryland statute, there are a bunch of exceptions - for instance, if law enforcement needs to use audio communications, any type of audio communications, as evidence in a criminal prosecution. And they list the crimes that are subject to that provision, and it's all the most serious crimes you could think of.
Dave Bittner: [00:20:12] OK.
Ben Yelin: [00:20:13] So murder, kidnapping, sexual assault, et cetera, et cetera - child abuse. So whether you give consent or not, law enforcement is always going to be able to access those audio communications if they have probable cause to believe that there's evidence of a crime contained therein. So there's that aspect of it. That leaves us this very narrow sliver of crimes that aren't covered under that provision that somebody would admit to in somebody else's house, where the Alexa device would actually pick it up because they're only listening the few seconds before and the few seconds after you're using that trigger word.
Dave Bittner: [00:20:49] Right.
Ben Yelin: [00:20:50] So I think it's, like - you'd have to have - it's such an attenuated scenario that I can't envision any actual case coming from this, although we've been surprised in the past.
Dave Bittner: [00:21:00] Well, but I...
Ben Yelin: [00:21:00] So I'd never say never.
Dave Bittner: [00:21:01] Yeah. What I'm wondering, though, is - because I think one of the things that our listeners are wondering about is the notion that the recording itself is a crime; that if you and I are somewhere not in public and I'm recording you without your permission or without your knowledge, that itself is a crime because I don't have your consent.
Ben Yelin: [00:21:21] I think according to the letter of the law, at least the way I read it, as it applies to these devices, it would potentially be a violation of the two-party consent statute. The reason I'm hesitant about it is because of this implied consent provision, especially now that we're in an era where many people have smart home devices. There might sort of be an expectation that when you go into somebody's house, you know, there's going to be some sort of listening device, whether it's an Alexa or something else.
Dave Bittner: [00:21:49] Does this mean - yeah, it's illegal for me to have a nanny cam or a teddy bear with a video camera in its eyeball...
Ben Yelin: [00:21:56] Right.
Dave Bittner: [00:21:56] ...So I can keep an eye on my child care provider or something like that.
Ben Yelin: [00:22:00] Yeah. I mean, that's another very interesting potential test case. I think if there's something visible in your house, that would satisfy the requirement of two-party consent because, again, you don't actually have to say something formal like, you are being recorded; welcome to my house.
Dave Bittner: [00:22:16] Right.
Ben Yelin: [00:22:17] It can be something a little more implicit than that. And I think, you know, in most of our houses, it's pretty easily discoverable whether you have one of these devices. And therefore, I sort of think it's something that courts probably wouldn't allow into their courtroom - a claim based on this.
Dave Bittner: [00:22:33] And this process of figuring these things out is, by its nature, reactive, right? In other words, something would have to happen where someone would come through and say, hey, this law is still on the books, and these - all these new things have happened. We have all these new devices. And so now we're going to come and test this case?
Ben Yelin: [00:22:51] Absolutely. And very frequently, you'll get civil liberties groups who will want to take these cases, find the most compelling plaintiffs, because they want some sort of legal precedent on this.
Dave Bittner: [00:23:01] Looking for clarity.
Ben Yelin: [00:23:02] Exactly.
Dave Bittner: [00:23:03] Yeah.
Ben Yelin: [00:23:03] To my knowledge, I haven't seen any cases on this in Maryland. But there's always going to be a first time. And as I say, you know, it could be a very narrow scenario where there potentially is evidence of a crime that doesn't fall under the list of exceptions to the two-party consent law. Law enforcement may try and gain access to it. Somebody could make a claim that they didn't consent as part of our two-party consent statute to being recorded, and then, you know, at least, in theory, I could see there being a claim there. But, you know, that's a lot of things that would have to happen before we got some sort of definitive ruling on whether you have to give a warning that there is an Alexa device in your house.
Dave Bittner: [00:23:41] Interesting.
Ben Yelin: [00:23:42] So I would suggest next time you have a dinner party, hold off on that subject.
Dave Bittner: [00:23:46] Right.
Ben Yelin: [00:23:47] Talk about something else.
Dave Bittner: [00:23:48] Yeah. Maybe all these devices, they just start coming with stickers that you can put on your front door - you know, this house is equipped with Alexa (laughter).
Ben Yelin: [00:23:56] You know, that's a great point. I think eventually, if we ever did have some sort of judicial clarity on this - all of those signs exist, in large part, because of two-party consent laws.
Dave Bittner: [00:24:07] Right.
Ben Yelin: [00:24:07] Your voice may be recorded.
Dave Bittner: [00:24:09] Yeah.
Ben Yelin: [00:24:09] Whether that's an audio warning or some sort of verbal warning when you walk in somewhere, it's because they don't want to be held liable in either criminal or civil court for violating that two-party consent statute.
Dave Bittner: [00:24:20] Yeah.
Ben Yelin: [00:24:20] So you know, if we determine that applies to our smart home devices then, you know, maybe in that Amazon Echo box, you'll get that little sticker that you put in front of your house...
Dave Bittner: [00:24:32] Right. Right.
Ben Yelin: [00:24:33] ...That gives everybody fair warning.
Dave Bittner: [00:24:35] And I think it's worth mentioning also that you and I are talking about this from the point of view of being in Maryland. Maryland happens to be both where we live and a two-party consent state. But I think there are around a dozen states in total that have two-party consent laws. So this is not something where it's just a narrow number of people who have to deal with this sort of thing. It's a good amount of folks that are under these rules.
Ben Yelin: [00:25:01] Absolutely. Now, the corollary to that is that means there are 38 states that do not have two-party consent laws.
Dave Bittner: [00:25:07] Right (laughter).
Ben Yelin: [00:25:08] Which has always been fascinating to me. That's why you'll often see these sort of hidden videos trying to expose, like, opposing political groups, where they'll walk in with hidden recording devices. They'll record you. You can post that online to make somebody else look bad. I mean, you know, there are certainly groups that do that. They will do their research and figure out which states are two-party consent states, and they'll know it's much easier to do that sort of video surveillance in a state that has one-party consent...
Dave Bittner: [00:25:37] Yeah.
Ben Yelin: [00:25:37] ...Only. So I'm always sort of surprised that it hasn't caught on in the other 38 states just because it seems intuitive to us that both parties should have permission for there to be a recording. But I guess Maryland is ahead of the curve on this. So in that sense, maybe we're lucky to live here.
Dave Bittner: [00:25:53] (Laughter) That's right. Well, we want to thank our listener, Kevin, for calling in. And we would love to hear from you. If you have a question for us, you can call in. The number is 410-618-3720. That's 410-618-3720. You can also send in an audio file of your question. You can send that to email@example.com. Just give us your first name, where you're from and your question, and there's a chance we will use it on the air.
Dave Bittner: [00:26:22] Coming up next, we've got my interview with Ray Walsh. He's from Proprivacy.com. We're going to be discussing some new legislation in California that deals with manipulated video and deepfakes.
Dave Bittner: [00:26:32] But first a word from our sponsors, KnowBe4. So let's return to our sponsor KnowBe4's question. How can you see risk coming? Especially when that risk comes from third parties? After all, it's not your risk - until it is. Here's step one. Know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk-management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendors' security risk requirements. You'll not only be able to pre-qualify the risk, you'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you get this in an effectively automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Check it out.
Dave Bittner: [00:27:54] And we are back. Ben, California recently passed a couple of bits of legislation involving deepfakes and security and misinformation around elections. I checked in with Ray Walsh - he's from Proprivacy.com - to get the details. Here's my conversation with Ray Walsh.
Ray Walsh: [00:28:15] California's passed two laws, AB-730 and AB-602, and both of those laws are designed to help legislators to deal with deepfake technology. The first one, AB-730, is basically designed to make it illegal for anybody to distribute any manipulated videos that aim to discredit political candidates and deceive voters. And that law applies within 60 days of an election. Obviously, there's a lot of concern surrounding the possibility of using deepfakes during election times. And so that law is specifically designed to look at that aspect of things. The second one, AB-602, actually gives Californians the right to sue anybody who makes deepfakes that put their likeness into any pornographic material.
Ray Walsh: [00:28:59] So obviously, there's been a lot of pornographic content appearing online that appears to be celebrities, and actually, it's just some random footage that's been taken and that's just had somebody's face placed onto it. And there's growing concern that there could be sort of revenge-porn attacks and other targeted attacks on members of the general public who end up having pornographic material circulated of them which they obviously never actually took part in. So this law is just designed to ensure that if that does happen, people can actually go ahead, get the police to look into it and then possibly even seek damages from whoever spreads that material.
Dave Bittner: [00:29:39] Well, let's go through each of these one at a time, starting with AB-730, which is the one about deepfakes close to an election. Why only 60 days within an election? Why constrain it to that time period?
Ray Walsh: [00:29:53] Sure. I mean, that's a very interesting question. And some people would argue that there's no reason why it couldn't be, actually, for a longer period of time. I think one of the main reasons that they've kept it quite strict like that is that they understand that sometimes when these deepfakes appear, it can be quite hard to figure out whether they're actually fake or whether they're real. And so they need to have a little bit of time to be able to figure that out. Now, in a normal month, when an election wasn't about to happen, footage could be disseminated, and it could be spread quite well, and it wouldn't necessarily matter as much if a few weeks down the line it was made obvious and it was all over the press that that footage was fake. You know, there was that footage of Nancy Pelosi where it had been doctored and slowed down to make her seem drunk. And at first, that spread all over the internet and went quite viral. And it was soon proven that it was fake. And obviously, even though it damaged her reputation at first, as long as people find out it is fake later on, it doesn't really have such a damaging effect.
Ray Walsh: [00:30:53] So I think the main aim here is to still sort of allow people to have freedom of speech and allow people to have the power to spread things when there might be, you know, a reasonable reason for it. Sometimes a politician might have - I'm not exactly sure. When it comes to politicians, I'll be honest. I'm not exactly sure why you could make a fake bit of footage that would be of interest. But I think the idea is that within 60 days of an election is the most sensitive time and when, if there was any delay in managing to prove that footage was fake, it would have the most damaging effect on society. The most important thing we can talk about, I think, is the fact that in many respects, this legislation, even though it's really forward thinking and we can all say that it's a really good idea, and we can already hope that, you know, the Senate and the House are going to think about doing this on a federal level for the whole country, passing something that would affect all states, I think that the main problem that we've got is that the legislation can only do so much because when you make a deepfake video, and it disseminates online, it can be very hard to trace that back to any one person.
Ray Walsh: [00:31:57] So it's all well and good having a law that says whoever distributes this material is breaking the law if they do it within 60 days of an election. But unless you can pinpoint who did it, it's going to be very hard to actually bring any charges against that person, anyway. And in the meantime, that footage is going to go viral all over the media. People are going to, you know, have the right to freedom of speech that's protected in the Constitution to actually spread that material up until the point when it is proven to be fake. Because until we know that footage isn't real, it could potentially be real. And let's say there's footage of a politician making the rounds that appears to make it seem like it would be a very bad idea to vote for them. Until you know that that's definitely not real, you may decide that you do want to share that footage with people.
Ray Walsh: [00:32:41] See what I mean? I mean, there's always going to be that two sides of the coin where it's, on the one hand, people will have to make a moral decision as to whether we want to share footage. And on the other hand, people will also perhaps feel that they have a duty and an obligation to spread that footage until it is proven fake because potentially - because let's say the election's tomorrow and something's disseminated today, and it is very damaging to that candidate. Perhaps people feel that they really don't want to vote for them.
Dave Bittner: [00:33:07] Yeah. It also - I mean, it strikes me, to your point, that what if the deepfake is created by someone from another nation? It comes offshore. The long arm of the law is going to have trouble tracking down someone who's on the other side of the planet.
Ray Walsh: [00:33:22] Exactly. Let's go back to the previous election, when Guccifer 2.0 was said to have hacked documents from the DNC and published them online, and they went to WikiLeaks. Now, when that happened, it was later proven that a lot of the material that Guccifer 2.0 published and that WikiLeaks released was actually an amalgamation of fake documents mixed with some stuff that he hacked - he or she hacked - and a lot of it was proven to be disinformation. Now, that is one method of performing, shall we call it, social engineering on an election, where you hack people's state of mind by spreading disinformation that makes a candidate look bad. And as far as I'm concerned, deepfakes are exactly the same sort of thing. You can create a video, but it's just a much more extreme version of it. You know, you would create a video of somebody appearing somewhere where they shouldn't be, saying things that they shouldn't be saying or doing things that they shouldn't be doing. And then that spreads all over the internet, and people get a sense that they shouldn't vote for that person now. That's exactly the same thing as what Guccifer did.
Ray Walsh: [00:34:31] And most cybersecurity experts that looked into that case laid the blame on Russian hackers, and whether that will ever be proven 100% remains to be seen. But what we do know is that it definitely sets a precedent where we know, like you say, foreign state actors or lone-wolf hacker types could potentially distribute this kind of footage on purpose to ruin a candidate's reputation. And under those circumstances, it would be very hard to, A, figure out who did it and, B, do anything about it even if you did figure out who did it.
Dave Bittner: [00:35:02] Well, let's talk a little bit about the other law, which gives Californians the right to sue someone who creates deepfakes and places them in pornographic material without their consent. I suppose my initial reaction to this is I'm surprised that this didn't already exist, that I already had the ability to go after someone. Is this a matter of just placing some specificity on particular action that people can take?
Ray Walsh: [00:35:27] Yeah, I think that's exactly it. So obviously as new technologies arise, it's always down to legislators to catch up with the new technologies that are being released, and unfortunately, it always works that way with tech; that first of all, technology appears, and then legislators rush to catch up and create laws that can help people to protect themselves if and when it's used for nefarious reasons. And with deepfakes, you know, we've already seen laws being passed for people to be able to get the law involved when there's revenge-porn attacks and that sort of thing.
Ray Walsh: [00:35:57] And now the laws are just slowly starting to go through and be made for the potential for deepfakes to be used. And actually I think - in the wild, shall we call it - it really hasn't been affecting the general public that much up to now because even though there are open-source versions of the deepfake-making software available online, it's not really gone into, you know, most people's hands. And so we aren't seeing a lot of cases of people being affected by this kind of fake pornography. But obviously, as time passes and it starts to become more of a danger, it's good to have those laws in place. And I think California is just leading the way by passing that law and protecting people not just against revenge porn but this very specific problem.
Dave Bittner: [00:36:42] With California leading the way, as you say, do you suppose that other states will follow, and how likely is it that we'll see action at the federal level?
Ray Walsh: [00:36:51] Obviously, California has already got a reputation for going first. The CCPA is going to be coming in in January, and that has actually already inspired other states to follow suit and start creating privacy protections for people's data. So I think we can use that - the California Consumer Privacy Act - as a sort of example of how this may well spread to other states. And I think we can rest assured that other states will follow suit. And whether Congress, you know, does decide to propose a bill - we have to wait and see. But it certainly seems like it's something that should be tackled on a nationwide level, and so we would hope that that would be something that happens.
Dave Bittner: [00:37:29] Does California's lead on this become kind of a default for other states? I guess because of the fluid nature of online communications and how, you know, online communications don't necessarily respect state lines, when something like this goes into place, does it sort of become a de facto standard?
Ray Walsh: [00:37:50] Sure. I mean, the question there would have to boil down to where the perpetrator was, I think. Whether a law in California would make it so that in another state they could be arrested remains to be seen. I'm not sure on that point because I'm not actually a lawyer. But obviously, you know, what I would question is, if they're in California and they do it in their own state, then obviously the law is there in order to arrest them. But if they are in another state, it doesn't necessarily hold true.
Ray Walsh: [00:38:19] And so, you know, whenever there are state-level laws, they apply to that state. There are privacy laws, such as those covering the use of webcams in a home, and they vary from state to state. And I know that in some states, you're allowed to use them more than you are in others. So you do have to look at the state level. And you also have to be aware that any potential federal law that was brought in for the whole country can actually supersede state laws.
Ray Walsh: [00:38:46] So that's why whenever there is talk of the government proposing a federal law, there's always the danger that a stronger state-level law could actually be superseded. And that can be dangerous sometimes because sometimes states like California actually impose a law that's quite strong, and unless the federal law that's put in place actually meets those standards, there's always the potential they could bring the standards back down.
Dave Bittner: [00:39:08] So interesting stuff going on in California - what do you make of this, Ben?
Ben Yelin: [00:39:12] Yeah. So my home - native home state of California always leading the way when it comes to digital privacy.
Dave Bittner: [00:39:18] (Laughter).
Ben Yelin: [00:39:18] And kudos to them on that. My take is the legislation is well-intended. It's designed to address a real problem. Deepfakes are a problem both in the political realm and when we're talking about pornography. So I think the effort is laudable. There are some limiting factors in the legislation, some of which Ray noted, that will, I think, limit the effectiveness of the legislation's intended goals. So the first one that jumps out to me is, for the legislation related to political candidates, the standard is that only people who distribute those deepfakes with, quote, "actual malice" will be subject to civil penalties.
Ben Yelin: [00:39:59] Now, actual malice is a - what we call a legal term of art. That comes from a famous case called New York Times v. Sullivan. And what it means is a person acts with actual malice if they distribute something either knowing that the information is false and deceptive or if they had very good reasons to know the information is false and deceptive. The reason this seems like a limiting factor to me is when Grandma and Grandpa on Facebook share a video of Nancy Pelosi saying something that she very obviously didn't say...
Dave Bittner: [00:40:33] Right.
Ben Yelin: [00:40:33] ...It's unlikely that they're going to know that what they're distributing is fake. That's the nature of material traveling on the internet virally.
Dave Bittner: [00:40:41] Right.
Ben Yelin: [00:40:42] Most of the people sharing it are just not literate enough in technology that they could potentially recognize a deepfake.
Dave Bittner: [00:40:48] Yeah.
Ben Yelin: [00:40:48] And so if we're not punishing them, then it would just be very hard to find a perpetrator where it would be easy to prove that person knew or should have known that that image was fake. It also means the better the deepfake, the more likely it is that people will think it's real, and therefore the harder it will be to overcome that actual malice standard. So that's one point. The other - and I asked you just while we were listening to this if you knew what the Streisand effect was.
Dave Bittner: [00:41:14] (Laughter) Yes.
Ben Yelin: [00:41:16] For those of you who are unaccustomed, it was a lawsuit that Barbra Streisand instituted against her neighbors for - do you know the full details of the story? I'm botching it.
Dave Bittner: [00:41:24] I don't. I just know what the - basically, the Streisand effect is that you draw attention to something that probably wouldn't have received a lot of attention by going after someone.
Ben Yelin: [00:41:36] Exactly. I should do more research on the origin story of this.
Dave Bittner: [00:41:39] (Laughter) Right.
Ben Yelin: [00:41:39] But that's exactly what the principle is. Now, the political law gives any political candidate or anybody else, any voter, a cause of action to seek an injunction against deepfakes within 60 days of an election. So let's say you're a, you know, relatively unknown state assembly candidate in California, and somebody produces a deepfake about you, and maybe you've made some enemies in the past. By instigating that lawsuit, that deepfake is going to be on the news. That deepfake is going to be something that your opponents can seize on. It's going to be something that's going to be more likely to go viral. Ooh, there's a deepfake lawsuit. You know, let's share that story.
Dave Bittner: [00:42:17] Right.
Ben Yelin: [00:42:17] And embedded within that story is going to be the deepfake itself. You know, as a candidate, you're probably better off just not giving any credence to it at all. Now, when we're talking about someone like Nancy Pelosi, that's a little bit different because she already has such notoriety as the speaker of the House.
Dave Bittner: [00:42:32] Yeah. Yeah.
Ben Yelin: [00:42:32] That it would be harder to stay anonymous. But really, by instigating a lawsuit (laughter), you'd be drawing a lot of attention to yourself.
Dave Bittner: [00:42:39] Well, you know, there's no such thing as bad publicity, Ben (laughter).
Ben Yelin: [00:42:42] Unless you're a political candidate. When you're a political candidate, yes, you know. We often see this with long shots in political primaries.
Dave Bittner: [00:42:49] (Laughter) Right.
Ben Yelin: [00:42:49] They'll try and say the craziest things so that they get attention.
Dave Bittner: [00:42:51] Right.
Ben Yelin: [00:42:52] But, you know, if you're in some, like, low-level, you know, city supervisor race or state legislature race, I think it's probably best if your name doesn't show up in the national headlines. The other thing I was going to - that stood out to me about the other law dealing with pornography is that there is an exception in the statute for something that has newsworthy or political value, and I'm just sort of wondering how that potentially could be interpreted, especially when we're talking about newsworthiness. Just the creation of the video itself could potentially be newsworthy.
Dave Bittner: [00:43:24] Somebody included that for a reason.
Ben Yelin: [00:43:26] Exactly.
Dave Bittner: [00:43:27] That exception didn't come out of nowhere. So someone thought about it.
Ben Yelin: [00:43:30] Right. Now, in both of these laws, there are provisions exempting media organizations who cover these stories from being subject to the legal requirements.
Dave Bittner: [00:43:40] OK.
Ben Yelin: [00:43:40] If you do a news story about the deepfake, then you're not going to be subject to civil penalties. You know, the other exception that I think could almost end up swallowing the rule - that I believe is in both statutes - is it doesn't apply to parodies. I mean, couldn't a reasonable defense, if you produced the deepfake, be that you intended it to be a parody and you didn't intend to represent it as the real words of the person being depicted? I mean, that's probably what my strategy would be. I mean, we understand why parody is protected under our First Amendment, just because it does add literary, political and newsworthy value to things.
Dave Bittner: [00:44:14] Yeah.
Ben Yelin: [00:44:14] But I think that's one of those exceptions that could end up swallowing the rule.
Dave Bittner: [00:44:18] All right - interesting thing to track. And as Ray pointed out and you as well - California leading the way when it comes to these things.
Ben Yelin: [00:44:24] Absolutely. They are always at the forefront, and I think Ray gave us a great overview of those two pieces of legislation. And yeah, I mean, if you have an interest in the next frontier in digital privacy, you should always be looking out West. And California has long been the leader, partially because Silicon Valley's there, and then just partially because they're generally a pretty progressive state on these matters.
Dave Bittner: [00:44:50] Yeah. All right. Well, our thanks to Ray Walsh - he's from proprivacy.com - for joining us. That is our show. We want to thank all of you for listening.
Dave Bittner: [00:44:57] And of course, we want to thank this week's sponsor, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: [00:45:14] We want to thank the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com.
Dave Bittner: [00:45:23] Our "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: [00:45:37] And I'm Ben Yelin.
Dave Bittner: [00:45:38] Thanks for listening.