Caveat 11.25.20
Ep 55 | 11.25.20

Protecting the integrity of information and maintaining privacy.

Transcript

Laura Noren: We've now graduated to a place where we can have a fuller data record of a person than they would even be able to remember about themselves. So the digital has in some ways exceeded the human.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben has an update on Baltimore's spy plane. I wonder about the tension between free speech, disinformation and public health. And later in the show, my conversation with Laura Noren. She's an NYU visiting professor of data science and VP of Privacy and Trust at Obsidian Security. We're going to be discussing who governs the cloud and what data protection regulations are actually enforceable. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump in with some stories here today. What have you got for us this week? 

Ben Yelin: So I'm bringing you a story from Baltimore Brew, a source that we've actually used before to talk about this very subject, which is Baltimore City's Aerial Investigative Research program, AIR - clever acronym - which is colloquially known as the spy plane. So this is a pilot six-month program where the city deployed an unmanned aircraft in the sky that was taking real-time pictures in an effort to help solve what has been a very serious violent crime problem in Baltimore. 

Ben Yelin: A group of activists sued the city, saying that this was a violation of residents' Fourth Amendment rights against unreasonable searches and seizures. A district court - lower-level federal court - upheld the program, and it went to the 4th Circuit Court of Appeals, the appellate level in a federal court. 

Ben Yelin: And there was a decision that just came down, a 2-1 decision saying that this program passes constitutional muster. It is not a violation of the Fourth Amendment to have this airplane in the sky. So the reasoning is that you don't have a reasonable expectation of privacy when you are out in public. 

Ben Yelin: So I think we've talked about this before. This is a plane that takes real-time photos, can be narrowed down to a very small geographic area, but you can't take a picture of somebody when they're in their house or when they're in another private place. 

Dave Bittner: Right. 

Ben Yelin: And there's this long-held Fourth Amendment doctrine that if you are in public view, you've forfeited your reasonable expectation of privacy and, therefore, you don't have any Fourth Amendment protection. And that was the nature of the majority opinion here. 

Ben Yelin: Two of the three judges who are in the majority are appointees of President Ronald Reagan and President George H.W. Bush, so they tend to be more on the conservative side. The dissenting judge is actually the chief judge of the 4th Circuit, originally a Bill Clinton appointee. 

Ben Yelin: And he made, I think, a very eloquent argument that we can't apply this plain view doctrine in the present circumstances, that our constitutional rights are of profound importance and a violation of the reasonable expectation of privacy is so pervasive when you have a camera taking real-time photos during all daytime hours across the entire downtown area of Baltimore. It's just so pervasive that you can't follow this rather narrow line of cases, which was intended to protect law enforcement if they saw somebody committing a crime in public or in plain sight. 

Ben Yelin: The dissenting judge also said that this is more akin to long-term surveillance than short-term surveillance. So the majority was saying that this isn't something that happens long-term. There are self-imposed limitations. The planes only fly during daytime hours. They're only capturing pixelated dots, rather than actual faces. 

Ben Yelin: The dissenting judge here says that this is actually long-term surveillance because you can, you know, track patterns over time within neighborhoods and also combine the data you're getting from the spy planes with other data that's available, so closed-circuit television cameras, security cameras, license plate readers and all the other sources we talk about on this podcast. 

Dave Bittner: Right. So, for example, if I leave my house and this camera, this eye in the sky tracks me leaving my house and I'm heading to the corner grocery store and I also walk by a police surveillance camera, a security camera, that security camera captures my face. Now, they align those two bits of information, and now they know every time from that point on, that little dot, that eye in the sky dot - well, they know who that is. That's me. 

Ben Yelin: That's you. 

Dave Bittner: Right. 

Ben Yelin: Yeah. 

Dave Bittner: Right. 

Ben Yelin: Yeah. I mean, it's a matter of putting two and two together. And basically what the dissenting judge here is saying is that it's too easy for law enforcement. 
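To make the "putting two and two together" concrete: re-identifying an anonymous aerial track is essentially a join on time and place. The sketch below is purely illustrative - the names, thresholds and data structures are hypothetical and are not taken from the actual AIR program - but it shows how a single identified ground-camera sighting that overlaps a pixelated dot track in time and space can attach a name to the whole track.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TrackPoint:          # anonymous "dot" from the aerial imagery
    timestamp: float       # seconds since some reference time
    x: float               # map coordinates, in meters
    y: float

@dataclass
class CameraSighting:      # identified sighting from a ground camera or plate reader
    timestamp: float
    x: float
    y: float
    identity: str          # e.g., a face match or a license plate

def link_identity(track: list[TrackPoint],
                  sightings: list[CameraSighting],
                  max_seconds: float = 30.0,
                  max_meters: float = 50.0) -> str | None:
    """Return an identity if any sighting overlaps the anonymous track
    in both time and space; otherwise return None."""
    for point in track:
        for sighting in sightings:
            close_in_time = abs(point.timestamp - sighting.timestamp) <= max_seconds
            close_in_space = hypot(point.x - sighting.x, point.y - sighting.y) <= max_meters
            if close_in_time and close_in_space:
                # one overlap is enough to label every point in the track
                return sighting.identity
    return None
```

Once a single overlap labels the track, every earlier and later dot in it inherits that identity, which is the "from that point on, they know who that is" effect Dave describes.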

Ben Yelin: You know, I think people have an understanding that when they go out in public, they might be captured by a security camera, they might be captured by - their car might be captured through a license plate reader, something like that. 

Ben Yelin: But most people don't have any expectation that they're being secretly monitored by an airplane in the sky 24/7, although I will note, and this is always a problem in these types of cases, the fact that Baltimore City has been so public about this program kind of does give people the reasonable expectation that they're being monitored 24/7. So this is sort of what I refer to as breaking the Fourth Amendment fourth wall. 

Dave Bittner: (Laughter). 

Ben Yelin: It's the government telling you, you know, what your reasonable expectation of privacy should be, which kind of short-circuits the constitutional process here. 

Dave Bittner: Wow. 

Ben Yelin: So where does this leave us? What is going to happen going forward? A couple things might happen. 

Ben Yelin: One, the people who brought the suit might appeal for the case to be heard en banc, which would mean the entire 4th Circuit Court of Appeals would hear this case. If you take a cursory look at the makeup of the court, it's a little more liberal-minded than this three-judge panel, which was chosen randomly. There are a lot of Barack Obama appointees on the 4th Circuit. So perhaps the case would turn out differently. 

Ben Yelin: More likely, in my opinion, is that the case is going to be rendered moot because a couple of weeks from when we're recording this, Baltimore City is going to have a new mayor. He is the current Baltimore City Council president, Brandon Scott, and he has said that he disfavors this program and he wants it discontinued despite the wave of crime that's happened in the city. So any other federal court might refuse to hear this case just because it's not a live issue anymore. The plane will have been grounded. 

Ben Yelin: I think one way or another, this isn't the final word on the Baltimore spy plane. I think just because it's been given a temporary constitutional stamp of approval doesn't mean that it's going to be pervasive in Baltimore City going forward. 

Dave Bittner: Yeah. One little nitpick, by the way - you mentioned at the top of the - of your discussion here that it's an unmanned plane. It is manned. It's like a Cessna, but it's just - it's loaded up with cameras and things. So someone is piloting it. 

Ben Yelin: Yes, piloting, yes. I should be clear. But they're not the ones taking the pictures. 

Dave Bittner: Correct, correct. 

Ben Yelin: Yes. 

Dave Bittner: Yes, yes. The photo gathering... 

Ben Yelin: But they are manned. 

Dave Bittner: ...Is automated, but there is someone behind the yoke of the plane itself, yeah. 

Ben Yelin: How boring would it be to just fly circles around downtown Baltimore City for an entire day? I could turn on this podcast. 

Dave Bittner: I know some pilots - that's right. Exactly. I know some pilots for whom that would be their dream job. I think, you know, just being able to be up in the air, for a lot of pilots, is enough. To get paid for it, boy, that would be great. Sign me up, you know? 

Ben Yelin: Yeah, that's a good point. 

Ben Yelin: One last thing I should mention is the majority opinion cited an op-ed written by a local Baltimore writer who writes for national publications, Alec MacGillis, who wrote about how the city of Baltimore was really collapsing and failing. He wrote this, I believe, in 2019. It was a groundbreaking piece that ran in The New York Times about Baltimore City's governance problems, crime problems. 

Ben Yelin: I thought it was really interesting that they - the majority cited that as a reason to have the spy plane - that the city of Baltimore is in such dire shape that we're perhaps willing to sacrifice some of our constitutional rights in order to protect our safety. So I thought it was really interesting that he invoked that piece. That certainly made the rounds around these parts when it was released. 

Dave Bittner: Yeah, yeah. That's interesting. All right, we'll have a link to the story. It's from Baltimore Brew, written by Fern Shen. We'll have a link to that in the show notes. 

Dave Bittner: Ben, my story this week - I'm going a little different way with it this week, and I hope you'll bear with me here and have a little patience with me. 

Ben Yelin: I don't know. I'll try. 

Dave Bittner: (Laughter) I know. I know. But I'm going to be relying on your expertise here. I'm trying to get a little bit of clarity in my own mind about how things work. So I'm counting on your knowledge of the Constitution and all those sorts of things. 

Dave Bittner: There's a tweet that's been making the rounds in the past week or so. It's from a woman whose name is Jodi Doering. I believe that's how you pronounce her last name. And she is a nurse in South Dakota. And, of course, South Dakota is getting hammered by COVID - just hammered. 

Ben Yelin: Yeah. 

Dave Bittner: And she had a series of tweets, and it reads like this. She says, (reading) I have the night off from the hospital. As I'm on my couch with my dog, I can't help but think of the COVID patients the last few days. The ones that stick out are those who still don't believe the virus is real, the ones who scream at you for a magic medicine and that Joe Biden is going to ruin the U.S., all while gasping for breath on 100% Vapotherm. They tell you there must be another reason they're sick. They call you names and ask why you have to wear all that stuff because they don't have COVID because it's not real. Yes, this really happens, and I can't stop thinking about it. 

Dave Bittner: I can't stop thinking about this tweet. 

Ben Yelin: Yeah. 

Dave Bittner: And I'm coming at it from the point of view of we talk about - a big topic throughout this election cycle has been disinformation. And what I'm curious about, what I want to pick your brain about is when disinformation intersects with a public health crisis, when we have folks who don't believe that the thing that is killing thousands of their fellow citizens every day is real, how do we combat that? And what are we allowed to do? 

Dave Bittner: The people who are spreading this disinformation - is it their constitutional right to do so? Are there any tools that we have to prevent them from doing so in the name of public health? 

Ben Yelin: So it's an excellent question. Let's take it from two different angles. We'll start with the government angle. So can the government arrest you and prosecute you for spreading false information that gets people hurt? 

Ben Yelin: I know that people love to bring up this old quote from a Supreme Court case that the First Amendment wouldn't allow you to shout fire in a crowded theater. 

Dave Bittner: Right. 

Ben Yelin: People bring that up all the time to justify restrictions on free speech. And that quote is almost always misplaced and out of context. 

Dave Bittner: Really? 

Ben Yelin: First of all, it's a Supreme Court case that was largely disfavored as the First Amendment jurisprudence developed. It was a case that tried to cut down on peaceful dissent during World War I. So, you know, it's not something that hardcore First Amendment advocates want to hang their hat on. It was justifying charging political dissidents in the United States with espionage. 

Ben Yelin: The other thing is it wasn't - that particular line didn't have a role in that decision. It's what we call dicta in the legal world. It's just sort of something that Justice Holmes, Oliver Wendell Holmes, was saying and not something that was entirely relevant to the case. 

Dave Bittner: Just kind of an aside. 

Ben Yelin: It's an aside, exactly. And, in fact, Justice Holmes ended up changing his view on the First Amendment as the years went on. He actually wrote a very famous dissent saying we need to have more robust protection of the First Amendment, even if words are particularly dangerous, because we want to protect this so-called marketplace of ideas. 

Ben Yelin: That's where we are now. The First Amendment has very robust protection, even for false information. And that's because there are very few exceptions to the right to freedom of speech. There are exceptions related to torts like defamation and libel, especially as it applies to private individuals. That's not happening here. You know, there are exceptions related to profanity and obscenity. Those things kind of fall outside of First Amendment protection. False advertising - you can't claim that your vitamin cures cancer and have... 

Dave Bittner: Right, right. 

Ben Yelin: ...The government allow that to go on air. But in terms of regulating what private individuals are telling one another in whatever forum, that just doesn't fall under one of those exceptions. 

Ben Yelin: The one I think that might be the closest is what we call speech that has the tendency to cause an imminent lawless action. So a lot of people would say, like, this type of speech - spreading misinformation about this virus is going to get people killed. Isn't that similar to starting a riot by alleging that a person in the vicinity is a traitor, has committed a crime and spurring your violent mob into action? 

Dave Bittner: Right. 

Ben Yelin: I just don't think the Supreme Court would see it that way. That was a very narrow decision when they came up with this principle of imminent lawless action. It requires some level of imminency where there's no opportunity for cooler heads to prevail. 

Ben Yelin: I think what the Supreme Court would say, what legal scholars would say here is our protection is in this marketplace of ideas. The way to respond to false information is by saturating our discourse with true information. 

Dave Bittner: Those old-fashioned social norms, right (laughter)? 

Ben Yelin: Exactly. Now, that all sounds really good in theory. And a lot of law students tell you, you know, I believe in the Justice Holmes dissent in the Abrams case. I support this marketplace of ideas. If I'm a family member of a person who's died because they got false information, you know, that's not going to be satisfying to me. 

Ben Yelin: So the question is, you know, what can we do about it? I think the best angle to focus on is how we can regulate social media because this information spreads most quickly on Facebook and other social media platforms. That's, you know - Facebook has some agency as a private company. They can't arrest you, but they can ban certain false or misleading information. And they have in a number of circumstances. 

Dave Bittner: Right. 

Ben Yelin: They put little warnings on posts about things related to the election. And we know because of what we talked about with Section 230 that they can't be held liable for some of these content moderation decisions. So I think our ire has to be focused on these large social media companies who are facilitating the spread of false information and how we can stop them from doing so. 

Ben Yelin: I don't think we're going to ever end up in a situation where the government is arresting people for spreading false information. That's just not in the constitutional tradition of this country. And I apologize for this very long-winded answer. 

Dave Bittner: No, no. I mean, it's great to actually get the real information on it. I mean, the whole thing about shouting fire in a crowded movie theater - I mean, I think most of us have heard that, you know, our whole lives. And I had no reason to really think twice about it. So I find that fascinating. 

Dave Bittner: OK, so let's continue down this path a little bit. I have a couple questions for you. What about in times of emergency or war? Are there any - does that change anything? 

Ben Yelin: So there have been narrow First Amendment exceptions in times of war related to extremely sensitive things, like troop movements. So if you were to reveal where a brigade was going in Vietnam, then that's not protected under the First Amendment. That, again, is a very narrow exception. That doesn't apply to any sort of dissenting opinion on the emergency that's potentially based on false information. So I don't think that's going to save anybody one way or another. 

Dave Bittner: OK. 

Ben Yelin: We do have restrictions on the First Amendment inherent in emergency laws. If states are instituting stay-at-home orders, you know, I can be arrested if I go hold a rally in the middle of a busy town square. But that is what we would call a time, place and manner restriction. They're not forbidding me from saying a particular thing. 

Dave Bittner: Right. 

Ben Yelin: They're forbidding me from speaking in that particular setting. 

Dave Bittner: Right. You could still go livestream on YouTube and say whatever you wanted to from the comfort of your home. 

Ben Yelin: Exactly, exactly. 

Dave Bittner: Yeah. 

Ben Yelin: So what the Supreme Court really disfavors is content-based restrictions on speech. You have to have a darn good reason to do that. And that means that you're protecting a lot of false information. 

Dave Bittner: Yeah. 

Ben Yelin: I don't want to defend the courts too much on this because I do think, you know, that's not - that's going to be small comfort to people who are being tangibly hurt by this. 

Ben Yelin: But there is a real line-drawing problem here. I mean, how does the government conclusively determine what counts as false information? You know, how do they prioritize prosecution? How do we figure out who was the first person to spread this disinformation? And there are a lot of really difficult questions there. And I think that goes into this general philosophy of let the marketplace of ideas flourish and hope that the truth wins out, even though it doesn't. 

Dave Bittner: Yeah. You know, to your idea of some regulation for the social media companies, I've been thinking a lot about this idea of, you know, do we need something akin to the FDA for social media companies? 

Dave Bittner: In other words, you know, for the FDA, a new medicine comes out, and it has great promise and the ability to help people, you know, cure things, whatever. Well, that has to be tested. It has to be - it goes through rigorous testing before it's allowed to be put out there with the public. 

Dave Bittner: I wonder if a similar thing with algorithms needs to be put in place where before you start using these algorithms to put information in front of people, it has to be tested. It has to - you have to demonstrate to us that it will first do no harm. 

Ben Yelin: Yeah, I mean, the social media purveyors in this country would you-know-what a brick if that happened. 

Dave Bittner: (Laughter) Yeah, right. Yeah. 

Ben Yelin: They're making these types of decisions on a daily, sometimes hourly basis. They're very dynamic. You know, they're innovators. And they would hate to have to go through a stringent government approval process. So I don't - I - that's a very interesting idea. I don't see that happening anytime soon. 

Dave Bittner: Yeah, it's like, meanwhile, back in the real world... 

Ben Yelin: Yeah, exactly. 

Dave Bittner: (Laughter) Yeah. 

Ben Yelin: At least, you know, in the comfort of this podcast, we can talk about these interesting theories, but... 

Dave Bittner: That's right. That's right. In our ivory tower on our perches. 

Ben Yelin: Yes, exactly. That's not something that I would foresee happening in the future. 

Dave Bittner: We'll have a link to that Twitter thread in the show notes if you want to check that out. 

Dave Bittner: We would love to hear from you. If you have a question for us, you can call us. It's 410-618-3720. You can leave a message, and we may use your question on the show. You can also send us an email. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Laura Noren. She is an NYU visiting professor of data science, and she is the VP of Privacy and Trust at Obsidian Security. Our conversation centered around the notion of who governs the cloud and what sorts of data protection regulations are actually enforceable. Here's my conversation with Laura Noren. 

Dave Bittner: Capable guardianship - that is something we haven't really dug into on this program before. I was wondering, could you start off by giving us a description of what exactly that means? 

Laura Noren: Sure. Yeah, that's the way we look at privacy from our perspective at Obsidian. It means not so much that we think of data in an ownership model, which is often how it's discussed. We don't think of owning data from our customers, right? That isn't really accurate. 

Laura Noren: We do think that they're trusting us to take capable guardianship of it. They want insights from it that they would like us to provide. And they also want to make sure that no additional insights are being derived from that data that they're trusting us with. And so that's how we look at data, and it really helps us structure our privacy program. 

Laura Noren: And we also - I would love it if more of the world looked at it this way. We are seeing some data trusts forming in other parts of the data ecosystem. So health data sometimes is seen in that way. And I would say the medical data ethicists are kind of ahead of some of the field by looking at data that way because it makes it a little bit more possible to think of being entrusted with data without having to think of just completely limiting people from using data, which a lot of the regulators seem to be struggling with. 

Laura Noren: How can we allow companies to provide value using data without completely crippling what they do by putting national boundaries around it or telling them they have to delete certain parts of it that may or may not actually increase the privacy or the empowerment of the end user all that much? But it does certainly provide some compliance hurdles for people who are really trying to do the right thing, which I think most people are really trying to do the right thing. 

Dave Bittner: In an ideal situation, how would you define data? 

Laura Noren: Oh, data is anything that is a digital record of a person, an activity, anything like that. I suppose, probably, there are some artists out there who could even broaden that by saying it doesn't have to be digital. But when I think of data, I think of it as a digital record that is either a reflection of an identity or an activity. 

Dave Bittner: And where do we find ourselves right now in terms of, as you put it, the guardianship of that data in different nations around the world? 

Laura Noren: The reason that this has become such an intense debate now - data, in fact, are not all that new. We started taking census data, you know, over a hundred years ago. So data isn't what's new. But what is new is that we've never had such a complete data record of individuals or activities ever before. 

Laura Noren: We sort of got to this place, along with Moore's Law, and we've now graduated to a place where we can have a fuller data record of a person than they would even be able to remember about themselves. So the digital has in some ways exceeded the human, which is pretty remarkable. And it raises a lot of emotional considerations about, you know, how do people feel about this? How should societies address this particular turn of events? 

Laura Noren: And we've seen that some countries have decided that they would really like to be, I would say, strong players in the data guardianship game. And they're passing a lot of laws. And those laws include things like the GDPR that gives data subjects the right to know what's been held about them and to correct that record and to maybe delete that record. 

Laura Noren: And those are not necessarily horrible things, but it's not quite in tune with what - if you think through that capable data guardianship model, there's maybe kind of a weakness on the other side of that. You know, what if I do want to trust a company with my data, and I don't really want to pull the nuclear option, which is to just delete it if things aren't going well? What other tools can I use to empower myself to get that company to give me better service besides just deleting data or maybe correcting it here and there? That's really a limited model. 
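For readers who want to see what the GDPR rights Laura mentions look like in practice, here is a minimal, hypothetical sketch of a data-subject request handler covering access (Article 15), rectification (Article 16) and erasure (Article 17). The store and function names are made up for illustration and are not anything specific to Obsidian or any real system.

```python
from enum import Enum

class RequestType(Enum):
    ACCESS = "access"      # GDPR Art. 15: right to know what is held about you
    RECTIFY = "rectify"    # GDPR Art. 16: right to correct the record
    ERASE = "erase"        # GDPR Art. 17: the "nuclear option" of deletion

# Toy in-memory store standing in for a real customer-data system.
records: dict[str, dict] = {
    "subject-123": {"email": "old@example.com", "activity": ["login", "export"]},
}

def handle_request(subject_id: str, kind: RequestType, corrections: dict | None = None) -> dict:
    """Dispatch a data-subject request against the store."""
    if subject_id not in records:
        return {"status": "not_found"}
    if kind is RequestType.ACCESS:
        return {"status": "ok", "data": records[subject_id]}
    if kind is RequestType.RECTIFY:
        records[subject_id].update(corrections or {})
        return {"status": "ok", "data": records[subject_id]}
    # RequestType.ERASE
    del records[subject_id]
    return {"status": "erased"}

# Example: the subject corrects their email rather than deleting everything.
print(handle_request("subject-123", RequestType.RECTIFY, {"email": "new@example.com"}))
```

Laura's point is that correction and deletion are blunt instruments; a richer guardianship model would offer more options between "fix this field" and "erase me entirely."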

Laura Noren: So we like to think of capable data guardianship in a much broader way. You know, if a customer comes to us with (unintelligible) requests or with a different way that they want us to deal with their data, we're really open to that. And these legislations - you know, as always, it's got to fit a much broader group of people, and it tends to be rather inflexible. 

Laura Noren: So it's been really interesting to see as some countries are moving towards - and I assume this is a pendulum that will swing back because this is not going to work. They're moving towards just trying to keep all of their data within their country so that they can really tend to it very carefully. And that is sort of out of step with the way that most companies are moving into the cloud. And the cloud doesn't really fit in a particular country. I mean, sure, servers do, but it's just not really the right regulatory approach. It's going to be disempowering for those countries that continue to follow that goal. 

Laura Noren: You don't really - I mean, as a country, you don't want to opt yourself out of moving to the cloud. That's going to have a lot of economic repercussions that are not desirable. 

Dave Bittner: Yeah. I mean, it strikes me, and I'm sure I'm oversimplifying it here, but, you know, for example, if I move my data to one of the big cloud providers, I don't really have any idea how they're backing things up. They may be storing their backup copies of my data, which I would think is a good thing that they're backing up my data, but I don't know where it is physically. It could be anywhere in the world. 

Laura Noren: Right. And do you particularly care? I mean, you do care if there's a failover problem. But if they fail over into a country that's not the U.S., is that your main concern as a user? Probably not. 

Laura Noren: And so this is kind of the conversation at the legislative level. And there was just a new ruling today that kind of furthers the EU down this path where they're trying to keep a lot of their data within country. And their current concern is that there may be nation-state surveillance apparatus that is collecting data from internet service providers and telecom providers and that that would be too much surveillance, and so those data transfers to those countries should just stop. 

Laura Noren: Obviously, that's not going to happen. You know, the current kerfuffle around Privacy Shield was kind of brought up by concerns like that. But nation-state-level surveillance isn't particularly new. 

Laura Noren: Again, there is something new under the sun. Like, the amount of data that you can derive from any given person is much more complete, and it's much cheaper to replicate, so you can store it for a longer period of time. That's new, and that is something we should be thinking about. But the idea that you're going to address that by only letting data be in certain countries, it's just not the right solution for that problem. 

Laura Noren: And I speak, you know, really as a tech ethicist, not as someone who's out there advocating for startups to change the world with new products. It's just - it's not a very robust solution. 

Laura Noren: And I wish that there were a more robust solution coming from the governmental sector because I do think there need to be some guardrails on what can be done with data. I just don't think that keeping it in one country is going to address any of the actual ethical concerns, and it will certainly create a lot of headaches that are basically unnecessary. 

Dave Bittner: Do you suppose that there are potentially technical solutions to this that would make it irrelevant? I'm thinking of, you know, things like encryption. If the data were locked up tight, then it really wouldn't matter where it was stored if no one can look at it. 

Laura Noren: Yeah, I think encryption is definitely a huge component of this. And it's interesting that privacy and security often get spoken about in different committees or by different groups of people. If, you know, those two were working together, we'd probably move a little faster. And certainly, encryption has entered the privacy conversation, and security was always a little bit of privacy anyways. So that's an excellent strategy. 
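As a minimal illustration of why encryption changes the data-residency question, here is a sketch of client-side encryption at rest using the Python cryptography package's Fernet interface. If data is encrypted before it reaches a cloud provider, backup copies stay opaque to that provider no matter which country they land in. This is only an example, not a description of how Obsidian or any provider mentioned here actually handles data.

```python
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # whoever holds this key can read the data
cipher = Fernet(key)

record = b'{"user": "subject-123", "activity": "login"}'
encrypted_blob = cipher.encrypt(record)   # this is what gets uploaded and replicated

# The provider can back this blob up to any region without seeing its contents;
# only the key holder can recover the original record.
assert cipher.decrypt(encrypted_blob) == record
```

Encryption in transit (TLS) covers the other half of the "in transit, at rest" pair Laura returns to at the end of the conversation.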

Laura Noren: I come from an educational background as a professor for a long time, so I think continuing to educate people about what they need to be concerned about, where the real risks are is also a very empowering strategy. I think a lot of our society has kind of moved into a direction where most things are dealt with by experts, and the average person really can't understand most of what's - like, I can't go out and fix my car. I just can't. But 60 years ago, many people could have at least changed their oil or switched out their spark plugs or something. I can't do that. 

Laura Noren: So as we move more into these expert niches - it makes sense; you get better technology if you do that - you need to ramp up that educational component a little bit. And that is kind of part of the GDPR. Transparency is one of the pillars that guide that piece of legislation. And the underpinning to transparency is the assumption that if people understand better how these systems work and what the implications are of taking certain choices, they can then make those choices in an informed way. 

Laura Noren: I think mostly what we talk about when we talk about privacy is really power. How is the end user empowered to make choices that work for them? If you flip the conversation that way, it gets a lot more interesting, and it makes a lot more things possible. 

Laura Noren: If you're thinking of just protecting privacy, then it's not that surprising that people think, well, maybe if we just don't let the data spread around, it will be more private. Well, it might be, but putting the data in a closet's probably not going to get you where you want to go anyways. It could be private, but it's not going to be empowering at all. 

Dave Bittner: Where do you suppose we may be headed with this? Do you have any sense for likely directions that we may see these issues going? 

Laura Noren: Well, OK, for your listeners, this is, you know, sheer prognostication here. What we have seen is that as the U.S. and China really advance their AI capacity and have really outstripped most other countries quite a bit in terms of developing machine-learning techniques and building up the datasets required to do that well or to do it poorly - we're doing it; it's happening, and we're advancing rapidly there - we've seen Europe kind of enter the national - international stage as the data regulator of the world. 

Laura Noren: Of course, it's very frustrating because, the U.N.'s ICJ notwithstanding, we don't really have an international regulator. So for the EU to just self-nominate into that role has been frustrating for a lot of companies and a lot of countries. Some within the EU itself, I think, are a little bit frustrated with where that court is going. 

Laura Noren: I anticipate that we'll just keep seeing that. We'll see a lot of technological developments in the U.S. and China, and we'll see a lot of regulatory development coming out of the EU. 

Laura Noren: And we have also seen California and its legal exceptionalism trying to put in place the CCPA, which is already in effect. And it is back on the ballot again this fall, and it looks likely to pass. So that will also increase some of the guardrails around data in California, which does tend to act as sort of a high bar in the U.S. for what can be done with data. 

Laura Noren: And they copied a little bit of the GDPR in the sense that they made it basically apply to the whole country. If you do business in California, then you must abide by the California Consumer Privacy Act. And just having a website that's accessible from California, as long as you have a lot of customers, will put you in scope for that. 

Laura Noren: So I assume that we'll see more of these thought leaders in the legislative arena copy the GDPR and try to make their legislation as broadly applied as possible. So the cat and mouse will sort of be between tech innovators in the U.S. and China and regulators mostly in the EU, but a little bit in California. It will probably continue. 

Laura Noren: And we're seeing the EU and a few other countries outside the EU really move towards trying to keep data within their national boundaries so that they can strengthen up their capacity to oversee what happens to it and bring cases against companies that have legal standing. They enjoy large fines as their main mechanism. That hasn't been scary enough, so now they're trying to shut down data transfers altogether, which the Irish data protection authority did ask Facebook to please stop transferring data outside of the EU, and Facebook sort of politely declined. So that's how that's going to go. 

Dave Bittner: Well, and it seems to me that it's very easy and often happens that people conflate privacy and security. 

Laura Noren: Yes. 

Dave Bittner: And they're not the same thing. 

Laura Noren: And they go together a lot until they go head to head against each other, right? I would prefer that we just don't retain data for very long, and my security team would prefer that we retain it for as long as possible. 

Dave Bittner: Right, right. 

Laura Noren: So there are certain fundamental differences, but both of us really think you should encrypt everything, you know, in transit, at rest, and be careful about that. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Very cool conversation. I'm kind of fixated on what she was saying at the end... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which is that, you know, the interests of privacy and security are usually aligned, but there comes a point where they're not always aligned. And, you know, I think that has big implications on a micro level at workplaces. There are going to be different priorities in terms of protecting the integrity of information and maintaining privacy, which means that you might be doing it wrong within your organization if you are 100% focused on securing information and not focused on protecting private information. I just thought that was an eye-opening and really interesting thing for her to say at the end there. 

Dave Bittner: Yeah. It's very interesting to me from a broad point of view how often security and privacy get lumped together in one bucket, and they're not the same thing. Certainly, there's overlap, but they are different. 

Ben Yelin: Right. They are different. Security, as she said, sometimes involves collecting even more data so you can properly analyze your risk. And the goal of privacy advocates is to collect as little data as possible. So sometimes there really are conflicting interests. And in the cybersecurity world, I just think we can't have this default assumption that if something is in the interest of security and protecting data that it's necessarily in the interest of privacy. 

Dave Bittner: Yeah. All right, well, again, our thanks to Laura Noren for joining us. So we do appreciate her taking the time. 

Dave Bittner: We want to thank all of you for listening. That is our show. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.