Data privacy around the world.
Annick O'Brien: Because it was less about the how and more about the why. It was less about can I prove that we did this thing and can I check this box to say that it's done, and more about have I actually, properly communicated to people how their data is being used?
Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today, Ben talks about the debate around reauthorizing Section 702 of the FISA Amendments Act. I cover the online sale of your mental health data. And later in the show, Annick O'Brien, global data privacy lawyer at CybSafe.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, we've got some good stories to share this week. Why don't you start things off for us here?
Ben Yelin: So this story is kind of a heads-up. In December of this year, Section 702 of the FISA Amendments Act is due to expire. So Congress is going to have a debate around whether to reauthorize that act, either in whole or in part. Just to refresh people's memory, Section 702 of the FISA Amendments Act is a national security surveillance program whose details were revealed as part of the Edward Snowden disclosures a decade ago. It allows the National Security Agency to collect communications from non-U.S. persons reasonably believed to be outside of the United States for foreign intelligence purposes.
Ben Yelin: Now, most people wouldn't care about that because, hey, they're not Americans. Why should we care about their communications being collected? But unfortunately, for those of us in the United States, there's something called incidental collection, meaning if a U.S. person is on the other end of that communication with an overseas target, then that entire communication, that discrete communication, can go into a searchable database. And generally, the government does not need a warrant to access that database. So the fear is that this would be a form of a backdoor search. If you don't have probable cause to surveil an American but catch them talking to an overseas target, then you can get access to those communications in this sort of backhanded way.
Ben Yelin: Congress last reauthorized Section 702 in the very beginning of 2018, and they only made minor changes to the statute. One minor change they made is if there is an advanced criminal investigation and the government wants to search the NSA database for evidence in that advanced criminal investigation, they do have to obtain a warrant. Other than that, though, in any other circumstance, including a situation where you might be curious if somebody has committed a crime but you're not at that endpoint state of a criminal investigation, you are free, the government is, to search that database.
Ben Yelin: So the article that kind of was the basis for this discussion came from Lawfare, from Jeff Kosseff, who is a really prominent academic in this area and works in the U.S. Naval Academy's Cyber Science Department. And his argument is that if Congress wants to protect Section 702, it needs to rein in the FBI. And I think that this is critically important. We've now had a series of reports going back at least five years where the FISA court itself, in opinions that are heavily redacted but later declassified, says that the FBI isn't following proper querying procedures when searching this data. There have been, basically, an uncountable number of violations of the spirit of the statute. When the report was released in 2017, one of the FISA court judges detailed really widespread abuse, whether intentional or not, on the part of FBI employees, creating this kind of serious risk of unwarranted intrusion into the private communications of U.S. persons. That was a shot across the bow to the FBI that it needed to clean up its act. And...
Dave Bittner: (Laughter) Say, hey, knock it off.
Ben Yelin: Yeah. And apparently that has really not taken place. There's a 2019 opinion in which that same FISA judge held that there still appear to be widespread violations of the querying standard by the FBI. So among some of the unauthorized FBI uses of Section 702: they screened a local police officer candidate to figure out whether that person was qualified to be hired for a job - shouldn't be using a 702 database for that - vetted a potential confidential source, investigated people who visited an FBI office to perform maintenance, and investigated college students who applied for the FBI Collegian Academy (ph). So these are pretty blatant violations of this statute, which is intended to cover searches related to national security.
Ben Yelin: There are a couple of complicating factors here. The first complicating factor is the FBI and FISA have been under the political microscope, largely because of the Crossfire Hurricane investigation and some anger from political conservatives in this country. They think the FBI was on a witch hunt against Trump and that there were abuses in the FISA application for the warrant on Carter Page, who was a former Trump campaign employee. More importantly, though, if this were any other program, it might be easy to just say, let's scrap it.
Ben Yelin: Any person who works in intelligence and signals intelligence will tell you that Section 702 is the crown jewel of our intelligence apparatus. It has been used countless times to stop terrorist plots in this country and around the world. And it was the authority used to locate Ayman al-Zawahiri in Afghanistan. So it provides immense value - so much so that Paul Nakasone, the director of the NSA, testified at a recent hearing of the Privacy and Civil Liberties Oversight Board, which is an executive branch entity that reviews these programs, about the need, for national security purposes, to reauthorize the statute.
Ben Yelin: So I think these are really setting the parameters for what should be a really interesting debate in 2023, as this act is due to sunset. And I'm just very curious as to where this is going to go and who is going to put a kind of proverbial spike in the tires of this program as it marches forward.
Dave Bittner: Is there any discussion as to what a possible compromise could be to still give the signals intelligence folks what they need from this, but maybe put some speed bumps in the way of the types of things that the FBI's alleged to have done?
Ben Yelin: Sure. So the main alternative proposed by civil liberties advocates from both parties would be to prohibit the FBI from querying the Section 702 database for any purpose other than obtaining foreign intelligence information.
Dave Bittner: Does - isn't that already the way it's supposed to work (laughter)?
Ben Yelin: Well, no. I mean, the law right now says that once those communications are lawfully collected, it's in the database.
Dave Bittner: OK.
Ben Yelin: And legally, for Fourth Amendment purposes, you do not need a warrant to search that database in most circumstances.
Dave Bittner: So this is more of a spirit-of-the-law versus letter-of-the-law thing.
Ben Yelin: Right.
Dave Bittner: OK.
Ben Yelin: And it's also - with the Fourth Amendment, you know, we'd like to think that all of these are easy questions, that the language of the amendment is very clear. It's really not. And Fourth Amendment jurisprudence is very frequently about this rather nebulous term of reasonableness. And reasonableness is determined by a totality of the circumstances. So a search can only be reasonable if the government's interests strongly outweigh the invasion of privacy on Americans' communications. And one of the ways you can alter that balance would be to institute a ban on warrantless searches of this database. Now, national security experts would fight against this, and they have many allies in Congress, because they think this would be too cumbersome a requirement. If they have information on somebody that they are pretty sure has committed a crime or somebody who is a threat to commit acts of terrorism, they're not going to want to wait to obtain a warrant, whether it's from a district court or from the FISA court itself.
Ben Yelin: There have been other suggestions. There was one made the last time this was up for reauthorization, from those who believe that each FBI query of already-collected Section 702 data should be considered a separate search. So there should be, really, an individualized analysis for Fourth Amendment purposes for every single distinct search of Section 702 data, not on a programmatic level. So those are just a couple of the potential solutions here.
Ben Yelin: But national security - the national security state - and its representatives are going to fight tooth and nail to maintain this authority the way it is. I suspect that the Biden administration will be deferential to intelligence agencies and also fight to have this largely extended as is because it's such a valuable counterintelligence tool.
Dave Bittner: So just so I can be really clear in my understanding of this here, I mean, the purpose of this is to collect data on foreign nationals, right?
Ben Yelin: Right.
Dave Bittner: Right. And so sometimes U.S. citizens have conversations with foreign nationals that might be interesting for people who need it for national security, right?
Ben Yelin: Right. So if I'm communicating with a terrorist overseas and there's lawful surveillance of that terrorist, I very well might say something incriminating during that phone call or that email or whatever.
Dave Bittner: Right.
Ben Yelin: And the federal government might want to use that to initiate or to continue a criminal investigation into me. Now, normally, to access my own private communications, you'd need a warrant.
Dave Bittner: Right, right.
Ben Yelin: But this is an end around if somebody is talking to an overseas terrorist.
Dave Bittner: So could it be as simple as, if you want to search this database, you can only search for foreign nationals; you cannot search for U.S. citizens because their data is being vacuumed up incidentally, and not primarily?
Ben Yelin: So, yeah. That is definitely a workable solution. It does go against how, at least, courts have considered Fourth Amendment claims around incidental collection. So most of the previous cases relating to incidental collection related to something called incidental overhear. Basically, let's say you have a warrant to surveil Mafia Guy 1, and Mafia Guy 1 is having a phone call with Mafia Guy 2. You don't have a warrant to surveil Mafia Guy 2, but in the process of your surveilling Mafia Guy 1, Mafia Guy 2 might say something really incriminating, and the FBI would not have to get a separate warrant in those circumstances to go after Mafia Guy 2.
Dave Bittner: I see.
Ben Yelin: So I think in the view of the intelligence community, that same standard should apply here. That's how courts have interpreted it as well. There've been several federal court cases on Section 702. All of them have basically come to the same conclusion, that these searches are reasonable if you kind of look at the incidental overhear doctrine and the fact that people really should have a diminished expectation of privacy when they're communicating with anybody overseas, just because people know, or should know, that overseas targets don't have the same Fourth Amendment protections that U.S. persons do - not saying any of that is right. That's just how courts have seen this issue, and that's, I think, how the FBI and the NSA would argue during this reauthorization debate.
Dave Bittner: Right. So, again, forgive my low-level questioning here, but if you - let's say I was hiring - I'm a government agency, and I'm hiring - let's say I'm the FBI, and I'm hiring, you know, Ben Yelin for a job.
Ben Yelin: Terrible decision, FBI.
Dave Bittner: Yes (laughter).
Ben Yelin: Yeah.
Dave Bittner: Just an extreme hypothetical. If I wanted to find out if Ben has been having any phone calls with foreign nationals, in general, separate of 702, I would need a warrant to get that information, yes?
Ben Yelin: Yes.
Dave Bittner: OK. So that's the issue here...
Ben Yelin: Well...
Dave Bittner: ...Is that 702 makes it so that now I have a database that I can search, because of the information being gathered. I can go on a fishing expedition, searching for your name to see if you've had any conversations with foreign nationals, without a warrant.
Ben Yelin: That's exactly right.
Dave Bittner: OK.
Ben Yelin: Yeah. Now, if you were concerned that I had illicit conversations with a U.S. person, completely different story. You'd need a warrant to access those communications.
Dave Bittner: Right.
Ben Yelin: You know, that's really the bread and butter of the Fourth Amendment. But we do have case law. There's this famous 1990 case, Verdugo-Urquidez, which stands for the proposition that non-U.S. persons who are not located in the United States don't have Fourth Amendment rights. They are not part of what the Supreme Court called the national community, for the purposes of the Fourth Amendment. When you combine that with this incidental overhear doctrine, the combination of those two legal theories would lead to the result that we get, which is that U.S. persons' communications are available in this database, and they are searchable.
Dave Bittner: What's your sense for how this could go?
Ben Yelin: Ah. I wish I had a crystal ball on this.
Dave Bittner: (Laughter).
Ben Yelin: I also wish there was a - like, a gambling website where you could...
Dave Bittner: (Laughter).
Ben Yelin: ...Take bets on the likelihood of this being...
Dave Bittner: I don't see why not.
Ben Yelin: ...Reauthorized.
Dave Bittner: You can bet on everything else these days.
Ben Yelin: I know.
Dave Bittner: Believe me, I've seen...
Ben Yelin: Come on, FanDuel.
Dave Bittner: We've all seen the ads.
Ben Yelin: Right.
Dave Bittner: (Laughter) Put in your bet on what the Supreme Court will do.
Ben Yelin: I'm trying to think of, like...
Dave Bittner: Your first bet is free.
Ben Yelin: Yeah, exactly.
Dave Bittner: (Laughter).
Ben Yelin: Five dollars in free bets. My general instinct is that the path of least resistance is a lot of arguing and debate, but then, ultimately, this getting reauthorized with only minor alterations and changes. I could very well be wrong. You know, there is a subcommittee in the House of Representatives right now that's been formed to go after alleged misconduct in the FBI. Does that inform how House Republicans see Section 702? I'm not sure. What happened last time is Congress was about to reauthorize Section 702, and there was a Fox News commentator - I believe it was Andrew Napolitano - who's been a longtime opponent of Section 702 and wanted to scuttle the reauthorization. And during the Trump years, there was really one way to get in the president's ear, and that was to go on "Fox & Friends" in the morning. So he did, the day that the vote was coming up in Congress, and said, you know, this is the program that was used to surveil President Trump. It worked. Trump wrote a tweet saying, basically, we should not reauthorize this; this is the tool that was used to spy on me and my supporters. And then you can kind of tell that Paul Ryan, who was speaker of the House at the time, freaked out and called Trump, and Trump released a separate tweet, like, 2 hours later, saying, this is just the program that's intended to target terrorist bad guys...
Dave Bittner: (Laughter).
Ben Yelin: ...So we should approve that. And it was like, OK, so that's exactly how they explained it to him over the phone.
Dave Bittner: I see.
Ben Yelin: Got it.
Dave Bittner: OK.
Ben Yelin: So it's just kind of a funny anecdote. I would guess, if I had to handicap it, that I do think it'll be extended, with some minor amendments, some minor policy changes.
Dave Bittner: Yeah. I mean, is there some way to do a hand slap to the FBI and say this time we mean it?
Ben Yelin: There is. There are a couple of problems. You know, whenever we get one of these FISA court opinions, it's always kind of a view into the past. It's like when you look at a star. You're seeing what happened...
Dave Bittner: Oh, yeah.
Ben Yelin: ...Billions of years ago.
Dave Bittner: Right.
Ben Yelin: Because they're only declassified, you know, one or two years down the line. And we don't know whether whatever was wrong with the previous application of the program is still an issue. There's just that time lag. There are things they can do to increase transparency, and I think that would be a very realistic amendment here. One of those things would be to simply require reports - semiannual, annual - on how many U.S. communications are being captured through this Section 702 database. It's been very difficult for journalists and even government agencies to obtain that data. So some requirement for that data to be made available at regular intervals to the public might be a way that we can get some reform on this.
Dave Bittner: OK. All right. Well, as you say, it's one to keep an eye on. It's fascinating stuff.
Ben Yelin: Yeah. I'm sure we'll come back to it as the year goes on. I mean, knowing that it's Congress and they have nine months to work on this, I'm guessing we're not going to see any actual, you know, legislative action until October or November at the earliest.
Dave Bittner: Right. Right. When it comes to Congress, there's no minute like the last minute.
Ben Yelin: Exactly.
Dave Bittner: All right. Well, we will have a link to that story in the show notes.
Dave Bittner: My story this week comes from The Washington Post. This is an article written by Drew Harwell, and it's titled "Now for sale: Data on your mental health." And this article was prompted by a study that was published just earlier this week from a research team at Duke University's Sanford School of Public Policy. And it's really about the sale of private information - in this case, the sale of mental health information. The researcher from Duke, Joanne Kim, found 11 companies willing to sell bundles of data that included information on what antidepressants people were taking, whether they struggled with insomnia or attention issues, and details on other medical ailments, including Alzheimer's disease or bladder control difficulties. Some of that data was offered in an aggregate form. So, for example, you could see how many people in a particular zip code might be suffering from depression. But other data was available that could be connected with someone's address - connected in a way that it wouldn't be that hard to figure out whose information it was.
Dave Bittner: This article goes on to talk about some of the really creepy types of information that was being tracked here.
Ben Yelin: Yeah. This is really bizarre, when you get into these pieces of information being tracked.
Dave Bittner: Yeah. There's one in particular that I wanted to highlight here. Let me find it in the article. In 2013, Pam Dixon, who is the founder and executive director of the World Privacy Forum, which is a research and advocacy group, testified before a Senate hearing that an Illinois pharmaceutical marketing company had advertised a list of purported rape sufferers, with a thousand names starting at $79.
Ben Yelin: God, that's just gross. Hearing it out loud - yeah.
Dave Bittner: It's despicable. Yeah. It's absolutely despicable. So I think at the root of this is a gaping hole in HIPAA, which requires our personal, private health information to be protected by certain organizations. The health professionals...
Ben Yelin: Right.
Dave Bittner: ...Have to protect our data through HIPAA, which is what HIPAA was intended to do and by all accounts does a pretty good job of it. But these data brokers are able to collect, aggregate and connect our data - and, I mean, what detail about a person is more intimate than whether or not they've been raped?
Ben Yelin: Yeah. I mean, that part is really disturbing to me. HIPAA applies to hospitals, doctors' offices and other so-called covered health entities that share Americans' health data.
Dave Bittner: Right.
Ben Yelin: Because of the pandemic and just because of advances in technology, particularly when it comes to mental health, we are sharing information with a lot of different companies and organizations that aren't protected under HIPAA. And I think that's a real weakness of that law. I don't think the drafters of HIPAA ever could have imagined the vast online space where people would be sharing private information, and that that information would be valuable enough to sell to brokers, who could sell it on to make a profit. I don't think the drafters of HIPAA in the mid-1990s could have foreseen this. It would be nice if lawmakers would do something about this.
Dave Bittner: Yeah.
Ben Yelin: But this goes back to the thing we basically say every week on this podcast: data brokers are completely unregulated at the federal level, and there are only a couple of states that have comprehensive regulations on selling personal data. We have no federal data privacy law. This is a full-on vacuum. And until we have that law and until HIPAA is modernized so that it covers other entities that collect our personal health information, you know, we can admonish these companies all we want. And certainly we do. It's immoral what they're doing.
Dave Bittner: Right.
Ben Yelin: But the fact of the matter is it's profitable, and they're going to do it as long as they can get away with it.
Dave Bittner: Yeah. This article points out that there are some senators - Elizabeth Warren, Ron Wyden and Bernie Sanders - who, I suppose, in this context, it's fair to call the usual suspects, right?
Ben Yelin: Yeah. I feel like we've mentioned - if you did a word search on Wyden in the transcripts of our episodes...
Dave Bittner: Right, right.
Ben Yelin: ...You'd get a lot of hits.
Dave Bittner: Yeah. So they're backing a bill that would strengthen both state and federal authority against health data misuse and restrict how much reproductive health data can be collected and shared. And, of course, this also needs to be put in the context of the Supreme Court overturning Roe v. Wade and data being available for folks who are seeking or who have visited abortion clinics...
Ben Yelin: Right.
Dave Bittner: ...For abortion medication. This data is being collected, and the consequences could be that someone is accused of committing a crime.
Ben Yelin: Yeah. I mean, data brokers are interested in, can we sell this person diapers? But we're talking now, potentially, in a post-Roe era, about criminal prosecutions being predicated upon data that was collected without the users' knowledge and has just kind of floated around through data brokers from company to company and ends up in the hands of law enforcement, either via subpoena or some other investigative tool. And, you know, regardless of how you feel on the abortion issue itself, that's certainly a matter of concern as it relates to user privacy. And there's only so much we can do - you know, our federal agencies, I think, are taking a piecemeal approach to attacking this problem.
Ben Yelin: We've talked about a couple of FTC actions related to inflicting civil penalties on organizations that sell this personal data. We talked about the one last month from the online prescription drug service GoodRx after the company was charged with compiling lists of users who bought certain medications and using that information to target Facebook ads. The FTC can do that in limited circumstances. But until we actually have a comprehensive federal data privacy law, we're really only going to be able to attack this problem on the edges and not at this really foundational level of, you know, why is there this entire industry centered around our own personal information, and what can we do about it?
Dave Bittner: Yeah. I can't help wondering if this is the kind of thing that will only see attention if someone like comedian John Oliver were to buy up the data, all of the mental health data that he could...
Ben Yelin: Of congresspeople?
Dave Bittner: Of Congress. Yeah.
Ben Yelin: I think that's coming.
Dave Bittner: Right.
Ben Yelin: I think we just gave him a good idea for an episode.
Dave Bittner: Well, I mean, he - remember, he already did it. There was an episode he had, I want to say - oh, gosh - over a year ago, where he was buying location data of congresspeople.
Ben Yelin: Yup.
Dave Bittner: Not much seemed to come of that, but I'm just imagining the episode where, if he bought up information like this and even threatened to name names, would that be enough to rattle Congress enough that they would do something about this? I don't know.
Ben Yelin: I don't know. It's probably our best shot. I still think John Oliver could make a really interesting episode out of it.
Dave Bittner: Yeah.
Ben Yelin: I think he - if I'm not mistaken, he's already done something on data brokers.
Dave Bittner: Yeah.
Ben Yelin: I think specifically related to mental health would be a really interesting angle. I mean, it is exploiting vulnerabilities, because I think all of us have suffered the mental health effects of living through an extended pandemic.
Dave Bittner: Right.
Ben Yelin: And more people than ever are searching out online mental health services. So it is exploiting people's vulnerability. One of the quotes here, from a person involved in the study, said we shouldn't have a, quote, "tasting menu for buying people's health data." And I think that summarizes it quite well.
Dave Bittner: Yeah. I agree. All right. Well, we will have a link to this story in the show notes. Again, this is from The Washington Post, written by Drew Harwell.
Dave Bittner: Ben, I recently had the pleasure of speaking with Annick O'Brien. She is a global data privacy lawyer at an organization called CybSafe. Here's my conversation with Annick O'Brien.
Annick O'Brien: Data minimization is one of the principles, of course, of GDPR. And the challenge that we find with data minimization is, firstly, that a lot of people in organizations have difficulty pushing the concept of data minimization into practice. So taking what is a principle and turning it into a process can sometimes be a little bit challenging. And, of course, the industry itself has developed. All industries develop in such a way that everybody is calling data the new oil. I'm not so sure I agree that it's the new oil, but we do see it as a currency. Data has value. So it's like saying that every organization needs to have this valuable asset in order to develop as an organization, in order to make a profit and to be successful.
Annick O'Brien: And at the same time, we have pieces of law like GDPR, but there are many other types around the world. We've got PIPEDA in Canada and POPIA in South Africa. So all over the world we see different types of laws that are telling us that this asset, this thing that we make money from and make a profit from, is something that we should be trying to minimize. So it feels counterintuitive to minimize the thing that we're making our profit from. So what we do is we try and take a step backwards from this. We look at the challenges and the tensions that we find around data minimization to try and understand how we can manage these tensions in such a way that supports our business, minimizes risk and makes us an organization that uses privacy as a competitive advantage.
Annick O'Brien: And increasingly, in our modern world, data protection - which is a part of privacy - is a competitive advantage, because there are a number of elements that we look at here. Data is an asset, but it is also a risk. And a great way of lowering risk in your organization is simply to get rid of it. So this is kind of where data minimization comes in, and this is where we can look at it from another angle. Rather than looking at it from the angle of hindering our colleagues, impeding business processes and minimizing that very thing that is valuable, creates profit and is an asset, we're minimizing risk. We're minimizing the amount of data that is going around our organization that could be the source of a breach, that could be the source of a problem.
Annick O'Brien: And with data minimization - I think in order to really pinpoint what we need to do, we need to ask two questions. So I think there are two important questions that we are asking when we're looking at the concept of data minimization. Firstly, we all acknowledge that not all data is valuable. Not all data is the asset that's going to create profit for our business. Each business has specific pieces of data that are very important for it, and those are the pieces that we want to keep. So what we're looking at minimizing is not those pieces but all the surrounding peripheral pieces, all those other pieces of data that could create risk but that we don't need. They're not helping our business. They're not creating profit. We can get rid of them. So that's the first piece.
Annick O'Brien: And then the second question that we have to ask ourselves is, why did we take this data? What is the reason that we have it? And this builds into the other concepts of the GDPR and other privacy laws around lawfulness, fairness, transparency, accountability and liability, as well as storage limitation. But this whole piece about data minimization is taking the minimum amount of data that you need and only keeping that data for the purpose that you need it. And if we're very clear on why we're taking it, as well as on which data has value for us, then the whole process of minimization becomes a process of risk reduction and a process of safeguarding our organization against risk. And really what it becomes is an enabling factor for us - as privacy officers or people in organizations helping with the data minimization process - it helps us to help the business identify where the real value is, and then the business can focus on that value. And the rest we don't need.
Annick O'Brien: So I think in those two ways, asking ourselves those two questions - which data actually has value for us, and what are the reasons that we're using it - we can help streamline business, we can help reduce risk, and we can help focus the attention of the entire organization onto the pieces of data that are going to create value and profit in a way that aligns with the reasons that we took the data. That aligns with GDPR and with that whole idea of fairness and transparency - telling people what we're using their data for - and being an organization that uses privacy as a competitive advantage. So by asking those two questions, we're enabling the business in many ways to use data in a valuable way and in a way that's going to create profit. And the minimization is about minimizing risk, minimizing superfluous data and minimizing the possibility that there is data floating around that could be the cause of a breach, the cause of a problem or an unhappy data subject with a DSAR. But we don't need that data, so we've minimized it.
Dave Bittner: My sense is that we've gone through a couple of phases here where, you know, in the early days of people collecting this data en masse and when storage became cheap and ubiquitous, it was sort of an attitude of collect everything and keep it for as long as you can because why not? But it seems to me - you know, you mentioned that a lot of people say that data is like the oil of business these days. I wonder if it's more like plutonium where, you know, it can power a lot of things, but if you get too much of it in one place, bad things can happen.
Annick O'Brien: I completely agree. There's the question of too much of it in one place - bad things can happen. And then there's also the idea that too much of it makes it really hard to identify which is the valuable piece. If I go back to the analogy of oil - if I had barrels of oil, I'd know how many barrels I had, I'd know where they were stored, I'd know what the value of them was, I'd have an inventory, and I'd know who I was selling them to and how I was transporting them. When we have too much data - and many organizations that embark on data retention and data minimization projects, and organizations that have breaches, are in this situation - unlike oil, they don't know how many barrels of data they have. They don't know where it is. They don't know what it looks like, how it's structured, how it's protected, who has access to it.
Annick O'Brien: So actually, in some ways, this idea that all data is an asset is something that we, as organizations, really should be moving away from, because having that mindset encourages us to hoard data. And we don't want to do that. In order to comply with legislation - increasingly including international data transfer laws - and in order to be fair and transparent to our customers and consumers, we need to understand what data we have and where it's going. And when we look at developments in data governance and data localization laws, it's very important that we understand exactly which data is being held in the country where it was collected, and exactly which data is being transferred to other countries - how, why and what the requirements are in those other countries. So that plutonium analogy, I think, is a really good one: it's something you don't want to just hoard as much of as you can. Rather, you want to understand exactly what you need and do the correct and safe things with it, so as not to create undue risk for your organization.
Dave Bittner: You know, it strikes me that the EU really led the way here with GDPR, and we're a few years into that now. What is your perspective here? What are your insights in terms of where GDPR has been a success? But then also, have there been unintended consequences, unforeseen things that we've had to deal with?
Annick O'Brien: That's a great question, because with all major pieces of law, like the GDPR, there's always an upside and a downside for business. There's a period of adjustment and a period of uncertainty, and this can be difficult for businesses. GDPR is a principle-based piece of legislation. So for a lot of organizations - especially for compliance lawyers and people who are used to dealing with tick-the-box forms of compliance and audit - it was a difficult piece of legislation to integrate into business processes, because it was less about the how and more about the why. It was less about, can I prove that we do this thing and can I tick this box to say that it's done? And more about, have I actually properly communicated to people how their data is being used, and can they contact me in order to understand this? These were the types of new questions that organizations were having to grapple with.
Annick O'Brien: So there was a period of time when organizations had to do a lot of change from within, because it was less about, again, ticking the box and more about, what is the actual result of the privacy program that we've put in place? How have we actually demonstrated accountability, one of the principles of GDPR? How can we show that we're compliant? Because I always say, compliance is not just about doing the right thing. Compliance is about showing that you've done the right thing. So there was a period of uncertainty for organizations where a lot of time and effort was spent implementing privacy programs. I think we're coming to a point where those privacy programs - to the extent that they are successful, are measurable and can demonstrate compliance and accountability without impeding business processes - are working in such a way that we can say that they are in place.
Annick O'Brien: However, in terms of privacy trends - what has been happening recently and what we expect to happen in the future - there's the whole piece around enforcement. We see a lot more fines in Europe around breach of confidentiality and breach of integrity, which is the GDPR standard on security, and we expect to continue to see fines against organizations for breaching these principles. We've also seen some recent big fines against big tech organizations for breach of the privacy by design and default principle. That's Article 25 - privacy by design and by default - so building privacy into your product. It's no longer acceptable to reverse-engineer privacy into something. As you build and as you develop, you need to be able to show that you're thinking about privacy as you move forward.
Annick O'Brien: Organizations are reporting fewer data breaches than they have previously. At the very beginning of GDPR, we saw many, many reports of data breaches from organizations, and we see that starting to tail off. Some countries report more than others, and that's, in some cases, down to culture. But we do see that organizations are becoming more confident about what is a data breach and, indeed, what is a reportable data breach, because there are two steps when we're looking at a breach. The first is, is this a breach? And secondly, do we need to notify data subjects? Do we need to notify the regulator? Not all data breaches require or, indeed, should involve reporting to a regulator and/or a data subject. But we see that starting to drop off, and we can only imagine that many DPOs and many organizations are coming to a place where they understand how to manage this risk more appropriately than simply reporting every single breach to the regulator.
Annick O'Brien: In terms of trends that we see coming up, there will probably be more discussion and more guidance from regulators around the topic of artificial intelligence, because there is, of course, the Artificial Intelligence Act coming up in Europe. This aims to ensure that artificial intelligence systems placed on the EU market are safe and respect existing law, including laws like the GDPR. So we will see more discussion and more guidance around artificial intelligence. And any organization developing technology in this way now has to be thinking not just about privacy by design and default but, indeed, about embracing AI in a way that shows it is an ethical and responsible organization that uses AI in an ethical and responsible way - again, bringing it back to that concept of using AI and data privacy as a competitive advantage, rather than seeing them as something that's going to impede business or impede data transfers.
Annick O'Brien: And then another topic that I think we could touch on is the whole topic of international data transfers. We may see, in the future, a new Privacy Shield. There may be developments around the protections in place for transfers of data from the EU and Switzerland to the U.S. or, indeed, from the U.K. to the U.S. We wait to see what will develop there. But the whole body of work around risk assessments and transfer assessments is something that's going to continue to be a burden, certainly on small and medium enterprises. So that's something we will hopefully see more developments on as we move into the future.
Dave Bittner: Are we seeing that consumers are tracking this? In other words, you mentioned privacy as a competitive advantage. We've certainly seen some major brands - Apple, for example, is probably the most well-known - who lead with the notion that they protect your data and that privacy is a core value of the company, and they promote that. Are we seeing that that is the reality with consumers - that it does make a difference for them in the things that they purchase?
Annick O'Brien: I think we do see an increase in interest from consumers, and we certainly see that consumers are increasingly aware of their rights, which is quite interesting. Consumers are aware of their right to object, their right to restrict processing, their right to be forgotten. We've seen case law on it, and we see increasing numbers of DSARs - data subject access requests - within organizations. And we see products being developed to help individuals make these requests and to help organizations respond to them. So certainly, we have a much more educated consumer base. Data subjects do understand what their rights are, and they do exercise their rights.
Annick O'Brien: So organizations do need to be fully aware of the rights of data subjects, obviously, but also of how they're going to be able to comply with the law and respond to those requests. And as such, we see that organizations and businesses developing technology with privacy by design and default at the core of how they develop are succeeding on a number of fronts. They're succeeding because they're showing the market that they understand this is an important topic for consumers. And they're succeeding because they're saving themselves time and money further down the line when subject access requests - DSARs - do come up. If you develop a product that enables people to manage their own data - to delete, to modify, to restrict, to update their own data - then when, later down the line, somebody asks, can you please delete my data? Can you please update my data? Can I opt out of this type of processing? - your product enables people to do this themselves. You're saving all that time. You don't have to respond to those requests. People have control of their own data.
Annick O'Brien: This is the exact purpose of privacy by design and default. And it's also a great way to show that you understand what matters to consumers and that you're giving them the power. This will become more important as we see more laws develop around the idea of data localization. Consumers might want their data to stay in the jurisdiction where they live. They might not want their data to be transferred abroad. And in that case, having a product designed so that an individual can decide whether or not they want their data to stay - maybe for a different price: it's one price if your data gets transferred, another price if your data is stored locally - enabling people to have that choice is certainly going to be a competitive advantage for businesses.
Annick O'Brien: And businesses that show that they are sensitive to and aware of the privacy concerns of consumers are organizations that are going to come out ahead. And indeed, it's not just in the area of privacy by design when it comes to Big Tech or companies that are developing technological products. It's also services - airlines, restaurants, organizations that serve consumers - that have privacy pages on their websites, or areas where, when you log in to your own personal account, you can control your data. There are many, many different ways of showing consumers that you take privacy seriously.
Annick O'Brien: And one thing we have seen, which is very interesting, is that organizations that get out ahead of data breaches - organizations that say, we take privacy seriously, we are interested in this, we make this privacy pledge, we are transparent, this is what we do - when those organizations have an incident - and I say when and not if, because for every organization a data breach is a case of when, not if - getting out ahead of it and saying, this is important to us, this thing went wrong, and this is how we're going to fix it, actually works in their favor, much more so than for organizations that try to ignore the issue. It works in their favor when it comes to consumer trust, but also with regulators and the FTC, who always want organizations to be as transparent and accountable as they can possibly be. So these are two very important reasons why privacy by design and default is increasingly a competitive advantage for businesses.
Dave Bittner: Ben, what do you think?
Ben Yelin: Really interesting interview. I mean, I think a theme that she gets to that we've talked about is having these minimization procedures in effect isn't an altruistic act. It's trying to get an organization a competitive advantage. And I think that's the only way that we're going to see widespread adoption of this type of data minimization. So I thought it was really interesting in that respect.
Dave Bittner: Yeah. Again, our thanks to Annick O'Brien for joining us. She is a global data privacy lawyer at CybSafe, and we appreciate her taking the time.
Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.