Caveat 11.17.22
Ep 150 | 11.17.22

Ensuring privacy is baked in every step of the way.

Transcript

Chris Handman: The companies who have begun to adopt these programs, who have begun that shift-left movement, have already been able to develop in a more robust way, are far more prepared for laws that come down the pike and really sit in a much greater position to take advantage of legal change.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben shares the story of the FBI potentially deploying spyware. I've got the story of Google's nearly $400 million settlement over location data privacy. And later in the show, Chris Handman from TerraTrue on his work transforming legal teams into advocates and collaborators to ensure privacy is baked in every step of the way.

Dave Bittner: While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump into our stories this week. Why don't you start things off for us here? 

Ben Yelin: So mine comes from The New York Times in their technology section by Mark Mazzetti and Ronen Bergman. And this is about a little dispute that's developing between the leaders of the FBI and members of Congress about spyware, specifically Pegasus from the NSO Group, which we've talked about a number of times on the show. So Christopher Wray, the director of the FBI, was asked at a hearing several months ago whether the FBI has used Pegasus, deployed Pegasus to spy on either U.S. persons or overseas targets. And Director Wray admitted that the FBI was Pegasus curious, if you will, but only for research and development. 

Dave Bittner: (Laughter) Who among us has not been, Ben, really? 

Ben Yelin: Exactly. So they got a license, but it was just to do, quote, "research and development." 

Dave Bittner: OK. 

Ben Yelin: He said at the hearing, to be able to figure out how bad guys use it, for example. There was a request put in under the Freedom of Information Act to get more details on how the FBI either used or did not use Pegasus. And according to documents obtained by The New York Times, who put in the Freedom of Information Act request, the FBI was quite interested in not only purchasing a license for Pegasus but in deploying it. They wanted to use it in some of their own federal criminal investigations. They went so far as to prepare briefs on how the spyware could potentially be used, and they clued in prosecutors on how these types of hacking tools would need to be disclosed during criminal proceedings. If Pegasus was used to obtain evidence against a potential criminal, would they have to be notified, and under what statutory authority? So we have a problem here...

Dave Bittner: (Laughter) OK. 

Ben Yelin: ...Because members of Congress, led by Senator Ron Wyden - who is often cited on this show because of his concern for digital privacy - are kind of implying that Christopher Wray was lying in his congressional testimony. So, obviously, you are not allowed to lie in congressional testimony. You are under oath.

Dave Bittner: Right. 

Ben Yelin: So the implication is that Christopher Wray committed perjury. I don't think you have a good perjury case here. Officials know how to be cagey enough to avoid a perjury charge. 

Dave Bittner: Right. 

Ben Yelin: There's no way you could prove that Director Wray purposely misled the committee. And it's possible that included in the definition of research and development is how can we use this for our own purposes to catch criminals? 

Dave Bittner: OK. 

Ben Yelin: The end of the story is that the FBI ultimately decided not to deploy the tool in support of criminal investigations. So there is some information here that's unknown to us. First and foremost, why they purchased the license. But then, what happened between the time they purchased the license and the end of 2021, when they made a decision not to use Pegasus? Was there a moral consideration? Or more likely, were they worried about bad publicity or potential legal liability that might hurt them in criminal investigations? 

Dave Bittner: Because by that point, Pegasus had kind of become too hot to handle around the world. 

Ben Yelin: Right. We had talked about stories on this podcast about how different countries had deployed Pegasus to spy on dissidents, protesters. It's a zero-click method of getting into people's personal devices, so it's quite intrusive. 

Dave Bittner: Right. 

Ben Yelin: So obviously, people in the privacy community take it very seriously. NSO is persona non grata among our friends at places like the ACLU and the Electronic Frontier Foundation. 

Dave Bittner: Yeah. 

Ben Yelin: I think we're going to see a battle between Congress and the FBI over the coming weeks and months about what happened here. I think there are further questions for Director Wray about why he potentially misled the committee on this, and more importantly, what was the FBI seeking to do with Pegasus? How were they seeking to use it? And why, ultimately, did they decide not to deploy it? And I think those questions are very much unanswered at this point. 

Dave Bittner: So in your opinion, what is your take on the FBI's ability to use a tool like Pegasus legally? 

Ben Yelin: I don't understand why they would not be able to use it. It is spyware. But the FBI has all different types of surveillance tools that can get into people's devices, especially if you've got a warrant. Now, I don't know if they were considering warrantless use of Pegasus. They would probably have to get some type of approval from a judge. But just because it's spyware, it doesn't mean, per se, it's illegal to use it on U.S. persons. If you can get judicial approval, yes, it's a very intrusive form of spying, but is it any more intrusive than other types of surveillance we talk about all the time?

Dave Bittner: Like a wiretap, yeah. 

Ben Yelin: A wiretap... 

Dave Bittner: Sure. 

Ben Yelin: ...Or a geofence warrant for a limited geographical area or a license plate reader. I think the Supreme Court would look at this with very watchful eyes because we're talking about getting on people's personal devices. And we know from Riley v. California that people have a robust expectation of privacy in the contents of their smartphones... 

Dave Bittner: Right. 

Ben Yelin: ...For example. So I think they would take that seriously. That does not mean that in no circumstances would the FBI be able to legally deploy this tool. I do think if the FBI had chosen to use it, we would have had a type of case where they used Pegasus to get onto a device, found illegal activity, charged the person, disclosed to the person, hey, we got evidence by the use of this spyware tool, and then the person would file a motion to suppress that evidence. And we could have gotten a really interesting Fourth Amendment hearing on the relevant issues here. Unfortunately, for us legal analysts, that's...

Dave Bittner: (Laughter). 

Ben Yelin: ...Not going to happen because the FBI ultimately decided not to use this tool. So I don't see why the use of spyware would necessarily be illegal, especially if it were done pursuant to a warrant. I think this is something that we can't just wave away and say, well, they decided not to use it here, so there's a precedent against using spyware. I don't think that's the case.

Dave Bittner: So then what is Congress' interest here? If there's no illegality, is this just a general policy question? 

Ben Yelin: It's a policy question that Congress itself could resolve. I mean, they could do any number of things. I don't think they would outright ban the FBI from using this type of spyware, although they could. 

Dave Bittner: Right. 

Ben Yelin: They do have jurisdiction over a federal agency. I think what would be more likely is that they could limit the circumstances in which spyware was used, limit the categories of crime. So you can use Pegasus or similar types of spyware but only for terrorism, you know, racketeering, any type of violent conspiracy - those types of crimes. 

Dave Bittner: And you have to get some kind of judicial oversight... 

Ben Yelin: Exactly. 

Dave Bittner: ...That sort of thing, yeah. 

Ben Yelin: So you - it would have to go through an Article III court, as opposed to, say, let's use Pegasus to nab somebody for tax evasion or financial fraud. 

Dave Bittner: I see. Yeah, yeah. 

Ben Yelin: I think that's where Congress could play a role and say, if you are going to use this type of technology, you can only use it when absolutely necessary to stop criminal acts of violence. I think we could see something like that as part of a broader data privacy measure. But I don't think we're at the point where they would specifically outlaw this type of surveillance. They are generally loath to do something like that, since they don't want to hamstring federal investigators.

Dave Bittner: Right, right. And what about - for Director Wray himself? I mean, is he in for a stern talking to or a slap on the wrist? What do we expect he'll... 

Ben Yelin: He'll be yelled at... 

Dave Bittner: Yeah (laughter). 

Ben Yelin: ...At a future congressional proceeding. 

Dave Bittner: OK. 

Ben Yelin: But he's used to it. 

Dave Bittner: Yeah. It's part of the job. 

Ben Yelin: He's been - yeah, he's been yelled at by people all across the political spectrum. 

Dave Bittner: Right. 

Ben Yelin: And he's not going to be charged with perjury. 

Dave Bittner: Yeah. 

Ben Yelin: They're just going to accuse him of misleading the committee. He'll say what he's going to say - probably what he's said in quotes to The New York Times here: I wasn't trying to mislead you. We were using this for research purposes. We never actually deployed it. But yeah, I mean, if you want a WWE-style congressional hearing...

Dave Bittner: (Laughter). 

Ben Yelin: ...You might get one in the next Congress. So... 

Dave Bittner: OK, right. 

Ben Yelin: ...We'll look out for that can't-miss C-SPAN TV. 

Dave Bittner: Right. Those of you who are on pins and needles waiting for these sorts of conflicts, start popping your popcorn now (laughter). 

Ben Yelin: Exactly, exactly. 

Dave Bittner: All right, interesting stuff. We'll have a link to that story in the show notes. My story today - I think it's fair to say this is this week's big one. 

Ben Yelin: Yep. 

Dave Bittner: (Laughter) So... 

Ben Yelin: Sometimes Dave and I pick the same story... 

Dave Bittner: Right. 

Ben Yelin: ...And one of us has to do something else. 

Dave Bittner: Yeah. 

Ben Yelin: That's what happened here. We both picked this one. 

Dave Bittner: That's right. And I pulled rank, and I said, Ben, go find yourself another story (laughter). 

Ben Yelin: Seniority is tough, you know? 

Dave Bittner: I know, right? So I chose the coverage from The Record - our friends over at Recorded Future. This is written by Jonathan Greig, and many people are covering this today. The headline is, Google has to pay nearly $400 million over deceptive location tracking practices. Google has agreed to settle with 40 states over revelations that it tracked users' locations even when it was explicitly told not to do so. What's going on here, Ben?

Ben Yelin: So 40 state attorneys general - I always love the opportunity to get that one right. 

Dave Bittner: Yeah. 

Ben Yelin: A lesser man would have said attorney generals, but... 

Dave Bittner: Yes, like me. Yes. 

Ben Yelin: Yeah. 

Dave Bittner: I would have said that, sure. I don't have one of those highfalutin law degrees like you do, Ben (laughter). 

Ben Yelin: Really, the only thing they teach you in law school is how to say attorneys general.

Dave Bittner: Right. 

Ben Yelin: But there are 40 of them, which is quite a coalition, that filed a lawsuit against Google, accusing them of misleading location tracking practices. Basically, people were under the impression, whether they read the EULA or not, that when you did things like turning off Wi-Fi or deleting applications or logging out of something like Google Maps, that Google was no longer collecting your location. And it turns out that they were much of the time. And this is a deceptive trade practice... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which is why state attorneys general got involved. They sued, I think, seeking to achieve a settlement and also to get Google to change its behavior. The settlement is quite large. It is $391.5 million, as you pointed out. Those funds are going to go into the state coffers of 40 separate states. I don't think that's going to make a huge difference in anybody's, you know, long-term budget outlook.

Dave Bittner: Yeah. Yeah. 

Ben Yelin: But it's a way to hold Google accountable for their misleading actions. And then I think, more importantly, Google is going to be forced to correct some of the practices that are mentioned in the settlement. 

Dave Bittner: Yeah. 

Ben Yelin: They've said that they've already done that. One of their spokespeople, Jose Castaneda, said that consistent with improvements we've made in recent years, we have settled this investigation, which was based on outdated product policies that we changed years ago. 

Dave Bittner: Right. Old news, Ben, old news. Nothing to see here (laughter). 

Ben Yelin: Exactly. And I take them at their word that they have changed those policies. They don't want to subject themselves to further legal liability. 

Dave Bittner: Yeah. 

Ben Yelin: I think this is the largest settlement of its kind on a data privacy issue against Big Tech. 

Dave Bittner: Right. 

Ben Yelin: So that's one of the reasons why it's very significant. Another is that there's this vacuum - we'll talk about this in the context of the interview that you do - at the federal level in protecting data privacy. And so it's the states who have had to take a prominent role. 

Dave Bittner: Yeah. 

Ben Yelin: And I think it's really interesting that 40 states with ideologically diverse governors and attorneys general were able to come together and put together this lawsuit and get this settlement. So it's very significant. 

Dave Bittner: Yeah. This article points out that just in the third quarter alone, Google brought in over $54 billion in advertising revenue. So when you look at a quarterly income statement of $54 billion and then a $400 million settlement, on the one hand, as you mentioned, folks are pointing out that this is a very large settlement, the largest ever. Is this large enough to get Google's attention? (Laughter) You know what I'm saying? Like, does it - obviously, it's not going to put them out of business. You know, they're not... 

Ben Yelin: No, it is not. 

Dave Bittner: They're not going to cut off, you know, doughnuts in the break room or something over this. But it's - still, it ain't nothing. 

Ben Yelin: It's not nothing. In the grand scheme of things for Google, as you say, it's not going to really affect their bottom line very much. 

Dave Bittner: Yeah. 

Ben Yelin: We're talking about a fraction of a percent, maybe 0.5% of their revenue.

Dave Bittner: Right. 

Ben Yelin: And that's just their quarterly ad revenue. So, yeah, it's going to be a drop in the bucket.

Dave Bittner: Yeah. 

Ben Yelin: I think this is more about publicity. We've seen other Big Tech companies try to get to the forefront of data privacy issues and hold themselves out as the protectors of your data, of your information. Certainly, Apple has taken the lead among the big four tech companies in doing that. 

Dave Bittner: Right. 

Ben Yelin: I think for Google, this is a way for these attorneys general to say, we got a metaphorical scalp here and have gotten them to admit wrongdoing and to change their practices and for Google to say, mea culpa. This is - we're going to pay this $400 million. We're going to pay out the largest settlement of this type on data privacy issues. And we're going to change our practices. So it's an opportunity for them, in the court of public opinion, to kind of regain some trust by taking this issue seriously and not fighting it tooth and nail through the court system, you know, for the next 10 years, as they probably could have tried to do. 

Dave Bittner: Yeah. This article also points out that Google settled with Arizona back in October for $85 million and that Arizona, Indiana, Texas, as well as Washington, D.C., have brought their own individual lawsuits against Google as well. So this isn't necessarily the end of this for Google. 

Ben Yelin: Certainly not. If I were in the legal - office of the legal counsel at Google, I would certainly be looking for an advance check on my next payment... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...Because there is a lot of potential legal liability here. And they are a ripe target for a number of reasons. One, Big Tech companies are politically helpful targets. They're not winning any popularity contests. Two, you can sue them for a lot of money because they have a lot of money. 

Dave Bittner: Right. 

Ben Yelin: And you can prove damages by showing that these are either deceptive trade practices, or they are violating people's state constitutional rights or federal constitutional rights. So you can see why states would be so eager to initiate these lawsuits. Yeah, so even though this is a large settlement, this is not the end of Google's legal liability. If it's not this location tracking issue, it's going to be something else. When you have such a large market share and you do so many things that could potentially violate antitrust laws, labor practices, your own privacy policies, state privacy legislation, you're opening yourself up to a lot of potential legal liability.

Dave Bittner: What about the other Big Tech companies? If I'm Facebook, if I'm Apple, Microsoft, you know, any - the usual suspects, does this have their attention as well? 

Ben Yelin: Oh, yeah. I mean, they're coming after you. I think Google's case was particularly egregious because of the nature of the EULAs themselves and how nonspecific they were about the type of location tracking that took place and because some of the applications that were the main drivers of this improper location tracking were Google applications, like Google Maps, for example. But, yes, these other Big Tech companies are certainly not out of the woods. It shows that courts are taking violations of data privacy seriously. States now have these additional tools - California has the CCPA, and Virginia has its own law - which they can use as causes of action against Big Tech companies. So, yeah, I mean, I would not be sitting comfortably on my couch if I'm one of the other companies and saying, ha-ha, Google...

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...Sucks for you. Yeah, they're not coming for us. 

Dave Bittner: Right, right. 

Ben Yelin: You know, they're coming for you. 

Dave Bittner: Yeah. 

Ben Yelin: You're next. 

Dave Bittner: Yeah, interesting. All right. Well, we will have a link to that story in the show notes. 

Dave Bittner: And we would love to hear from you. If there's something you'd like us to consider or to talk about on the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Chris Handman. He is from a company called TerraTrue. And our conversation centers on his work transforming legal teams into advocates and collaborators to make sure that privacy is baked in every step of the way. Here's my conversation with Chris Handman. 

Chris Handman: With the privacy landscape, when you think about where we are today - at least here in the United States - we still are largely governed by a kind of free-for-all. There is - as of today, at least - no federal privacy legislation to speak of. There are a handful of state laws that have recently come down the pike, starting first in California and sort of extending eastward into Colorado and Virginia and a few others - about a half-dozen states at this point. And all of those states were taking their cues not from Congress but from the EU, whose famous GDPR became effective in 2018. And what we are really dealing with today is still this privacy revolution that remains in its infancy. Laws still are forming. Regulations are inchoate. And companies, as well as users, are still struggling to navigate this landscape that - here in the United States, at least - again remains this patchwork quilt of kind of inconsistent and not terribly robust protections around data. And what that leaves us with is both industry and individuals not quite certain where the future of privacy is going to be.

Chris Handman: I think where that future is going to be is increasingly up and to the right in terms of regulation - and for good reasons - and increasingly prescriptive around the types of data that companies are allowed to collect and process and share, and limitations on how they can do those things. So at a high level, it remains a bit of a free-for-all, which is why we're now governed largely by this world of sort of consent and notice through privacy policies. And increasingly, that regime will likely give way to one where you are governed by true regulations that prescribe how that data can be handled. But today, we're still sitting on the cusp, waiting for Congress, perhaps, to jump into that fray and adopt those national standards.

Dave Bittner: Is it reasonable to say that on a consumer level, folks tend to be a little cynical about this? I mean, I know, you know, personally, every time I find myself having to click through some kind of EULA, I kind of roll my eyes and wonder what it is I'm giving up this time. 

Chris Handman: I think that's right. And I think that is a symptom of the legal rules that we've had in place, if you even want to call them that. For the most part, what privacy in the United States has reflected is really this notion of you cannot commit an unfair trade practice, right? Kind of the only traffic cop at the federal level when it comes to privacy is the Federal Trade Commission. But they don't have a mandate to regulate privacy, per se. What they do have is a mandate to regulate unfair trade practices. And it's under that rubric that the FTC has come after all the kind of, like, you know, famous names out there, from Facebook to Twitter onward, where they have said, look, you said in your privacy policy you were doing these three things, and it turns out you're doing a lot more than that, and that was unfair to the consumer. So, as you said, what happens is companies get increasingly prolix with their privacy policies. They run for what feels like chapters. "War and Peace" looks like a dainty read compared to a lot of these privacy policies.

Dave Bittner: Right. 

Chris Handman: And, of course, nobody bothers to read them. And so it ends up just being this sort of cat-and-mouse game between the regulatory authorities who peruse them and the lawyers in house who are trying to figure out, OK, how can we make sure that we are protecting ourselves? - obviously trying to be as transparent as possible. But at the same time, until industry or regulations change so that it's no longer this kind of consent-based system - as you said, it's a bit of a fiction, right? The idea that consumers are carefully scrutinizing every last jot and tittle of a privacy policy before clicking, yeah, I agree - of course, no one believes that. And so I think that is where regulators have said, look, we need to come up with a better system than that.

Dave Bittner: What about the organizations who are creating these products? I mean, do - let's say an organization is trying in good faith to do the right thing here. In this free-for-all, as you describe it, I mean, do they find themselves as a - at a competitive disadvantage? 

Chris Handman: Yeah. And so I want to be very clear. I think the vast majority of companies out there are trying to do the right thing. And it is unfair to the industry, as well as the citizens of the United States, that we live with these policies. I think everyone is clamoring for more clarity and really more certainty around what is privacy, and how should we think about developing and foreseeing and kind of fostering this culture of privacy. So, yeah, I think for those companies, the absence of those regulations, the absence of good guidance, does create a bit of a frustration. And it can create, you know, competitive concerns because when you are dealing with uncertainty - right? - that makes you either risk averse, or it forces you to court uncertain obligations, or you may end up forestalling certain products out of those sorts of concerns.

Chris Handman: And this is just a general sense of, like, how will this play out in the marketplace? So I think that, overall, you have a lot of companies and a lot of consumers who probably equally share this. And I think that's why, in the last few months, you've seen a surprisingly broad-based, bipartisan move across both houses, both parties, to come to some agreement on a fairly robust federal privacy law. And, of course, everything remains to be seen there, as in any Congress, but with any luck, we can start to bring some of that clarity.

Dave Bittner: Once we get some of that clarity, how do you envision a process where this becomes just, you know, part of the routine? How do organizations go about baking in these sort of things from the get-go? 

Chris Handman: And that's really what it gets down to. Privacy, when done properly, comes from companies wanting to do the right thing, you know, and understanding the processes, the cultures, the mechanisms and tooling to be able to get privacy right. And the only way you can really think about privacy in this day and age - being able to keep pace with a fast-moving, iterative lifecycle of software development - is, you know, this phrase, to shift left, right? We know about the concept of shifting left in the security space - moving regulation and testing and all sorts of scrutiny further into the ideation and development cycle, as opposed to this kind of reactive, take-a-look-after-products-go-out-the-door approach.

Chris Handman: And I think privacy has historically occupied this almost rightward tilt on that continuum. It was a very reactive, very siloed type of discipline in the past. And I think what companies have increasingly come to embrace is this notion of shifting privacy left. Some have called it, like, privacy by design. But I think that sometimes has this, like, almost academic tone to it. And I think what privacy needs to do - and what a lot of companies are starting to recognize - is move privacy from this siloed, compliance-heavy idea into sort of a forward-thinking, how can we enhance the products from the get-go? How can privacy be a component of the way we enhance and develop our products? And that shift in thinking has already, I think you see, helped companies across the board develop richer, better, privacy-protective products.

Chris Handman: And in fact, you kind of see it now manifest in really unique cultural ways. You know, look at Apple, for example, when they're advertising iPhones, right? They are running national campaigns built around, really, one value prop, right? This iPhone will protect your privacy. And that is a unique change in, I think, the zeitgeist, the way we think about privacy, the way companies develop products. And so as companies look to enhance that privacy posture, to have more agility as new laws come down and as they have to adapt to new regulatory rules, having privacy built in - this proactive, shift-left mentality - is going to be a really important way of guiding those future developments.

Dave Bittner: What are your thoughts in terms of companies preparing for, you know, whatever regulatory regime might be coming? Are there ways that they can have their legal team, their software developers, you know, at the ready to be prepared for this stuff? 

Chris Handman: Yes, and that really gets back to that shift-left idea. If you have privacy built into that framework of your software development, if privacy has a seat at that table early and often, right from the moment of ideation, when the germs of great ideas start to blossom and long before those products get coded and get shipped out the door, it is a lot easier for the people who build products and the people who review those products to get together, to collaborate, to understand, to speak the same common language. Now, there are a lot of tools and a lot of technologies that can help promote that type of collaboration. But first, breaking down those sorts of cultural and technological barriers to that shift left and that collaboration is a key part of it.

Chris Handman: The other part of that, though, is then developing just first principles around how we think about privacy. The reality with privacy at the end of the day is that despite all the overlapping laws and despite this kind of proliferation of regulation across the globe, there are many, many first principles that are largely the same. And it comes down to concepts like data minimization, thinking through consent, understanding sensitive types of data, being clear with consumers about how their data is being used, the risk profiles that certain uses or combinations of data can carry. Those are a lot of principles that will apply across the board. And baking that into the way companies think, from the get-go, is a way of really ensuring a sort of agility so that no matter where the laws go, you already have these first principles in place, which makes those changes grace notes - really easy - as opposed to having to lurch from regulation to regulation and constantly trying to guess at, OK, what do we have to do now?

Dave Bittner: You know, you're using the term collaboration, which I like, but I can imagine that there are lots of organizations out there who, from the developers' point of view, they look at the legal team as almost being adversarial. You know, they're the ones - the department of no, throwing up, you know, roadblocks and speed bumps. How do you execute that culture shift to make it a true collaborative effort? 

Chris Handman: It's a great point. And I think one of the fears that most modern legal teams have is that they're going to be viewed as the place where, you know, good ideas go to die. And it is precisely that concern that, I think, is one of the biggest impediments to developing the types of privacy programs that are effective and dynamic and sort of well-suited for today's environment. And I think it begins with trust. A legal team, a privacy team that goes into a product team or an engineering team and starts reciting chapter and verse about Article 39 of the GDPR or, you know, some obscure subsection of the CPRA is very unlikely to garner that type of trust. You need to speak about privacy in terms of product and the way privacy can enhance the product, the goodwill, the types of proactive approaches to the way we want to think about our consumers that I think product people tend to pride themselves on. And it is a matter, then, of meeting them where they work, right? That is both a virtual and a sort of physical manifestation.

Chris Handman: It's trying to work in the same tools. It's trying to go to those standups, to be involved in those specs or Confluence docs or wherever they happen to be iterating on these concepts and then gradually creating that culture that says, hey, my role here isn't to veto. It's not to flyspeck what you're doing; it's to really help you understand perhaps unintended or unseen consequences of using a type of data. There's a lot of uncertainty around, like, even what data we are using. It's remarkable. When you start talking to some product folks, they may not even appreciate all the types of data that are being collected or may not appreciate that this is data that can actually be repurposed to specifically target individuals.

Chris Handman: And so there's an educational process. And as you begin to talk in those pragmatic terms, I think those teams come to appreciate the value that legal and privacy teams can impart to the way they build their products. But that's really it - the emphasis is on building products as opposed to checking them off in, like, a regulatory box-checking exercise. And so it's a matter of tone. It's a matter of culture. It's a matter of emphasis. But I think when you combine those, privacy teams have a unique ability to become players in that development process. And if you can't do that, then that whole concept of shifting left, or privacy by design or whatever rubric you want to put this under, becomes completely illusory. And you really do then default to the old world of privacy as just this sort of compliance checkbox.

Dave Bittner: And I suppose in that case, the companies who do it well will have the competitive advantage. 

Chris Handman: A hundred percent. And you see this already. And I think companies that are already well known for having dynamic, modern, kind of progressive notions of privacy - those that have invested in privacy managers and CPOs and others who are really building out robust privacy programs - already differentiate themselves in the market. You see it in their products, the way they speak about privacy - whether it's in their blogs or whether they are just talking in modals and pop-ups and emails to their consumers. There's just a certain sense of sophistication. And I think it starts to develop that trust. And again, getting back to it - the Apple example is just one of the most graphic ones. But you begin to see the way companies understand the importance of talking about privacy, of giving consumers some confidence that, hey, we are thinking about this.

Chris Handman: And you can't just do that as a mere marketing sham, right? I think people can see through this. And they can see it in the apps. They can see how data is being used. And that is why the companies who have begun to adopt these programs, who have begun that shift-left movement, have already been able to develop faster, develop in a more robust way, are far more prepared for laws that come down the pike and really sit in a much greater position to take advantage of legal change. Because, again - not to be cynical about any type of legal change - but legal changes can have competitive effects. They can have even antitrust effects, right? And the more that companies have the flexibility and the infrastructure to adapt to these changes, that always redounds to shipping faster, shipping better and, ultimately, being more profitable.

Dave Bittner: Ben, what do you think? 

Ben Yelin: I mean, I think it fits well with our discussion today on the vacuum at the federal level when it comes to data privacy legislation. It's forcing companies to take the lead themselves - both to become industry leaders in data privacy and to head off potential legal challenges. I think life would be a lot easier for these companies if there were one federal standard. And the reason there's this impetus at companies big and small to update their privacy practices is because of that vacuum at the federal level. So I thought it was an interesting conversation in that respect.

Dave Bittner: Yeah, it's fascinating to me how much it seems as though organizations now - we're at the point where they've realized that you can't just bolt on privacy after the fact - or that, if you do, chances are you're going to end up with a suboptimal outcome, right? Your privacy folks have to work with your security folks from the get-go. That's what's going to be most cost effective. It's...

Ben Yelin: Yup. 

Dave Bittner: ...Going to put you in the best - you know, that's your best case scenario. 

Ben Yelin: Right, it has to be part of your general organizational planning because you otherwise might expose yourself to liability or bad publicity. 

Dave Bittner: Right. 

Ben Yelin: I mean, I talk about that in my other work in the context of emergency management. You don't want to be in the news because you're the company that did - or the government that didn't properly prepare for a hurricane or a cyberattack or a power outage. 

Dave Bittner: Right. 

Ben Yelin: And I think the same exists here. We've seen organizations in both the public and private sectors suffer substantial reputational harm because they haven't done a good job protecting their data.

Dave Bittner: Yeah. 

Ben Yelin: So I think, yeah, it should be part of the onboarding of new employees. It should be, you know, just as regular as an HR training. Privacy should just be part of an organization's culture.

Dave Bittner: Yeah. All right. Well, again, our thanks to Chris Handman from TerraTrue for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.