Caveat 7.15.20
Ep 37 | 7.15.20

Law takes time.

Transcript

Hilary Wandall: Technology just continues to evolve. And as technology evolves, it introduces new privacy challenges. And it's difficult for the privacy laws to keep up with that.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hey there, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, I've got the story of police using smart streetlights to investigate protesters. Ben has the story of major tech companies responding to China's new Hong Kong security law. And later in the show, my conversation with Hilary Wandall. She's a senior vice president of privacy intelligence at TrustArc. We're going to be discussing health care privacy issues. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right. Ben, we've got some interesting stories to share this week. Why don't you start things off for us? 

Ben Yelin: Sure. So my story comes from Vox's Recode blog. And this is about many tech companies, but particularly TikTok, pulling out of Hong Kong because of a new national security law - a really repressive national security law - that went into effect on the 1st of July. So this law criminalizes - and I'm quoting here - "secession, subversion, organization and perpetration of terrorist activities in collusion with a foreign country or external elements to endanger national security." But what that really means is it's suppressing protests. It's suppressing free speech. It's really a way to crack down on the democracy movement out of Hong Kong. 

Ben Yelin: And that's put some of these tech companies on notice. Many of them are generally cooperative with the Chinese government in Beijing. Some of them will say that they don't respond to all requests for data. They protect their user data. And that's true in certain circumstances. But other times, companies - particularly Apple - have complied when the local government in Hong Kong or the Chinese government has requested access to information. But I think because of what's turning into a major political backlash against this law, many of these companies are taking a stand, including TikTok, which I hear the kids use these days. 

Dave Bittner: (Laughter) Yeah. You know what? I checked with my son, who's 13. And, yes, indeed, he is a regular user of TikTok. 

Ben Yelin: Yeah. It's like once the old people started using Facebook, then the millennials, you know, started using Snapchat. And then once the millennials started using Snapchat, Gen Z started using TikTok. So that's the way this all works. 

Dave Bittner: (Laughter) Yeah, yeah. It's the circle of life. 

Ben Yelin: In the circle of life. But yeah. So TikTok is saying it's not going to operate in Hong Kong, and it's probably the boldest move we've seen so far from any of the tech companies. They are certainly going to be sacrificing some of their bottom line - there's still a large base of potential customers there. But this is sort of a moral stand that TikTok and some of the other tech companies are taking, you know, in order to protect their brands as companies that are interested in privacy, security and democracy. 

Dave Bittner: So what are some of the larger implications here? I mean, I see in this article that - they mention Facebook, for example. They say they're pausing their requests for user data from Hong Kong authorities. What does that really mean on the ground? 

Ben Yelin: The way this law works is the Chinese government is requesting data, all different types of data - whether it's location data or messages written on encrypted applications - as part of this repressive national security law. And so the practical impact is it's going to be a lot harder to enforce this law without the cooperation of these technology companies. 

Ben Yelin: Now, the technological realm is not the only way this law is being enforced. And the surveillance state in China is so robust that just because TikTok is pulling out of the market in Hong Kong doesn't mean that protesters are in any way protected in terms of their identity being revealed. But this is a small - I would say more than symbolic - step. It's a step that shows some solidarity on the part of these tech companies saying, even if we are sacrificing some of our market share, we have values. These are basic values. We want to cooperate with the government - with any government - to the extent that we can. 

Ben Yelin: But what this law does is take things a step too far. And as we saw a couple of weeks ago, when we talked about how a succession of companies announced they were pausing research on facial recognition software - you kind of saw one company after another make these announcements. I think we're starting to see that here. There's sort of a social pressure that started with TikTok and has extended to Facebook, Google, Microsoft, Twitter and Zoom saying that they're at least going to pause responding to these government requests. 

Ben Yelin: And, you know, this is a new frontier for American tech companies because, you know, as I've said before, they've generally been pretty cooperative with China. And, you know, this is a way that they are trying to contest this sort of iron grip that China has in the technological realm over its citizens. 

Dave Bittner: There's a complication as well - like, for example, you think about Apple and how much of their manufacturing takes place in China. So in a way, it seems to me like China has them over a barrel. It's like, great social policies you have here; be a shame if you couldn't make any more iPhones. 

Ben Yelin: Yeah. It's interesting. So they note in this article that Apple is more cautious than some of the other companies (laughter) we just mentioned, probably for the reason you're describing. Their quote was that they're assessing the new law - it only went into effect a week ago, so it's not as if they've had much time to study it. But they said that they haven't received any content requests. You know, they also said that, normally, in a given week, they probably would not receive requests directly from authorities in Hong Kong. 

Ben Yelin: But you're right. When so much of your supply chain is based in China, you are going to have to be particularly careful with actions that you take to confront the Chinese government and Hong Kong local authorities. I think that's certainly a reason why Apple is a little bit more reticent than these other companies. A lot of the hardware is made in China; some of the software is. I think there's a difference there. I think so much of what TikTok does is probably developed in Silicon Valley and not out of factories in China, and perhaps that makes a difference here. 

Dave Bittner: Well, it's interesting. I mean, it's sort of a window into this intertwined global economy - right? - that it's so much more connected in so many different ways than perhaps we think about. 

Ben Yelin: Yeah, absolutely. And I just think it shows that activism in some circumstances can really work. You know, I think the tech companies understand that there is a public backlash not just in Hong Kong but in the United States about protecting freedom of expression. And this is something in our country that's bipartisan. There's just been a lot of outrage at what's been going on over there. And I think tech companies are sensing that pressure, sensing pressure from their users and their advertisers who want them to express solidarity with the ideas of freedom of expression. And I think that's sort of what's underlying what we're seeing here. I think it's actually a very promising development that these companies are taking action, maybe against their own financial self-interest, to really express these democratic values - small-D democratic values, right? 

Dave Bittner: Yeah. Right, right. Interesting. 

Dave Bittner: Well, my story this week comes from the Voice of San Diego website, written by Jesse Marx. And it's titled "Police Used Smart Streetlight Footage to Investigate Protesters." This story caught my eye because, evidently, the city of San Diego installed a bunch of streetlights - what are popularly called smart streetlights - and, you know, there's been this push over the past few years to implement smart city technology, which allows the folks who run cities to do so in a more efficient way. And these smart streetlights were sold to the folks who run the city - the people in government - with the pitch that these streetlights will allow you to monitor traffic. They'll allow you to monitor weather - all sorts of data that these streetlights can gather that'll be helpful for running your city more efficiently and for the public good. 

Ben Yelin: Sure, yeah. 

Dave Bittner: (Laughter) Oh, Ben. Ben. Are you skeptical, Ben? 

Ben Yelin: I am. My cynicism has kicked in early. Usually, it takes us talking about a story for five minutes before I get cynical, but... 

Dave Bittner: (Laughter) Well, you're one step ahead of me here. So records that were obtained by the Voice of San Diego show that the San Diego Police Department made several requests and was able to look at the footage from these smart streetlights - they have video cameras on them, and they can record audio as well - to look at some incidents of vandalism and looting and destruction of property that went along with some of the protests that we've seen recently in many cities; San Diego had them as well. 

Dave Bittner: And it's really sparked a discussion among the leaders in San Diego as to, first of all, was this what they intended for this footage to be used for? And to their credit, it looks like they may be putting some restrictions on what this stuff can be used for. So it's sort of an interesting development here. 

Ben Yelin: Yeah, so the mayor of San Diego - it says this at the end of the article - has offered to shrink the network of cameras - or dramatically shrink it, they said - and pay for it through parking revenue. There's actually supposed to be a debate on this in the coming weeks. So there is sort of a pushback against these smart streetlight cameras. I was struck by a couple of things here. The justification for putting them up in the first place reminded me of - remember the Bridgegate scandal a few years ago from New Jersey... 

Dave Bittner: Hmm (laughter). 

Ben Yelin: ...When the pretense was that there was going to be a traffic study? It seems to me that - and maybe I'm being too cynical here - but there's a pretense about, you know, studying traffic patterns in San Diego, when really what law enforcement is looking for is recorded footage of crimes or other undesirable activity being undertaken in the streets of San Diego. And similar to the stories we have discussed over the previous weeks related to these protests, I'm sort of at a heightened level of alert for all these types of surveillance because we're talking about First Amendment-protected activity. 

Ben Yelin: Now, some of the things that they're prosecuting here are not protected under the First Amendment. So they caught somebody shining a laser pointer at a police helicopter. You can't do that. 

Dave Bittner: (Laughter) Nope. 

Ben Yelin: That's not protected by the First Amendment. That's very dangerous, and obviously we want to see those people prosecuted. But for every one of those types of individuals that they prosecute, there are going to be others who are caught engaging in constitutionally protected speech through this ubiquitous network of smart streetlight cameras. And I think that's very concerning, especially since before all of this came to light, it seems that the San Diego Police Department wanted real-time access to these cameras, which would allow them to have their eyes and ears all around the city of San Diego. And originally, they were only supposed to be used for a certain small classification of crimes, and it seems like that list has been rapidly expanding over the past several years. 

Ben Yelin: So, you know, as you said, the good news is there is a bit of a pushback here on both the San Diego City Council and from the mayor's office. The bad news, from the perspective of privacy, is that a lot of footage was obtained from these protests, and federal prosecutors, under the direction of the attorney general, have been very aggressive about prosecuting federal crimes. And it appears as if the district attorney in San Diego has given a lot of this footage to federal authorities. Some people are going to be criminally charged based on the surveillance and are going to be subject to some pretty significant penalties. 

Dave Bittner: What about the other argument, though, saying if these folks were, indeed, committing crimes, shouldn't the police be able to use every tool at their disposal? And these cameras that are there, I mean, that's just another useful tool for law enforcement to keep our community safe. 

Ben Yelin: Yeah, absolutely. I mean, this is always going to be a balancing act. You know, I'm not one who enjoys slippery slope arguments, but the logical extension of that is, you know, we're going to put a camera on every vehicle, every individual, every building in the city so we can literally catch everything. That certainly would assist law enforcement. It's something that most people probably would not be supportive of. 

Ben Yelin: So just because something might be useful as a law enforcement tool, and it clearly would be here, I think we still have to be concerned about it being overbroad, particularly when it's combined with a lot of the other surveillance methods that we've talked about a million times - aerial surveillance, body cameras, facial recognition. So, you know, when you put all of that together, there's sort of a sense among protesters that they're being watched. That can even include things like we talked about last week with the collection of location data, which can be used to obtain demographic information on them and other potentially private information. 

Ben Yelin: So, yeah, there's going to be a law enforcement justification for a lot of these things. I'm quite sympathetic to that. I want people who commit crimes to be prosecuted to the fullest extent of the law. But that just has to be balanced with what seems like a pretty intrusive surveillance method here. 

Dave Bittner: It also strikes me that I think one of the maybe subtle subtexts here is that this puts the surveillance technology on another department's budget, right? 

Ben Yelin: Yeah. 

Dave Bittner: It's not the police department that's spending the money here. So if someone were - I don't know - you know, auditing the police department, or what capabilities do you have, this wouldn't necessarily show up. 

Ben Yelin: Yeah, it's interesting, isn't it? Because we've heard this political movement over the last month or so about defunding the police. Well, you could defund the entire San Diego Police Department and there would still be these streetlight cameras because they're run out of a different department. So that's sort of an interesting point. It's a whole citywide system that contributes to this type of surveillance. And even, as is the case here, when the surveillance might not have been put in place specifically to catch people committing crimes in the first place - there might be other legitimate reasons for it. 

Dave Bittner: Right. 

Ben Yelin: I mean, certainly, catching traffic incidents is one of them, and that's, you know, why we have a lot of streetlight cameras in all cities across the country. The result here is that protesters in San Diego now should be aware that they are being watched, and whatever is captured on these cameras might be submitted to federal prosecutors, and maybe they'll think twice about going to that next protest... 

Dave Bittner: Right. 

Ben Yelin: ...Which is exactly what we don't want. We don't want people to feel that they are chilled from being able to express themselves in public. 

Dave Bittner: All right. Well, interesting story, for sure. And, of course, we'll have links to all of our stories in the show notes for this week. Ben, it's time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: Our Listener on the Line this week is an email that comes from a listener named Ian (ph). And he writes - he says, with all the talk about the government requiring companies to provide a way for law enforcement to access encrypted data, I started to wonder if U.S. citizens have a right to encrypt their data and, if so, where would that protection come from? I've heard that certain grades of encryption cannot be exported because they're considered cyberweapons and appear on the U.S. Munitions List. Could some kinds of encryption therefore be protected by the Second Amendment as part of the right to bear arms? Thanks. Love the show. 

Dave Bittner: Interesting question, Ben. 

Ben Yelin: It's a great question. And Ian is really onto something because a lot of legal scholars have started to explore this question. The Second Amendment says that we have the right to keep and bear arms to protect ourselves. Now, there are a lot of complications with that. But at its base level, the idea, or at least the impetus behind that amendment, is the protection of oneself, one's family and one's property. There have been academic articles saying that that same logic could be applied to encrypted information. If people were to get at that encrypted information, it could be used as a weapon, and it would potentially expose an individual and their family. And this listener is right that certain types of cyberweapons and encryption technology are included on these munitions lists. So if they're included on these lists, shouldn't they potentially be counted as arms for the purposes of the Second Amendment? 

Ben Yelin: The answer is that courts have thus far not recognized this constitutional right as it relates to encryption. We've had a lot of encryption cases based on the First Amendment because encryption is often people's own creation - they've developed a new encryption method - and we want to protect their free expression. And then there's the Fifth Amendment issues we've discussed extensively on compelled decryption. But there's been less of a discussion on the Second Amendment. It really has not made it into court cases. I'd love to see a test case of this come up. 

Ben Yelin: But it is in academic literature. It's certainly worth a Google search. There have been - I was just reading an academic paper from 2017 from the Hastings Law School in San Francisco by a professor there on this very question. So it's something that - you're not the only one who has had this really interesting and provocative question, and it's definitely something I'm going to be paying attention to going forward. 

Dave Bittner: It's interesting to me, too, because, you know, even when you think about the Second Amendment, there's a difference between a handgun, a hunting rifle, a machine gun and a rocket launcher in terms of what the Second Amendment entitles me as a citizen to have or carry in different situations and different states and so forth. And I wonder how you would split up different types of encryption if you did consider them munitions. You know, at what point does strong encryption become the equivalent of a howitzer, you know? 

Ben Yelin: Right, or an Uzi or something. Yeah. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: It's a great question. I mean, there are a couple of principles at play here. The Supreme Court has said - and this stems out of their District of Columbia v. Heller decision back in 2008 - that just because a weapon wasn't around at the time of the Second Amendment's drafting doesn't mean that that weapon is not protected by the Second Amendment. So certainly, that would apply to encryption technology. But in that same decision, they said law enforcement can restrict certain types of weapons. So if a city wants to ban bump stocks or AK-47s, they have the constitutional authority to do so. And that was in an opinion by Justice Scalia, certainly no liberal squish. 

Dave Bittner: Right (laughter). 

Ben Yelin: So, yeah. I mean, that's fascinating as it relates to encryption. I mean, we've seen laws proposed in this session of Congress that would crack down on encryption methods, that would provide the government a back door. I mean, perhaps part of that effort would be to define, you know, what types of encryption methods would be protected under the Second Amendment and which would not. I mean, I'm wondering if any legislators who have been part of those efforts have even considered this question 'cause it certainly struck me when we got this question from a listener. 

Dave Bittner: Yeah. Well, our thanks to our listener, Ian, for sending that in. We would love to hear from you. We have a call-in number. It's 410-618-3720. You can leave your question, and perhaps we'll use it on the air. You can also email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Hilary Wandall. She is the senior vice president of privacy intelligence at TrustArc. And we discuss a number of different issues about health care and how things are sort of being brought into focus here as we continue to make our way through the COVID-19 pandemic. Here's our conversation. 

Hilary Wandall: I've been in privacy, actually, since 1999 - so over two decades. And it's been fascinating to see how it's evolved. My initial foray into it was really more in the genetic privacy space with some legislation that was proposed in 1997. That's what piqued my interest. But I really spent time trying to understand privacy through the lens of online and COPPA back when it was first introduced in 1998 and then the regulations in 1999. 

Hilary Wandall: And I have been involved in privacy through various different paths of its evolution over the last two decades - and so from health privacy and dealing with HIPAA and how that affects clinical trials to online privacy from the perspective of the initial websites that many of us were putting up in the early part of 2000. I remember the first online banner advertising that we were doing and some of the privacy issues that that triggered with respect to cookies and trackers all the way back then, the introduction of the first P3P policies and whether there was an easier way to communicate to a person's browser whether or not there were the right kinds of privacy controls in the policy - many, many different attempts over the years. 

Hilary Wandall: Fast-forward to today. Where are we? It's fascinating to see how so much of where we started still remains to be addressed as effectively as it potentially can be. And I think in part that's because technology just continues to evolve. And as technology evolves, it introduces new privacy challenges. And it's difficult for the privacy laws to keep up with that. 

Hilary Wandall: So it really becomes quite incumbent on organizations to find ways to anticipate what the privacy expectations of people will be and to understand the risks associated with those expectations and things that could go wrong, particularly in the online, very connected space, and how best to mitigate those risks. And mitigations can come from law, certainly, but more often they come from understanding what people are fearful about, what can happen and go wrong with respect to information being misused or accessed by the wrong people, and putting in place the appropriate safeguards and controls to try to mitigate those risks. 

Hilary Wandall: So I'd say, where we are today is very much where we started. And that is that there is still a lack of standard privacy requirements all over the world, and yet data is moving faster and faster. And it's important for organizations to have ways to understand those risks, regardless of whether or not there is a law that applies, and to put in place mechanisms to try to protect individuals from being harmed or their data being misused. 

Dave Bittner: Yeah, I can't help wondering if there is sort of a fundamental mismatch between the pace at which we see change on the technology side of things and the policy side's ability to keep up - the pace at which legislation makes its way through the various governments around the world - if we're dealing with two things that are running at fundamentally different speeds. 

Hilary Wandall: Indeed, they are. And I think they always will be. That's just my honest opinion on the matter, having been in this space as long as I have. The technology wants to evolve quickly; people are very creative. Certainly, many, many technologies make significant improvements in our lives, whether it's in health care-related technologies or being able to have the first commercial spacecraft go to the International Space Station just the other day. I mean, it's just amazing how technology is able to move, and data wants to move along with it. And the problem is that law and regulatory practices - they take time. 

Hilary Wandall: They're reasoned and thoughtful. They require evaluation by a variety of different stakeholders, whether directly in a legislative process or whether, really, working across national boundaries. I'm thinking, for example, of some of what's happened over the years with respect to international regulatory frameworks that actually allow for data to move across borders, whether that's the existing EU-U.S. Privacy Shield or the APEC Cross-Border Privacy framework or the OECD privacy guidelines that have been around since 1980 but have been amended since and are in the process of being reviewed right now. These things take time, and they can't necessarily keep up. 

Hilary Wandall: So it puts the burden on businesses to really be responsible data stewards and to be thinking about how, if we have the privilege of being able to innovate, do we do so in a way that's actually mindful of the responsibility we have back to the individuals whose data we're using? 

Dave Bittner: Yeah, that's a really interesting insight. I mean, again, I can't help wondering about, you know, how much of what we traditionally relied on were social norms - in terms of what we would or would not do when it came to people's privacy, gathering information and so forth. And you know, I wonder, with some of these social media companies, some of the advertising platforms and so forth operating at the scale and velocity at which they do, have they lost touch with that? Have they lost touch with - you know, just because we can doesn't necessarily mean that we should? 

Hilary Wandall: That is such an interesting question and a hard one to answer, I think, for many. What I will say is I think that people - and this is based on my experience in dealing with lots of different players in the privacy space in my role at TrustArc and prior to that in serving as the chair of the board of the IAPP, being a longtime member of the IAPP - which stands for, by the way, I should say, the International Association of Privacy Professionals - and seeing many who are out there trying to do good and trying to do well but trying to manage the business models that have been put in place. 

Hilary Wandall: You know, the reality is that businesses oftentimes start with a particular business model in mind, and they grow based on that business model. And over time, they run into challenges. And so the predicament you find yourself in oftentimes as a privacy professional or a lawyer or an ethicist, for that matter, working inside many of these companies - they do have ethicists as well - is how do you help the organization, the business, continue to operate - its investors want it to be able to continue to operate and to be able to grow - while at the same time being mindful of the risks to people? 

Hilary Wandall: And I think, you know, many who serve in privacy roles within social media organizations who manage these platforms, they are striving to try to manage that balance between those who seek to grow for reasons of investment and those who seek to have their data be able to be part of the continued, ongoing innovation. And so, you know, you're balancing the interests of the people who are investing, the people who are innovating and the people whose data are used on those platforms. And, you know, I really have so much respect for those who try to serve in the role of being the arbiter, if you will, amongst those various different interests, serving within organizations and trying to make sure that they're doing the right thing. 

Hilary Wandall: And it's hard. It's very, very hard. I don't think that people always get it right. I think it's admirable when you see organizations who realize that they may have gotten it wrong and without the government having to come in and tell them they got it wrong seek to correct it and make it right themselves. And I think that's really - you know, in my humble opinion, I think that's really the responsibility that all organizations should have. If you have a license to innovate, you should at the same time be responsible - I'll come back to my point earlier - for using the data in an appropriate way. 

Dave Bittner: I want to dig in to some of the details of some of the issues we're dealing with because of the COVID-19 pandemic. And I know there are a number of things that you have your eye on in terms of - I don't know - I suppose things that have sort of had a light shone on them as a result of many of the changes that have taken place here. Can you take us through some of the things that have your attention? 

Hilary Wandall: Yeah. It's so interesting. So at TrustArc, we are in the unique position to really help so many different organizations around the world deal with the challenges associated with COVID-19. What's been interesting for us - I mean, very early on, we had many of our customers who use our research ask us, well, you know, what is this regulator saying? And what's that regulator saying? And what am I allowed to do with respect to even collecting information on whether or not somebody has been diagnosed? And if they have been diagnosed, what am I allowed to share? And you can start with some basic, you know, some fundamental tenets of what's appropriate. 

Hilary Wandall: In terms of data sharing, generally, do you need to tell somebody the identity of someone else? Is it possible, given the size of the organization, to even prevent someone from figuring that out? And then, you know, if you don't necessarily have good guidelines, what do you turn to? And so many people were coming to us and saying, can you track for us all these different regulatory guidances? 

Hilary Wandall: And as of this point, Dave, we have over 300 different regulatory guidances that we have tracked and actually created a way of helping organizations understand better, is it OK to take someone's temperature? And under what conditions is it OK to take a temperature? And can I actually keep that information? Do I have to destroy it right away? I mean, in some countries, you're actually not allowed to do that. As an employer, you're not allowed to do that. And people need to know. 

Hilary Wandall: So what's happened across the board is organizations have had to react very quickly to making sure that they can actually protect their workforce and, at the same time, to the extent that they're remaining open, be able to protect the safety of the people that they are serving - their customers. And they need to protect the data at the same time. 

Hilary Wandall: And so there's this balancing, if you will, between safety and privacy. And people at the very, very early stages were saying, oh, you know, you can't have safety and privacy at the same time. And what we've seen is a really good faith effort on the part of many organizations to try to balance those two things. Many lawyers that I talked to are trying to wrestle with, you know, do we require everybody to actually screen their symptoms every morning, take their temperature and report in? Or do we just have them tell us if they actually have a fever or symptoms? And if they have symptoms, what do we do to help manage that? Do we tell everybody who has some symptoms they need to go to the doctor? Do we tell them they have to stay home and self-quarantine? 

Hilary Wandall: So there's a lot of people trying to figure out what to do, and you have this juxtaposition, if you will, of employers being in the position of having to deal with privacy and safety at a velocity and with a level of intensity that they never have had to before. And they're really trying to get it right. And there's fortunately a lot that we've been able to do at TrustArc to help people understand and navigate through that maze. And we're doing more and more of it day in and day out as new guidance comes out. 

Dave Bittner: You know, it's interesting - I wonder, you know, is it perhaps an unintended consequence of all of this - going through this traumatic experience of the pandemic - that it has brought attention to these issues and that we can't continue to kick the can down the road? 

Hilary Wandall: It really has brought attention to the issues of making sure that people's privacy is appropriately protected. And I think the one that's most interesting of all, Dave, to me at least, is contact tracing and the issues that erupted when many more Western nations found out some of the practices in South Korea and Israel, just to give a couple of examples, where - in China, too - where the government basically stepped in and said, you know, we're going to monitor who's coming in contact with whom, whether they've been diagnosed. And we're going to take appropriate action to make sure that other people are not infected, some of it to stop or prevent the transmission. 

Hilary Wandall: And the immediate concerns in Western nations, where, you know, there's much more of a focus on individualism and autonomy and privacy rights, et cetera, was, well, is that OK? And do we want the government surveilling us, if you will? This whole concern about, you know, do you have privacy as to where you go and who you interact with and what's going to be done with that data? 

Hilary Wandall: And so it was really interesting to see how quickly both privacy authorities on the one hand - so, for example, the European Data Protection Board and the Council of Europe, just to give two examples of big bodies in the EU - as well as tech companies Apple and Google, to give some examples here in the U.S., came together and said, you know, there's other ways to do this. And they looked at the model that was deployed in Singapore, which - instead of tracing people's location from a GPS perspective, which has much more long-term ramifications and implications for the data being misused - relies on Bluetooth signals when devices come into contact with each other. And that's the model we've seen deployed by Apple and Google that actually is available in those platforms now for apps that are being built on top of it to take advantage of. 
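To make the Bluetooth-versus-GPS distinction concrete, here is a minimal, hypothetical sketch in Python of the proximity-token idea described above: devices exchange rotating random identifiers when they come near each other, and exposure matching happens against published tokens rather than against anyone's location history. This is not the actual Apple/Google Exposure Notification code - the class names and the simplified matching logic are invented for illustration, and the real protocol adds rotating keys, cryptographic derivation and on-device privacy protections.

```python
# Hypothetical, simplified illustration of Bluetooth-proximity exposure notification.
# Names and logic are invented for illustration; the core idea is that devices exchange
# random tokens locally and never record or share location data.

import secrets
from dataclasses import dataclass, field


@dataclass
class Device:
    """A phone that periodically broadcasts rotating random tokens over Bluetooth."""
    name: str
    own_tokens: list = field(default_factory=list)   # tokens this device has broadcast
    heard_tokens: set = field(default_factory=set)   # tokens heard from nearby devices

    def new_broadcast_token(self) -> bytes:
        # A fresh random identifier; real protocols derive these from rotating keys.
        token = secrets.token_bytes(16)
        self.own_tokens.append(token)
        return token

    def hear(self, token: bytes) -> None:
        # Called when another device is physically nearby (within Bluetooth range).
        self.heard_tokens.add(token)


def simulate_contact(a: Device, b: Device) -> None:
    """Two devices in proximity exchange their current broadcast tokens."""
    a.hear(b.new_broadcast_token())
    b.hear(a.new_broadcast_token())


def check_exposure(device: Device, tokens_of_diagnosed_users: set) -> bool:
    """Matching happens on the device: did I ever hear a token later published
    by someone who tested positive? No location data is involved."""
    return bool(device.heard_tokens & tokens_of_diagnosed_users)


if __name__ == "__main__":
    alice, bob, carol = Device("alice"), Device("bob"), Device("carol")

    simulate_contact(alice, bob)   # Alice and Bob were near each other.
    # Carol never came near Alice or Bob.

    # Bob tests positive and (with consent) publishes the tokens he broadcast.
    published = set(bob.own_tokens)

    print("Alice exposed?", check_exposure(alice, published))  # True
    print("Carol exposed?", check_exposure(carol, published))  # False
```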

Hilary Wandall: And so it's interesting how these concerns about privacy and who can have information about your location have been brought to bear here in a very different way. I mean, people have raised issues over the last couple of years about online advertisers having access to location data. And what are they doing with that data to then determine whether you should see an ad because you're passing a store - whether that's OK and whether you should have the ability to opt out and how that should be built into devices. 

Hilary Wandall: And so it's really brought these issues to a head in a way that actually is going outside of more the commercial context into much more the day-to-day of how you as an individual, as a human being, are able to protect yourself from being harmed, to protect yourself from being exposed to something that actually can cause death. And, you know, it's caused concerns about how location data can actually be appropriately used across many different aspects of your life as an individual. But can it be done in a way that actually doesn't expose you to more privacy concerns? 

Hilary Wandall: And I think it's really been interesting to see how organizations came together quickly to find a solution here. And it will be interesting to see whether - going back to the earlier part of our conversation, we talked about different business models and the roles that people play to try to find a more balanced way of using data in a way that's responsible. I think the way that Apple and Google, as an example, came up with this approach to contact tracing that was a much more responsible way to manage the data - you know, how will that translate into more responsible business models across the board when it comes to balancing privacy with innovation? 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: So it was a really interesting interview. I mean, it struck me that she's been in this field for 20 years - since 1999 - and we're still kind of facing the same dilemmas that we faced at that point. Her discussion of contact tracing on COVID-19 was very interesting, as well. I think tech companies have shown appropriate concern about finding the least intrusive methods to enable this type of contact tracing - so switching from the location-based methods she mentioned, which have the potential to lead to private information being divulged, potentially to third parties, to something instead related to Bluetooth tracking. 

Ben Yelin: So yeah, I mean, I thought it was a really interesting interview - somebody who's had experience in the field, has been a part of these fights before and is now dealing with some of the most difficult issues that have been raised by this pandemic. 

Dave Bittner: Yeah. Well, our thanks to Hilary Wandall for joining us. Again, she's the senior vice president of privacy intelligence at TrustArc. And we do appreciate her taking the time for us. That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.