Caveat 1.27.21
Ep 62 | 1.27.21

The intersection of law, technology and risk.

Transcript

Andrew Burt: The real question we should be asking ourselves is not, like, how do we stand up entirely new bureaucratic systems, but how can we make what we have a lot more effective?

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at the shift to more secure messaging apps in the fallout of Parler going offline. I've got the story of the FTC cracking down on misuse of facial recognition software. And later in the show, my conversation with Andrew Burt from the Yale Information Society Project. He's going to be telling us about the Digital Future White Paper Series. And they just released their first white paper. It's titled "Nowhere to Hide: Data, Cyberspace, and the Dangers of the Digital World." 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's dig into some stories this week. What do you have for us? 

Ben Yelin: So my story comes from The Washington Post, written in their technology section by Gerrit De Vynck and Ellen Nakashima. The story is about how right-wing protesters, supporters of President Trump, had been forced to move to encrypted applications because all their other favorite applications have been shut down or significantly curtailed. So not only have Facebook and Twitter tried to cut down on extremists, they've deleted a lot of accounts. Obviously, Facebook temporarily banned President Trump, and Twitter permanently suspended his account. 

Ben Yelin: But even an application like Parler, at least temporarily, went offline. So Parler is a microblogging alternative to Twitter that's very popular among right-wing groups. And it went offline because Amazon's web hosting service decided not to support it anymore. 

Dave Bittner: Right. 

Ben Yelin: And it was also taken out of the Apple App Store and Google Play. Luckily for these groups and for these individuals, there are a bunch of encrypted messaging applications that have filled the void. So a couple that they mention in this article: one is Telegram. Obviously, that's based in Dubai. It's long been very popular as an encrypted messaging application among people who would rather keep (laughter) their conversations underground, so to speak. And there are other services like MeWe, which is another social media application that's been very popular among extremists. 

Ben Yelin: So this presents a number of problems. These applications have surged in popularity. They are running up the charts on both the Apple App Store and Google Play, you know, going from the 10,000s into the top 15 applications downloaded. So there's concern there that people who we want to be monitoring are going off the grid. 

Ben Yelin: And when people are using mainstream American social media applications, you know, there are ways for our government and law enforcement to get access via subpoena. Sometimes people leave information public-facing. Law enforcement can get lucky. But law enforcement generally knows how to deal with those types of applications. 

Ben Yelin: With things like Signal and Telegram and MeWe, that means that users are largely going dark and are evading the watchful eye of law enforcement. And one of the reasons that's so concerning is there's been a lot of chatter on these applications about future insurrectionist attacks. 

Ben Yelin: It's been very difficult for law enforcement to break through these encrypted applications, you know, not just because of their security features, but also because the leaders of these companies, I think, are less beholden to political pressure from national leaders to moderate their own content and make user data more available. So it's proving to be a major challenge for law enforcement. 

Dave Bittner: Let me ask you this. Aren't we kind of wanting to have our cake and eat it, too? Because the conversations that you and I typically have about apps like Signal and Telegram - you know, these end-to-end encrypted apps that are built around privacy and security - is that they're a great thing for all the people we like (laughter)? 

Ben Yelin: I know. 

Dave Bittner: The folks who - it's a good thing for free speech and people to be able to communicate... 

Ben Yelin: And privacy and - yeah. 

Dave Bittner: ...And keep the government out of their private business. But suddenly, you know, we're on the other side of this now with folks who - you know, may have had something to do with the riot at the Capitol, you know, folks who are aligning with that. And it really shines a bright light on the other side of this encryption argument, doesn't it? 

Ben Yelin: That's why this issue is so incredibly complicated. I mean, I try and catch myself. I start thinking, well, you know, maybe former Attorney General William Barr was right, that we should prioritize getting a backdoor into these encrypted applications. 

Dave Bittner: (Laughter). 

Ben Yelin: And I'm like, no, this isn't who you are, Ben, you know? 

Dave Bittner: (Laughter). 

Ben Yelin: And I don't think you should judge an application or any policy, for that matter, by its most egregious, disfavored users, you know? So I think we should think twice before we propose rash policy changes in response to this dilemma because we do want to support the right of people to have private end-to-end encrypted conversations on these applications. That does comport with our values. One thing that makes this different is people are using these applications to sow insurrectionist violence. So there is always a line somewhere. 

Dave Bittner: Right. 

Ben Yelin: And even these applications, you know, when they were contacted by The Washington Post, have tried to claim they've done their best to moderate content. They took down 1,000 posts - at least that's what Telegram said - 3,000 public groups in January alone that were inciting violence, although they said that fewer than 6% of those were in the United States. So even these applications that want to sell themselves on, you know, protecting your privacy and your conversations, I think, really do draw the line at having their platform be the one where people are planning insurrectionist violence. 

Ben Yelin: But you're right. We can't have our cake and eat it, too. There are always trade-offs here. The trade-off here is would we potentially want to give up all of the benefits of private messaging services - and there are many benefits - just for our short-term goal of protecting ourselves through this very tumultuous time? And honestly, Dave, there's not an easy answer to that question. 

Dave Bittner: Yeah. I mean, I try to think about something that I think everyone is in broad agreement is a bad thing - you know, something like child pornography, that sort of thing - that these platforms can be used for those sorts of things. The people who want to do those things can use these end-to-end encrypted apps in service of those crimes that they're committing. There's not a whole lot that the app developers can do about that because, as you sort of say, I mean, as a side effect of how these apps work, they don't have a view into what's going on on their platforms. 

Dave Bittner: But we all say, well, you know - as you say, we can't judge it by the most extreme uses of it. To me, this is just the developing reality, and it makes sense. As you shut down the places in the public eye where folks can communicate, people learn their lesson, right? 

Ben Yelin: Yes. 

Dave Bittner: Like, the folks who want to do these things are - part of the reason they're getting so easily tracked down and arrested - I'm thinking specifically of the Capitol riot - is because of all the things they posted publicly, so it makes sense. 

Ben Yelin: Right. Look where I am. Yeah. 

Dave Bittner: Right. It makes - look; here's my GPS location. It makes sense that the natural way of things would be that they would say, oh, we've got to do better than that (laughter)... 

Ben Yelin: Yes. 

Dave Bittner: ...And they would move to a different platform. Yes, that is a problem for law enforcement. And what an interesting natural evolution of things - is that a way to say it? 

Ben Yelin: Yeah. You know, one thing I keep thinking about - you know I always try and find my analogies in the nondigital world, but I'm sure there was some point in history where people were planning insurrection through the U.S. mail. People were sending each other letters, saying, you know, here's what I think we should do. This is how we should - you know, we can topple the government, commit violence, sow unrest. And can you imagine if in, you know, response to that, the government decided that they were going to open all of our letters before we saw them? A lot of us would find that offensive. You know, obviously, there's no perfect comparison. There are a lot of differences. But I think that's just something that's important to remember. 

Ben Yelin: But again, it's not an easy question. Somebody asked me, would you do whatever is humanly possible to stop insurrectionist violence and, you know, to prevent more law enforcement officers, innocent people from getting killed? You know, you'd want to say yes, but these are just not easy questions. 

Dave Bittner: Right. And it's that old ticking time bomb issue, right? I don't know. I don't want to say that old chestnut, but, you know, that's often brought up when you have these conversations. You know, what if the bomb is ticking and you have to do something extreme to stop the bomb from ticking? And... 

Ben Yelin: Right. 

Dave Bittner: ...I don't know if it's a fair argument or not. 

Ben Yelin: To be honest, the bomb is very rarely ticking, and that's why I never liked that as a hypothetical. I mean, there are just not that many scenarios where that happens. I kind of feel like this is maybe one where the metaphorical bomb is sort of ticking. And, you know, there have been, over the last couple of weeks, very specific threats of potentially violent armed insurrectionists going to the U.S. Capitol, going to state capitols. So it's not theoretical. I mean, it's real. 

Dave Bittner: Right. 

Ben Yelin: And I just - it's hard for me to try and figure out where to draw the line, whether you can take broad strokes to protect the public against these encrypted applications or whether that would be overbroad and would violate our right to privacy. So it's just a very difficult question, I think. 

Ben Yelin: You know, this article didn't really express a view on that. It was more just kind of summarizing what's happening, which is that these applications have skyrocketed in popularity. But it just kind of gets you thinking about that dilemma. 

Dave Bittner: Yeah. Well, we'll have a link to that Washington Post article in the show notes. 

Dave Bittner: My story this week actually comes from the folks over at the law firm Cooley. We've had some of their fine attorneys as guests on our show here. They sent out an alert that caught my eye, and it's titled "FTC Requires App Developer to Obtain Users' Express Consent for Use of Facial Recognition." Of course, facial recognition is something that comes up regularly with us here. And this is really an interesting outline of what the FTC, the Federal Trade Commission, has been actively doing when it comes to enforcing facial recognition but also kind of laying out where they stand and where things may be going with this. 

Dave Bittner: So this centers around a company called Everalbum. 

Ben Yelin: I would say Everalbum. 

Dave Bittner: Everalbum - thank you. Thank you. 

Ben Yelin: Yeah, my interpretation. 

Dave Bittner: Yes, yes. That is... 

Ben Yelin: I had never heard of it either, so. 

Dave Bittner: That is - OK. Yeah, I think that is the correct way to say it. I just - yeah, that's right because they are a photo storage app, and the app is called Ever. So this is a place where you would store your photos. 

Dave Bittner: Well, this photo app had a facial recognition functionality built into it. And according to the FTC's allegations, this company misstated what they were doing with facial recognition. They were leading their customers to believe that if they opted out of facial recognition, their photos would not be subject to facial recognition when, in fact, they were. 

Ben Yelin: For several years. 

Dave Bittner: For several years. And also that if someone deleted their photos, they would be taken out of the facial recognition pool - they wouldn't be used for training the facial recognition software and so on. 

Dave Bittner: And it sounds like this company was doing the common things that folks do when they're training these facial recognition software programs, where you load as many photos as you can into it and you cross-reference that with publicly available photos. So for example, if I were using this app and I loaded up a bunch of photos and I tagged them as being me, they may take all those pictures they have of me and then go out on the general web and do a Google search on my name... 

Ben Yelin: Right. 

Dave Bittner: ...And look for more photos of me and cross-reference those and use those to train their system. 

Dave Bittner: Well, they had been telling their clients that if you opted out or deleted your photos, they weren't going to be using them. And it turns out they actually were. So the FTC took issue with this, and they have a proposed order here, which the commissioners of the FTC voted unanimously to accept. And it's an interesting list of requirements here. 

Dave Bittner: First of all, they want them to provide notice and obtain affirmative express consent before using biometric information in connection with facial recognition technology. That's pretty straightforward. You know, alert people, and make sure it's OK with them. 

Dave Bittner: The second thing is they want them to delete or destroy the photos and videos of deactivated accounts. All right - makes sense. 

Ben Yelin: Fair enough. 

Dave Bittner: This one is interesting. The third one is delete or destroy models or algorithms that Everalbum developed in whole or in part using biometric information that the Ever app collected from its users. Now... 

Ben Yelin: That was the most fascinating to me. 

Dave Bittner: Yeah. To me, this is like going to someone and saying, I need you to remove the sugar from that cake you just baked. 

Ben Yelin: Yes. Yeah. 

Dave Bittner: Right (laughter)? 

Ben Yelin: You kind of have to drown the baby with the bathwater there. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: Yeah. I mean, it reminds me, just speaking from a legal perspective, of the fruit of the poisonous tree. In the legal world, if there was one illegal act from law enforcement, everything that follows from that is considered fruit of the poisonous tree and therefore inadmissible in court. 

Ben Yelin: That kind of seems like what's going on here - is if you created any sort of algorithm based on photos that were improperly stored, then you have to destroy those algorithms. Obviously, that's going to be incredibly burdensome for this company. They've probably been building these algorithms for years. But, you know, the alternative for them is that they would probably have gone out of business. So I would guess that's why they agreed to the settlement. 

Dave Bittner: Yeah. And it seems to me like they'd have to start from scratch, right? 

Ben Yelin: Yeah. 

Dave Bittner: I mean... 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. I guess it's easier to build something the second time than the first time, but still... 

Ben Yelin: It's a rather drastic action. And it shows that the FTC is really taking facial recognition seriously... 

Dave Bittner: Right. 

Ben Yelin: ...Which I think is a good thing. One thing that this article brings up, which I think is something we're going to have to look out for, is that the litigation floodgates are about to open because now that we have this settlement, this is going to apply to all tech platforms, all applications, any sort of web-based service that collects faces and uses facial recognition software or develops facial recognition algorithms. 

Ben Yelin: So this sort of puts other companies on notice that, A, you're going to have to change your terms of service and you're going to have to make sure that your users are able to give express consent to the use of their photos to contribute to these algorithms, and B, if you don't do that, you better hire a law firm. Perhaps Cooley could use this as an advertisement. 

Dave Bittner: (Laughter) Right, right. I suspect that's partially what this alert is intended to be. 

Ben Yelin: Yeah, but get ready for the follow-on litigation. And that seems to be what's happening here. You know, I think it's really good that the FTC and its Bureau of Consumer Protection are focusing on this, focusing on the misuse of facial recognition. And I think this is just the first salvo in what will largely be, you know, probably years of follow-on litigation about this. 

Dave Bittner: There's something in here they mentioned I wanted your take on. They mentioned a coming Supreme Court decision about potentially providing statutory authority to the FTC. Do you have any insights to what's going on there? 

Ben Yelin: Yeah. So this case, which was argued basically last week, concerns whether, under Section 13(b) of the Federal Trade Commission Act - so the statute that authorized the Federal Trade Commission - the FTC can issue injunctions demanding monetary relief, such as restitution, if individuals have been wronged. So depending on the outcome of that case, the FTC potentially could demand restitution - monetary restitution or damages - to the users of this application whose photos were improperly stored. 

Ben Yelin: So I think people who were using this app - using Everalbum would want to pay attention to this FTC case because they might have a cause of action where they could obtain monetary damages. I didn't pay attention to the oral argument, so I don't really have an indication on how this case is going to be decided. But it was just argued in the past week, so it's a case that's going to be decided over the next several months. 

Dave Bittner: Yeah. It's interesting to see how much is going on in the facial recognition space. You know, we've got different states that have different standards here, and people get caught up in those things. Illinois, for example, has a very stringent set of rules when it comes to biometrics. So if you're - I suppose if you're doing business coast to coast, you need to pay attention to those individual state requirements, right? 

Ben Yelin: Yeah. I mean, it's just like how we always talk about companies have to comply with the most stringent state regulations because, you know, you don't want to adopt different business practices for all 50 states. So it ends up being sort of a race to the top, and Illinois is at the top as it relates to facial recognition. And kudos to them for being out in front on this issue. 

Dave Bittner: Yeah. All right. Well, we will have a link to this post in the show notes - interesting read. It really lays everything out here, has some recommendations for companies. So I'm appreciative to the folks over at Cooley for putting all this in plain language that just about anybody can understand. 

Dave Bittner: All right. Well, those are our stories for this week. Of course, we would love to hear from you. We have a call-in number. It's 410-618-3720. You can call and leave us a message, and we may use it on the air. You can also send us email to caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had a conversation with Andrew Burt. He's from the Yale Information Society Project, and they have a Digital Future White Paper Series. They just published their first white paper. It's titled "Nowhere to Hide: Data, Cyberspace, and the Dangers of the Digital World." Here's my conversation with Andrew Burt. 

Andrew Burt: I have been working at the intersection of law, technology and risk for quite some time, first in the FBI Cyber Division, then at Immuta, which came out of the intelligence community, and now at bnh.ai, this boutique law firm focused on AI. And throughout all of my experience, it's just become clear that we as policymakers, as lawyers, as consumers, as technologists do not effectively manage risk in cyberspace, period. We see it every day - headlines and breaches, new privacy laws, new laws focused on AI. It just feels like we are drowning in a sea of digital technologies, and we don't really know what to grab, how we need to help ourselves. 

Andrew Burt: And so this essay is really just kind of a culmination of years of working on some of these issues and thinking about these issues and basically just trying to describe the issues as I see them, which is, not to be a pessimist, but things are pretty bad. We really, I think, have lost sight of what it is we are trying to protect when we talk about things like privacy and security and responsible AI. 

Dave Bittner: Yeah. You know, as you say, not to be a pessimist, but, you know, the first several sections of your white paper here are "Nowhere to Hide," "Privacy Is Dead," "Trust Is Dead" and "You're Not Who You Think You Are." Not starting off with a rosy summary of where things stand. 

Andrew Burt: Yeah, no, it's not - it is not rosy. And some of the reactions - so far, folks seem to enjoy the white paper, but the reactions are kind of like, this is very scary. And I don't think I wrote this to scare people. I think I wrote this, honestly, just to try to describe my thoughts and my reactions to what's going on. 

Andrew Burt: And I think despite that serving of just blunt candor - an assessment of where things stand regarding our privacy and our security, which in my view is accurate - I am optimistic. You know, I do close out this white paper by saying there are many things we can do to fix these issues. So it's not all doom and gloom. I think I say somewhere in the paper that the sky may seem like it is falling, but it need not fall as fast or land as hard. 

Andrew Burt: And so I think the real takeaway is that none of this is fated to happen. It is not a foregone conclusion that privacy, you know, really ends up being something meaningless that, you know, our grandparents used to enjoy and that, you know, users of social media and denizens of the internet just kind of look back at this idea of privacy kind of laughingly as something that's arcane. This is not something that's fated to happen. 

Andrew Burt: And so I think there's a good deal of optimism here in the sense that, like, we have a choice. We can actually choose to adopt digital technologies responsibly, to slow down our adoption so that we can actually understand the complexity of the digital world. But I think, you know, that being said, if the future looks like the past, it's not going to be a particularly great future in terms of our ability to protect all the data that we generate. 

Dave Bittner: Well, take me through your premise here. I mean, if privacy and trust are indeed dead, what leads you to that conclusion? 

Andrew Burt: I'll start with privacy. And basically, privacy is a very kind of ambiguous concept that has shifted over the decades. But I kind of anchor my definition of privacy in the 1890 very seminal, famous law review article by future Justice Brandeis and another attorney that really kind of set the definition for privacy as the right to be left alone. 

Andrew Burt: And so if we think about what this means, the right to be left alone, in our kind of modern era, where everything we do generates data and data at large volumes generates insights that we can't understand that are often sensitive and reflect about who we are, there's really no other conclusion than to state, I think pretty bluntly, again, that privacy as we've known it, privacy as we've conceived of it is no longer. The ways that we can be identified, the ways deep and sensitive insights about us can be inferred from the data we generate just keeps growing. 

Andrew Burt: I cite in the white paper an incident that happened almost 10 years ago where Target very famously was able to predict that a teenager was pregnant based on her shopping patterns before her family even knew. And that was a decade ago. You know, that was just a few years after the adoption of the iPhone. That was not taking into account all the data that we now generate on a daily basis. And it wasn't also even taking into account the fact that so many of us now live almost our entire lives online as a result of the pandemic. 

Andrew Burt: On privacy kind of specifically, this idea that we can protect sensitive information about us, this idea that we can actually exert some control over what other people are able to learn is becoming more and more meaningless directly in relation to all the data we keep generating. So the more digital devices we have, the more connected we are, the more data we generate, the harder it's going to be to protect what we think about as privacy. 

Dave Bittner: Do you suppose there's a certain sense of resignation out there where people sort of throw their hands up and they say, well, you know, I want to be part of this digital world, and this is just part of what goes with that? 

Andrew Burt: Certainly. But I don't think that's the whole story because if you look at polls and you look at just in general the way the public in the U.S. and around the world views privacy, this is a really important issue. In fact, it's one of the few issues right now that actually has bipartisan support, which is almost - it's almost crazy to think that there's any issue that could kind of unite lawmakers on both sides of the aisle. But privacy is one of them. 

Andrew Burt: So I think there certainly is some resignation. But I think the broader issue is just about incentive structures. And right now, the incentive structures related to almost all digital technology just overvalue short-term thinking. And so it's one of the reasons why, you know, over time, it becomes incredibly difficult to understand, much less manage, IT environments in a corporate setting, in a home setting. As all of these choices add up, as they accumulate and accrete, it becomes very hard to understand exactly what it is that's happening. 

Andrew Burt: And so when we're focused on, what's the next app I'm going to download, what's the next cellphone I'm going to buy, what's the next IoT, you know, device I'm going to connect to my home network, it just becomes very hard to step back and think about the bigger picture. 

Andrew Burt: This is certainly something that I don't think is actually rocket science. In the white paper, I outline a couple of very concrete things that I think we can do to help us make sense of our environment. And again, I mean, this is totally possible. It is not a foregone conclusion that we are just unable to protect our data. 

Andrew Burt: And I would also just say, some of the pretty interesting feedback I've been getting - and frankly, I've been getting this ever since I left the FBI - is people coming up to me and saying, you know, what do you think about the latest hack? And it's almost - you know, let's say every two to three weeks it happens. And there's always kind of a tone that, like, this is the worst thing that's ever happened, you know? 

Dave Bittner: Right. 

Andrew Burt: What do you think about this? And it's kind of we're just watching, you know, the tide rising, and every single time it rises, you know, it's higher than before. I think one of the interesting perspectives or - I don't know - one of the takeaways of a lot of those conversations is at the beginning, it comes off as pessimistic because I kind of say, of course, this is a real issue. And we as a society, we as a government, globally, we are just not doing enough to address this. But on the other hand, it is not that complicated to fix. We have fixed far more complex issues as a society. 

Andrew Burt: So, again, I think the real issue is one of just aligning incentives so that the folks who are selling digital technologies have the same incentives as the folks who are buying and using them, and the folks who are using them also have a good understanding of the risks they're actually taking on as they adopt all these technologies. 

Dave Bittner: Well, and let's go through some of the specifics together. What are some of the things that you consider to be achievable here? 

Andrew Burt: I'm a lawyer by training, so a lot of these are actually kind of - some of them are technological, but a lot of them just kind of start with, what should laws do? How can laws create the right incentives? 

Andrew Burt: And so one of them that I cite - well, there are a couple, and I don't know how deep we want to get, but one of them that I'll just say is I talk about how laws should incentivize the use of things like privacy-enhancing technologies. And so these are actual technologies or techniques that can help obfuscate data, that can help increase the level of security and privacy associated with data use. And right now, you know, there's a wonderful U.N. report on privacy-enhancing technologies that I believe I cite in the paper. But right now, they're kind of like - they're thought of as quirky and interesting. You know, the tech giants - Microsoft, Apple, Google - use things like differential privacy, but they're not yet widespread. 

Andrew Burt: And so one of the suggestions is to have policymakers actually incentivize the use of privacy-enhancing technologies by saying things like, if you're using a technique like differential privacy or k-anonymization or federated learning and there is a breach or there is some type of failure, liability is mitigated. There'll be less liability that we will impose on you, lower fines. And so just the very act of doing that is going to incentivize organizations to adopt technologies that actually prioritize privacy and security. And also, I think it'll have the added benefit of reducing insurance rates, which will further incentivize the adoption. 

Andrew Burt: And basically, the way that I think of these privacy-enhancing technologies is as kind of technical safeguards, technical guardrails built into the very fabric of our software that can help to minimize some of the risks that are, frankly, just inevitable. I'm happy to keep talking about some of the suggestions, but I don't know how deep we want to go into the weeds here. 

Dave Bittner: Well, you know, there's something that strikes me here that it seems to me there's a kind of a fundamental disproportionality here. And I think of the example of if I'm just going around, you know, checking out websites, you know, news websites and so on, and I have an ad blocker on because I don't want all my information tracked, and, you know, the website will pop up, and they'll say, hey, we noticed you're using an ad blocker. You know, please let us show our ads. 

Dave Bittner: And the thing is, like, I don't have a problem with them showing me ads. That's fine. I understand. That's a great part of commerce. What bugs me is all of the tracking that goes along with it. And we seem to be, in many cases, in this all-or-nothing sort of thing. There's really no practical way for me to say, sure, show me your ads, but don't track me, you know? 

Dave Bittner: And one of the points you make in your white paper here is we can't consent to what we don't understand. And I think we're in this world with these ridiculous EULAs that nobody reads, nobody understands. And so from the consumer point of view, I wonder again, you know, that point about - that I made about throwing your hands up, how is an individual supposed to try to get control of these sorts of things? 

Andrew Burt: Yeah. I mean, this is a collective issue. This is not something that we can solve on our own. This is a political issue. And so we can't. We just can't. And, indeed, as AI is adopted more and more, the entire value of things like AI, of techniques like AI and machine learning is that they will extract from data insights that we can't predict and, in some cases, insights we can't understand. 

Andrew Burt: So the very concept that, you know, I as a consumer am going to generate data, and I'm going to be able to understand the value of that data, you know, as I'm generating it or, frankly, ever is just ridiculous. It is not aligned to the current reality of what it actually means to generate data and what it actually means for organizations to extract value out of that. So the idea of consent, the idea of all of these very kind of detailed privacy policies we're supposed to understand is just not practical. 

Andrew Burt: And so one of the things I say in the white paper is that we need to move on. We need to come to a better way of protecting all the data we generate without actually just kind of delegating the decision to users that are just not equipped to understand the value of the data that they're actually handing over. 

Dave Bittner: What about this notion - you know, Ben and I have batted around this idea of coming at this similarly to a public health approach, which is that, you know, for example, do we need something equivalent to the FDA for social media algorithms? You know, before you turn loose this algorithm on the general public, you must prove that it first will do no harm, you know, in the same way that a prescription medication, you know, has to be tested and vetted, you know, before it's turned loose. 

Dave Bittner: I mean, is there anything in your mind to that approach? Is there anything there? 

Andrew Burt: I would make two points. The first is that a lot of the discussion about, you know, the current state of privacy and data protection, it always kind of gravitates towards social media. I would just - the first point I would make is this is about software writ large. This is about, like, how it is that we interact with software systems and the digital world. So certainly, social media plays a huge part, but I think it's much broader. Like, when I'm thinking of these issues, I'm also thinking of smart doorknobs and smart toasters. And, you know, so that'd be my first point. 

Andrew Burt: My second point is I've seen a lot of this, this idea that, like, we have a problem; let's create a new government agency solely focused on it to fix it. I get why folks make that argument, but it doesn't sit well with me. In fact, as early as - I think it was the early 1970s, some senators were actually suggesting, like, a federal department of computers. And we didn't, of course, stand up something like that. And I think, you know, we have a huge amount of innovation to show for it. 

Andrew Burt: So I personally much prefer a regulatory approach that's much more rooted in the U.S., which is basically you let the sectoral regulators handle it - the FDA for things that are related to medicine or, you know, the National Highway Traffic Safety Administration for things related to automotive vehicles. I much prefer that type of framework, where there are multiple regulators playing in the same space, just because I think it can get a little bit more granular and more effective. So I'm not someone who supports the idea of, like, let's stand up an entirely new government agency. 

Andrew Burt: And one kind of additional point I would make is we kind of already have that agency with the FTC. I think the FTC and some supporters and folks on Capitol Hill have been floating around ideas that would give the FTC increased authority over many of these issues. 

Andrew Burt: So I'm someone who wants our bureaucracy to be as efficient as possible. Again, I'm trying not to be too pessimistic. That's a very optimistic thing to say, I think, in this day and age. 

Dave Bittner: Yeah. 

Andrew Burt: And so I think the real question we should be asking ourselves is not, like, how do we stand up entirely new bureaucratic systems, but how can we make what we have a lot more effective? And I think as a testament to the fact that we don't need a new - entirely new government agency - like, there is just so much more we can do. And again, I make some of those recommendations in the white paper. 

Dave Bittner: Do you suppose that there is political will to see something like this through? I mean, as you and I are recording this, we're heading into a new administration in Washington. Do you suppose this is something that could bubble up to the top and actually see real action? 

Andrew Burt: Yeah, 100%. I would be very, very surprised if by the end of the next administration, there wasn't some type of major legislation dealing with these issues. My only fear is that it'll get it wrong. I think it's very easy to kind of in the modern - very - I don't know - I was going to say polarized, but that doesn't even do it. But in the modern political era, it's very easy to reduce kind of complex issues to statements that don't do the actual issues justice. 

Andrew Burt: And so my concern - so I'm very confident something is going to pass. My concern is it's going to be like, let's break up Google, let's break up Facebook, let's take big, bold, you know, actions that look good in the headlines, and then we're still going to be left with smart doorknobs whose software systems can't update and that can, therefore, just kind of, like, sit connected to the internet and create vulnerabilities that many of us are not even aware of. 

Andrew Burt: So I think my major - the major thing that I worry about is just kind of the quality of software and the complexity of this kind of really, really rapidly expanding digital environment. And if there are no repercussions for creating software that has lots of vulnerabilities, and if we have no real incentives to tamp down that complexity and understand it, it almost doesn't matter what we do to, you know, the big tech companies. We're still going to have issues protecting our data. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Oh, that was very dark. I mean, he's painting a picture of a present and a future where we've permanently lost our sense of privacy. And I think that's something that most people just don't recognize or realize, so it's good that he's doing this series of white papers. 

Ben Yelin: I think the first step in addressing these problems is realizing that they exist, is acknowledging that in our pursuit of enhanced digital technology that's improved all of our lives, we have given up any last semblance of personal privacy. There are security cameras everywhere. There's facial recognition. There's, you know, cell site location information. You know, when you put all of those things together, it's not the way it was 50 years ago, where you could really rely on being in your own home and being private from the outside world. 

Ben Yelin: So in that sense, you know, it's sort of dark. But I think he did give some indication as to what policy changes we can make that would cut against this loss of privacy. And so much of it has to do with user consent just so that, you know, people who sign these EULAs can be aware of what they're actually signing. 

Dave Bittner: Yeah. 

Ben Yelin: So I thought it was a really interesting conversation. 

Dave Bittner: Yeah, thanks. You know, it strikes me that we've kind of - as you say, over the past 40, 50 years or so, it's been this slow - you know, there's that analogy of boiling the frog... 

Ben Yelin: Right. 

Dave Bittner: ...You know, where the temperature just slowly comes up. 

Ben Yelin: Slowly boils. 

Dave Bittner: Right, right. And the frog doesn't jump out of the boiling water because it's not like you toss him in and he jumps out. No, you bring the temperature up slowly. And I think when it comes to chipping away at a lot of these privacy things that we just come to expect as part of citizenry, all these technologies have chipped away at the edges of them. And now we find ourselves going, wait. Hold on. Wait a minute. 

Ben Yelin: What did I give up? Yeah. 

Dave Bittner: Right, right. Now we're realizing (laughter). 

Ben Yelin: It's not like there was one day where there was a decree from up high, you know, handed down on tablets saying, you no longer have privacy. 

Dave Bittner: Right. 

Ben Yelin: I mean, it's been a slow and steady process... 

Dave Bittner: Right. 

Ben Yelin: ...As new forms of technology have been introduced. And as we mentioned a million times, it's taken a very long time for the court system to catch up with those technologies. 

Dave Bittner: Yeah. It strikes me, too, how it's been - so often been sort of unintended consequences where it's like, hey, good news. Your phone has GPS in it. Oh, that's great. I can use this app to help me navigate my car or, you know, find my way to the nearest taco truck or something. 

Ben Yelin: Yeah. 

Dave Bittner: But the flip side is... 

Ben Yelin: Also, Panera knows where I am at all times. 

Dave Bittner: (Laughter) Right, right. Exactly, exactly. All right, well, our thanks to Andrew Burt for joining us. Again, we'll have a link to their work, to that white paper in the show notes. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.