Caveat 12.9.21
Ep 104 | 12.9.21

Maximizing our power to do good online while minimizing risks.

Transcript

Susan Liautaud: Online, we have an awful lot more power both to do good and to do harm. And the real challenge in the ethics is to figure out how we can maximize our power to do good, but also to minimize the risks.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a criminal case dealing with expectations of privacy and social media posts. I look at a controversial family safety app that some say overshares location data. And later in the show, author, professor and consultant Dr. Susan Liautaud on technology and ethics from social media to the sharing economy. While the show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please, contact your attorney. All right. Ben, we have got a full show today, lots to cover. Why don't you kick things off today with your story? 

Ben Yelin: So I'm going on a little theme here from our last show. And this is the always enjoyable theme of stupid criminals. 

Dave Bittner: (Laughter). 

Ben Yelin: The lesson here is don't post evidence of your crimes on popular social media sites. 

Dave Bittner: Go on. 

Ben Yelin: So this is a case from the state of Delaware, Delaware v. Briscoe, alerted to me by, of course, Professor Orin Kerr on Twitter - long-standing invitation for him to come on the show. 

Dave Bittner: (Laughter) That's right. 

Ben Yelin: But it's sort of, you know, one of those things that I'll just hope for for generations. And it might never happen. 

Dave Bittner: Yeah. 

Ben Yelin: But in this case, the defendant was a previously convicted felon. And supposedly, either a confidential informant or a post on Facebook alerted law enforcement that Mr. Briscoe was at a particular location in Wilmington and in possession of drugs and a handgun. And that's what's in dispute - whether it was the informant or whether it was Facebook/Instagram that alerted law enforcement that this is where he was. Law enforcement comes. They spot him in his car. They do a search of that car. They discover some drugs. They also find a separate car key on him and realize that he has a stash car where he's holding other drugs and guns. 

Dave Bittner: What's a stash car? 

Ben Yelin: That's a car where you - it's the car that you don't use but where you keep all of your illegal substances/items. 

Dave Bittner: I see. 

Ben Yelin: So it's something, you know, where you can hold your illegal guns and/or drugs. Not saying any of this from personal experience just... 

Dave Bittner: (Laughter). 

Ben Yelin: ...From what I read about and watch on the crime shows. So... 

Dave Bittner: OK. So not a car you're driving around in. But this is a secondary vehicle that you use, basically, as a storage locker? 

Ben Yelin: Exactly. Exactly. 

Dave Bittner: OK. 

Ben Yelin: So they were able to figure out that that separate car was on the same block. And with that key they found on the defendant, they were able to unlock that car, found a bunch of illegal weapons and drugs, took Mr. Briscoe to the police station. They searched his body and found cocaine. He made a hilarious quote, which is quoted in this case - "y'all finally got me," which is like... 

Dave Bittner: (Laughter) He said to the officers, y'all finally got me? 

Ben Yelin: Yeah. 

Dave Bittner: All right. 

Ben Yelin: You're not really making things too difficult here, Mr. Briscoe. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: So the nature of the dispute is how law enforcement knew where he was. Law enforcement says that a confidential informant sent over a Facebook post allegedly from Mr. Briscoe, which would identify that he was sitting in a particular car at a particular time. And the informant knew where Mr. Briscoe liked to sit in his car. Briscoe claims that law enforcement were snooping on his social media accounts because he was a past criminal. They were, you know, spying on him, trying to figure out where he was, what he was doing, so that they could catch him. What the court says here is it doesn't really matter whether the police were snooping on Mr. Briscoe or whether it was the confidential informant. Anything you share on Facebook, you've forfeited your reasonable expectation of privacy in that information, especially if you don't control who gets to see that post. 

Ben Yelin: So there was a previous case in Delaware called the Everett case that was an interesting precedent case. And that was another, you know, social media post case. And the court in Everett said that if somebody friends you and it turns out to be a law enforcement source, an undercover cop - if you accept that friend request and that person can see your information, you have no reasonable expectation of privacy. You should assume that if you're accepting friend requests and posting things online that you're leaving it open that law enforcement is going to be able to get access to that information. And that's the case here. Whether it was the confidential informant who saw this Facebook post, or whether law enforcement itself were just perusing through Mr. Briscoe's Instagram and Facebook accounts... 

Dave Bittner: (Laughter) As you do. 

Ben Yelin: As one does. 

Dave Bittner: (Laughter). 

Ben Yelin: The fact that he posted these publicly and didn't, you know, manipulate the privacy settings to prevent people from seeing it, didn't exhibit that subjective expectation of privacy, means you can't suppress this evidence. He has no Fourth Amendment interest in this evidence. The lesson here is don't post evidence of your crimes on Facebook unless you've changed your privacy settings. Now, even if you do change your privacy settings, there's still going to be some dangers there. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: You know, if you limit it to just your friends, some of your friends very well might be law enforcement sources. But, you know, especially if your posts are public or, you know, if they're just available to your friends or people you've accepted friend requests from, you have no reasonable expectation of privacy in that information, even if that information is particularly revealing. So one thing that's interesting here is Briscoe tried to bring in Carpenter saying, well, if you are gleaning information from social media, you get a detailed picture of my life. You could track my location over time, you can track what I do, who I see, who I have relations with, et cetera. What the court says here is that collection was largely involuntary. You'll recall in Carpenter, that had to do with cell site location information. That's collected by simply turning on your device because it's going to ping cellphone towers. That - that's involuntary. 

Ben Yelin: What Briscoe did here is extremely voluntary. He just posted it on Facebook and Instagram. He knew full well that a certain subset of people were going to be able to view those posts and that information, and he did nothing to try to conceal it. So whether it was the informant or law enforcement who saw it is really immaterial. The fact that he posted it means he has no expectation of privacy in that information. 

Dave Bittner: And so no warrant necessary. 

Ben Yelin: No, there is no warrant necessary. You don't have to get any prior approval from a judge. You know, when it comes to confidential informants, there's a large - a well-accepted constitutional standard as to whether you should trust confidential informants, and it goes to the totality of the circumstances - comes from a Supreme Court case, Illinois v. Gates. Basically, if there's a reason to believe, based on the history of this informant - his or her reliability, his or her relationships with law enforcement - then, you know, a court can independently assess the reliability of that informant, and that informant can be used as reliable evidence. And that seems to be what the court is saying here, that they trust this confidential informant enough to believe that he was the one who identified this picture on social media and he was the one who sent it to law enforcement. But again, that's largely immaterial, because even if law enforcement were the first people who saw this just by perusing Mr. Briscoe's Facebook page, he still wouldn't have any expectation of privacy in that information. 

Dave Bittner: When someone is making use of a confidential informant in a situation like this, do they have to reveal to the judge who that informant is? 

Ben Yelin: Absolutely not. You know, they want to protect that person's identity. Now, if there's a question on that person's reliability, there might be some sort of ex parte hearing, you know, without the parties involved where they will discuss the reliability of that informant. But generally, they do not have to reveal that information to the court. 

Dave Bittner: I guess what I'm getting at is, does this allow a loophole for law enforcement? I guess law enforcement is under oath when they're testifying to a judge, for example. But I'm thinking of the - I'm being cynical here, and I'm thinking of the possible loophole... 

Ben Yelin: You? Never. 

Dave Bittner: (Laughter) I know. I know. Us? On this show? (Laughter) Of law enforcement, you know, saying that they have a confidential informant when it's really, you know, Bob down the hall (laughter) or the - Bob who's on the desk... 

Ben Yelin: The deputy, yeah. 

Dave Bittner: Yeah, who's at the desk across from mine. You see where I'm going with this, right? 

Ben Yelin: Yeah, I mean, you have to include some information in an affidavit about the reliability of the informants. 

Dave Bittner: I see. 

Ben Yelin: So you don't have to reveal who he is personally, but you have to attest that this informant has relevant personal knowledge. You know, you've worked with them in the past. They've identified previous criminals. You know that they have a relationship with the defendants. You have to assert all of that under oath on the record. Now... 

Dave Bittner: OK. 

Ben Yelin: ...Has it been abused? It's - most certainly has. Probably, you know, more times than you can count on, you know, two hands, for sure. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: But, you know, that still doesn't cut away at the process where you just have to assert under oath the reliability of the confidential informant. 

Dave Bittner: I see. All right, well, interesting for sure. So I guess the bottom-line take-home here is don't post evidence of your crimes online (laughter). 

Ben Yelin: Yeah. Don't post a picture of yourself holding a gun, no matter how cool you look, if you're a previously convicted felon. 

Dave Bittner: Yeah (laughter). Right. 

Ben Yelin: Same goes for drug paraphernalia. And if you are going to do that, just do a better job of trying to conceal your Facebook posts... 

Dave Bittner: Yeah. 

Ben Yelin: ...Or whichever social media site you're using. 

Dave Bittner: Right. 

Ben Yelin: Yeah. 

Dave Bittner: Up your privacy settings. 

Ben Yelin: Absolutely. 

Dave Bittner: (Laughter) OK, fair enough. 

Dave Bittner: All right. Well, my story this week comes from The Markup. This is an article written by Jon Keegan and Alfred Ng, and it's titled "The Popular Family Safety App Life360 Is Selling Precise Location Data on Its Tens of Millions of Users." So the upshot of this is there is an app called Life360, and this is an app that sells itself by saying that it is for family safety. So for example, let's say you have a couple of kids, which - you and I both have a couple of kids. 

Ben Yelin: We sure do. 

Dave Bittner: And you would feel better knowing where those kids are and what they're up to. So you can have your kids install this app, and it will use GPS to let you know where they are. 

Ben Yelin: That both sounds good but also should raise alarm bells as soon as you hear it. 

Dave Bittner: Yes. 

Ben Yelin: Yeah. 

Dave Bittner: In addition to that, it has some features, for example, like crash detection. So if your kid is in a car accident, it will alert you immediately that a high G-force event has happened, for example, right? 
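
(For context: crash-detection features like the one described here typically watch the phone's accelerometer and flag any reading whose magnitude jumps past a threshold of several g before pushing an alert. The following is a minimal sketch of that idea in Python - it is not Life360's actual implementation, and the threshold value and function name are assumptions chosen purely for illustration.)

import math

G = 9.81                 # standard gravity, m/s^2
CRASH_THRESHOLD_G = 4.0  # hypothetical threshold; real apps tune this and combine other signals

def is_high_g_event(ax, ay, az):
    """Return True if a single accelerometer sample (in m/s^2) exceeds the crash threshold."""
    magnitude_in_g = math.sqrt(ax**2 + ay**2 + az**2) / G
    return magnitude_in_g >= CRASH_THRESHOLD_G

# A hard-impact reading crosses the threshold and would trigger the alert to the parent.
print(is_high_g_event(40.0, 10.0, 5.0))  # True
print(is_high_g_event(0.0, 0.0, 9.81))   # False - a phone at rest reads about 1 g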

Dave Bittner: Now, just stepping aside here for a second, there's a part of me that wonders if this is at all helpful. Like, what are you going to do? You know, you get a notice like this. I guess you could be the first to call 911. I guess, you know... 

Ben Yelin: Yeah. I mean, if your kid's driving, you know, presumably, they are - I mean, I guess they could be seriously injured. But presumably... 

Dave Bittner: Yeah. 

Ben Yelin: ...They'd understand how to call 911 from their cellphone. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: I guess my point is that I wonder - there's a part of me that wonders if apps like this, all they do is increase people's anxiety to be able... 

Ben Yelin: Probably, yeah. 

Dave Bittner: You know, like, is there an ignorance is bliss element here, especially when you're talking about teenage kids, for example? 

Ben Yelin: Yeah. Now, you have experience with this. My kids are younger. 

Dave Bittner: (Laughter) Yeah. 

Ben Yelin: But I would guess that sometimes you'd rather just not know where your kids are. 

Dave Bittner: Yes. 

Ben Yelin: Tell them, don't - you know, don't do drugs. Don't do anything illegal. 

Dave Bittner: Right. 

Ben Yelin: Get home safely. But otherwise, you know, I'd... 

Dave Bittner: Yeah. 

Ben Yelin: ...I'd rather just not know. 

Dave Bittner: Yeah. And this is actually an area where my spouse and I don't always see eye to eye. She wants to know where they are all the time, and I'm like, eh. You know, come home when the streetlights come on. So, like... 

Ben Yelin: Yeah, exactly. 

Dave Bittner: I'm more free-range than she is, but, you know, it's just - we're just different in that way. 

Ben Yelin: Right. 

Dave Bittner: Now, you know, there are other versions of this. For example, Apple has a version built into their iOS environment, their operating system there, where you can opt in to location data sharing for families and so on and so forth. But what is at issue here is that Life360 is selling this location data to a lot of different location data brokers. There are companies called Cuebiq and X-Mode, which buy and sell this sort of information. And this information, once it's out there on the market, it's out there on the market. 

Ben Yelin: Yup. 

Dave Bittner: And all sorts of different interests... 

Ben Yelin: Sure. 

Dave Bittner: ...Can vacuum up this data and buy it. 

Ben Yelin: And it's pretty darn revealing data, I mean, if you think about it. When we're talking about location data, we've said this a million times, but that can really reveal some personal, private relationships, you know, things that your kids are doing that you would not want the broader world to know about. 

Dave Bittner: Right. 

Ben Yelin: I understand why it's profitable, but this is also information that's potentially very personal. 

Dave Bittner: Yeah. Well, speaking of profitable, this article points out that the Life360 folks - in 2020, they made $16 million, which is nearly 20% of their revenue, from selling this location data. 

Ben Yelin: Hey, they got to eat, right? 

Dave Bittner: (Laughter) Well, I mean, I think that really indicates how valuable this is. I mean, 20% of your revenue - that's a lot. That's a huge part of how you're making the company work, right? 

Ben Yelin: Yeah. I mean, I think what they would say, and I agree with them, is, we want to provide this useful application to you for free. So we could charge user fees. This could be a paid application. You know, we could give you a free version, but you'd have to pay for various add-ons or whatever. 

Dave Bittner: Right. 

Ben Yelin: We're going to make it free, but that means, you know, we need a way to make money. So you can do that in a number of ways. One way is just through advertising, which is fine, but it can also be - it can make your apps less user friendly if ads are popping up all the time, especially, you know, when you're trying to monitor your kids if you have to X out of a bunch of screens... 

Dave Bittner: Right (laughter). 

Ben Yelin: Yeah. 

Dave Bittner: Right. 

Ben Yelin: ...About, you know, Bud Light's new Christmas beer. But the other way they can make money is selling your data. The data itself is extremely valuable. When we're talking about data brokers, they can sell it to businesses who can microtarget, you know, individuals and families for advertising. And as, you know, we've said, there's nothing more valuable than that - getting access to that data. 

Ben Yelin: So I think what the company would say - and they have this hilarious "we can neither confirm nor deny this report" quote - is they have a business model where in order to keep our application free to our users, we have to - you know, we have to make money somehow. 

Dave Bittner: Right. 

Ben Yelin: And one of the ways we're going to make money, 20% of our revenue, is through selling this data. They're insisting that it stays anonymized. We know that that's not always the case. 

Dave Bittner: Yup. 

Ben Yelin: You know, I just wonder - I think people should be fully aware that their data is being sold in this way, and then they can make the decision as to whether it's worth it, right? 

Dave Bittner: Right. 

Ben Yelin: The problem is that this is all very concealed from the user. You know, it's part of the EULA. It's part of the terms of service. They found another cyber expert besides us in this article who is willing to tell them that nobody actually reads the terms of service. 

Dave Bittner: Right. 

Ben Yelin: People didn't see it. So, you know, you're not providing any type of meaningful consent here. 

Dave Bittner: Yeah. Well, and the folks at Life360 remind us that there is an option to opt out in the app, which, of course, in my opinion, is the opposite of what should happen. It should be opt-in. 

Ben Yelin: Yup. 

Dave Bittner: But, of course, that's my fantasy. That's never going to happen - not without regulation. 

Ben Yelin: Right, right. 

Dave Bittner: And I guess that points to the bigger issue here, which is that, you know, this article points out that some of the usual suspects - Senator Wyden, for example - have their eye on this sort of thing. And I wonder, could we see regulation? Could we and should we see regulation that makes these sort of location-sharing agreements opt-in only? 

Ben Yelin: I mean, the could and the should, I think, are very separate questions. 

Dave Bittner: Yeah. 

Ben Yelin: I think it should. And I think there are advocates on both sides of the political aisle, like Senator Wyden, and a lot of advocates on the Republican, conservative side who would say, we need to protect people's interest in this information. They - these types of applications should not be able to sell your data. Or at the very least, you should have to opt in for them to do so. Maybe there is a very clear warning when you download the application that says, we want permission to sell your data to these data brokers. 

Dave Bittner: Right. 

Ben Yelin: You know, please, press OK. You have the option of opting out - something like that where the consumer has meaningful choice. That's something that Congress could do. They could regulate it. There are reasons that I think it won't happen in the near future, largely, you know, just Congress never really does anything. 

Dave Bittner: (Laughter). 

Ben Yelin: That seems to be a prominent theme of - when we ask, is Congress going to do something? The easy answer is always probably not. 

Dave Bittner: Yeah. 

Ben Yelin: But also, you know, not to be cynical - I guess we both get to be cynical in this episode. 

Dave Bittner: (Laughter). 

Ben Yelin: But there is a lot of money involved here. 

Dave Bittner: Right. 

Ben Yelin: And these applications can go to members of Congress and say, fine, you can regulate us like this. But then we're going to have to charge for the use of our applications. And the general public is not going to be happy about that. They're not going to want to pay $3 a month for this application when they've been using it for free. 

Dave Bittner: Yeah. Well, but also, I think it brings up the point that privacy should not be a line between the haves and the have-nots. In other words, you know, if you - there's an argument to be made that if you pay your $3 a month or your $10 a month or your $100 a month - whatever it is - that that would be one way to get out of having your information tracked. 

Ben Yelin: Right. 

Dave Bittner: But the other side of that argument is, you shouldn't have to be a person of means... 

Ben Yelin: Absolutely. 

Dave Bittner: ...To enjoy privacy. 

Ben Yelin: Absolutely. Something that always concerns me is that the only way we can have digital privacy is to have money, to pay our way out of these types of digital tracking tools. And I think that's fundamentally unfair and goes against our values. Privacy should not be considered a commodity. It is, as of today, implicitly a constitutional right, the right of privacy. 

Dave Bittner: Yeah. 

Ben Yelin: And, you know - so I think it's more important than something that people should just be able to buy their way out of. So I think it's really good that articles come out about this - you know, it's always going to raise the ire of people like Senator Wyden - but just for the general public to understand that this is what's happening on these applications. 

Dave Bittner: Right. 

Ben Yelin: When you have free applications that track you across various locations, there is no free lunch. There's a reason it's free. It probably has to do with the fact that they are selling your data to data brokers. 

Dave Bittner: This article actually has a helpful little step-by-step guide for where to turn off selling your personal information. So - (laughter). 

Ben Yelin: Yeah, complete with pictures... 

Dave Bittner: Right. 

Ben Yelin: ...Which is good for me. I generally need to see it until... 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. Yeah. So we will have a link to this article in the show notes - definitely worth a read and sharing with your friends, families and loved ones just to make sure that they're aware of what is going on here. We would love to hear from you. If you have a topic you would like for us to cover here on the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Dr. Susan Liautaud. She is a leading international expert on ethics. She is the author of the book "The Power of Ethics." And she is the founder of the Ethics Incubator. She teaches at Stanford University. Really interesting conversation here. Here's my conversation with Susan Liautaud. 

Susan Liautaud: So the first thing is that anything we think about with ethics - when technology comes into the mix, the risks are turbocharged. So online, that could be anything from - what I talk about in the book is contagion of unethical behavior. So that could be the spread of fake news or falsity, more generally. It could be the escalation and contagion of bullying and harassment and even things like an epidemic of perfectionism. But one of the key things is that online we have an awful lot more power, both to do good and to do harm. And the real challenge in the ethics is to figure out how we can maximize our power to do good - you know, to grant access to education, to grant access to health care information, to run classes online, to do research and beyond - but also to minimize the risks. And those risks are far and wide, from things that end up creating mental health disorders to actually even physical violence that goes from the online world into the real world, as we've seen in the past couple of years. 

Dave Bittner: Historically, where have the guardrails come from when it comes to societies and how they approach ethics? 

Susan Liautaud: So there are a number of different places, depending on how far back in history one wants to go. I'll steer clear of religion, but clearly for some people, that is a source of what I call in the book guiding principles and a source of - you know, loosely described as right and wrong. Then there is the law, and in many cases, the law has been an effective guardrail. But the law is always going to lag behind reality. It takes a while to develop policy. It takes a while to get laws passed. And rightfully so. We should be thoughtful about regulation. But where we are now is that the gap between the reality, the technologically turbocharged reality that you're referring to in which we live, and the efficacy or even existence of laws is wider and wider every day. So the law is ineffective in most cases in governing technology, in putting up those guardrails that you're referring to. 

Susan Liautaud: So now where do we say - where do we get the guardrails? Well, we certainly need to be asking the companies who control the innovations to put guardrails in place. That's a bit fox-in-the-henhouse, right? I mean, that's a bit - you know, because there will always be, in their eyes, a question of guardrails versus growth and profit, to give you two words that we heard with Uber or we hear with Facebook and we hear with many other technology companies big and small. 

Susan Liautaud: But, you know, increasingly, I'm focusing on what I'm calling democratizing ethics, which means making ethical decision-making and ethical power available to people from all walks of life, whether or not we have formal education, whether or not we have expertise in technology. And at the end of the day, if we're not reading the fake news, it doesn't have an audience. If we're not spreading the bullying, if we are not focused ourselves on the photoshopped or whatever the most recent version, the Facetune'd (ph) Instagram photos, then we're not spreading some notion that perfectionism can exist and the mental health epidemics that go along with it. So increasingly, we need to look at our own participation as well in terms of guardrails. 

Dave Bittner: When I think of, you know, folks who are coming up studying things like cybersecurity or computer science and those sorts of things, I think it's few and far between, those who are required to take courses in ethics. In your view, I mean, is that a shortcoming? Is that something we do at our own peril? 

Susan Liautaud: Very much something we do at our own peril. Now, at places like Stanford and MIT and Harvard, there is a big push to make sure that the computer scientists are getting trained in ethics or, at least to some extent, that ethics are getting embedded in the curriculum in different ways. At Stanford, we have something called an ethical reasoning requirement that all undergraduates need to take, and that can be anything from one of my classes to perhaps a class that is more, you know, philosophy or some religious class that has a great deal of ethical reasoning. So I definitely think that we need to embed the ethics in the curriculum but not just for computer scientists. We need everybody. We need the humanists. We need the social scientists. We need the natural scientists to understand that all of the questions that we're facing today in society, whether or not we work in those domains, have a critical ethical dimension. And as citizens, we all need to be participating. 

Dave Bittner: You know, I fear that this question may come across as a little snarky, but I do mean it in good faith. And I guess what I'm wondering is, has there ever been a generation that didn't feel as though, compared to the generation before them, they were in some sort of moral decline? 

Susan Liautaud: That's a really interesting question, and I don't take it as snarky because I think the stakes of the generation of today's world, whether it's the generation that's coming through the university education that you're describing or those of us who went through that several generations ago, I think we face unprecedented challenges, and we face unprecedented potential for the spreading of moral decline through technology. So it may well be that each generation had its moral crises, its scandals, its sense that it was somehow failing or its recognition - rather reckoning with the past. Could be about slavery, could be about civil rights, could be about various aspects of human rights or even corporate behavior. You know, we've certainly seen over the past 20 years or so words like governance, accountability and, ultimately, ethics sort of took a while to come onto the horizon. But I would say that the distinguishing feature of today's world and, in particular, the upcoming generation is that technology turbocharges the decline of ethics. 

Dave Bittner: Can we touch on the issue of fake news and how ethics comes into that? I mean, it - I find it very discouraging that, you know, we see, I guess, what I would describe as willful ignorance. You know, you - or you find folks who, you know, are spreading misinformation for their own purposes, for their own profit, for their own, you know, whatever, philosophical reasons. And it's hard to not feel discouraged when you see that and the growth of that day by day. 

Susan Liautaud: So I'm very sympathetic to your sense that it's hard not to feel discouraged or worse. But let me take this at its broadest level. In my view, there is no such thing as ethics without truth. And if we walk through why that is, I provide in the book a four-part framework. It's four words that pretty much anybody who reads the first chapter will have mastered in about 45 minutes. But if we look at principles that we're all accustomed to, say integrity or accountability or honesty or responsibility, they all hinge on truth. 

Susan Liautaud: In addition, when we make decisions, we need information. Well, it's very straightforward. Garbage in means garbage out. We can't make ethical decisions without truthful information. We can't make ethical decisions without a truthful vision of who the stakeholders are that could be affected by our decision or who could affect our decision. And finally, we have no visibility into the potential consequences - the risks we may be triggering or the opportunities we may be either seizing or missing out on - unless we have truth and we commit to looking to the world as it is. 

Susan Liautaud: I think this question of fake news is particularly dangerous today not only because of the technology, but as you're quite rightly alluding to, a willingness to weaponize it, a willingness to use it at best to our own advantage in certain cases and in other cases, to use it to really nefarious ends. So it is particularly dangerous today. It's not just little white lies. It's not just somebody having an affair and hiding it. It is fundamental truths about our society, whether it's the science behind the COVID vaccine or whether it's various other aspects about election-related information that is, you know, truly detrimental to the ethical fabric of society. 

Dave Bittner: Are there examples from history where societies have gone off the rails when it comes to ethics and been able to right themselves? 

Susan Liautaud: I think - first of all, as a general matter, I think ethical resilience - as you put it, being able to right ourselves - is very key. But to link that back to your question about fake news, in order to right ourselves, whether it's individually, organizationally or as a society, we need to be able to tell the truth and the whole truth. And then we need to take responsibility for that truth and commit to a plan to make sure that all the wrongdoing doesn't happen again. 

Susan Liautaud: I can't think right now of a particular moment that I would say, let's look to that moment; they fixed it; they did the full three steps of telling the truth, taking responsibility and making a plan and sort of executing it. What I would say about today is that, again, technology makes it much more dangerous because once things are out there, you never know where they're going to end up. So if somebody tells a lie today, it might end up in the - you know, on the screens of millions of people, if not more. Whereas, you know, in the era of Jane Austen, somebody might tell a lie in a salon and it's going to end up, at worst, sort of around the village. So we're in, you know, particularly challenging times as far as the truth. 

Susan Liautaud: But there's also something very fundamental here about the way individual citizens engage. And that is - and I'd be curious what you think of this - we seem to be in an era where people think - and in particular, the younger generation - that we can create the truth we want, that we can decide because we can program our world. We can decide what music we want to listen to. We can decide what TV shows we want to watch just by, you know, pushing a button on our cell phone or on our iPad. And we can control - we have this false sense that we can actually control the world and that somehow the world owes us to be the way we want it to be. 

Susan Liautaud: Well, that's just not the way the world works. And the biggest example of that is climate change. You know, nature doesn't care what we think. Nature doesn't care about our fake news. Nature is going to do its own thing. And there's this wonderful ad in the UK on television, this very powerful - visually powerful ad that says, man needs nature; nature doesn't need man. It probably should have said human, but in any event, I mean, you get my point, which is that... 

Dave Bittner: Yeah. 

Susan Liautaud: ...We have a generation for whom, you know, not just laptops, but mobile phones, tablets are almost an appendage. And it's as if they, you know, feel that they can just decide that they want the world to be a certain way and the world owes them to be such. 

Dave Bittner: Yeah. I think that's a really interesting point. And I wonder - you know, it's - I guess, at the danger of falling into, you know, old-man-yelling-at-clouds... 

Susan Liautaud: Right. 

Dave Bittner: ...Zone, you know, I will see, you know, kids waiting for their friends to show up at a movie theater or something and seemingly unable to simply stand there and wait. You know, they have to be on their mobile devices. And I try to remind myself that, you know, just because something is different, that doesn't necessarily mean that it's better or worse; it may just be different. 

Dave Bittner: But I guess I worry about the algorithmic amplification of things and the ability for those algorithms to place people inside of bubbles, for the algorithms to sense that, oh, when I put this in front of you, you engage with it and so I'm going to put more in front of you. We often talk about, you know, on this show, like, just because you can do something doesn't mean that you should. And I think that, particularly for the social media giants, that's a trap they've fallen into: this notion of, you know, move fast and break things. Well, you know, if the only way I can run a - my successful factory is to pollute the river that runs next to the factory, well, maybe I shouldn't be running that factory. And I can't help wondering if that's where we are with some of the social media giants today. 

Susan Liautaud: Well, there are a couple of really important things in what you've just said. And if I may, I'd like to start with the example of the children waiting for a friend at the movie theater. 

Dave Bittner: Yeah. 

Susan Liautaud: That word "wait" is mission-critical for ethics. And the number of times I suggest to people, take a breath, press pause, think, let's go through what - who are the stakeholders here? What are the actual and potential consequences, not just in the short term, but in the medium and long term? And that doesn't mean put roadblocks up against innovation. Sometimes it means be careful that we're not too conservative. Be careful that we think about the fact that we may not need an innovation in the U.S. - say, driverless cars - but in certain developing countries, if we don't get driverless cars, the astonishing rates of road deaths will continue because they don't have safe roads, because they don't have medical care, because they don't enforce rule of law like speed limits. So this idea of pressing pause and patience is something that the technology seems to be taking away from just basic growing up, to use your example of children in movie theaters. And then it becomes a habit for all of us, and things have to be faster and faster. We no longer wait for a fax to come. We expect it to come, you know, instantaneously by text or email or some other communications channel. 

Susan Liautaud: With respect to the social media companies, I couldn't agree more. I think that they present a particular set of ethics opportunities and risks. And when we look at the astonishing statistics of teenage suicides, when we look at the epidemic of perfectionism - things like young girls creating selfies, sharing them on social media and then going to plastic surgeons and telling the plastic surgeon to make them look like their selfie or make them look like their friend's selfie - when we see the kind of bullying and harassment or manipulation of voting, manipulation of democracy, that is a whole other level beyond even many of the other technology companies. And I couldn't agree more with you that we have moved too fast. We have broken too many things. And even if we're looking at it as a cost-benefit analysis, we are not in the right place on that, in particular with social media. 

Dave Bittner: You know, as we wrap up our time together, I mean, do you have any advice for those of us who want to try to contribute to making this situation better? What's the best use of our time, our talents, our treasures to try to move the needle in the right direction? 

Susan Liautaud: So some of this is going to sound very basic, and it's a wonderful question in particular because, as I said earlier, my personal mission and the way that all of my different areas of ethics work comes together is democratizing ethics. It's saying that as citizens, as individuals, whether or not we're citizens, we have tremendous power. 

Susan Liautaud: The first point is to be careful with our own decision-making. That can be decision-making around how we use social media, but it also can be decision-making around family and friends. Are we open-minded about having friends whose political views differ, you know, to the extreme from our own? Are we - or are we creating our own bubbles? - a whole wide variety of questions that I'm actually going to be exploring in a new book. Then how do we use technology? You know, are we really thinking about the impact of using Spotify for free on the artists if we can afford not to use it for free? How much time do we spend on social media and what do we use it for? Are we using it for very select reasons? Third - very fundamental things about the way we participate in society, like voting. The ethics of our leaders come down to how much the citizens are going to vote for ethically inclined leaders and hold them to account. 

Susan Liautaud: And then finally, just in the workplace, for example, be mindful that ethics can become a habit. Almost every decision we make, we can, without much thinking about it, integrate ethics with just a little bit of training. And it's some of the work that I lay out in the book, but it's work that I do with organizations generally, and it just becomes a habit where at least we're sort of thinking as we're doing. And from time to time, we're saying to ourselves, wow, I really need to press pause. And then I guess finally, just commit to truth. Commit to truth and, you know, what products you're going to use, the people you're going to engage with, how you're going to be using social media, et cetera. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: I really liked how she talked about democratizing ethics, that this isn't something that should be handed down by some grand ethics board in the ivory towers of academic institutions or big tech companies - that this is something that we should, you know, all have a say in, in digital ethics. And, you know, I think democratization of anything is, at least on first glance, always a good idea. And for something as serious as ethics, it was just interesting to hear her say that, and it's something I certainly agree with. 

Dave Bittner: Yeah, absolutely. Well, again, our thanks to Dr. Susan Liautaud for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The Caveat podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.