Caveat | Ep 58 | 12.16.20

We don't know what we need until we do: facial recognition.

Transcript

Jennifer Strong: The dreamers and techno-optimists who build things - you know, maybe it really shouldn't be their job to also regulate themselves.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at potential antitrust suits against Facebook. I look at anti-hate speech laws in Germany that are serving as models for authoritarians around the world. And later in the show, my conversation with Jennifer Strong. She is the host of the new MIT Technology Review podcast "In Machines We Trust." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we've got some good stories to share this week. Why don't you start things off for us? 

Ben Yelin: So my story is a Washington Post story from Tony Romm on their technology page. And one thing that's going to satisfy a lot of people emotionally is that Facebook is about to get sued... 

Dave Bittner: (Laughter). 

Ben Yelin: ...As part of a massive antitrust lawsuit. So more than 40 attorneys general across the United States - both Republican and Democratic attorneys general - are going to file, I think the day that we're recording this, a lawsuit alleging anti-competitive, unlawful tactics on the part of the world's biggest social media company, I believe. The particular allegation in this suit is about Facebook's purchases of Instagram and WhatsApp. They think that this represents a broader pattern of Facebook trying to neutralize competitive threats in a way that's a threat to a free market and that would violate our country's antitrust laws and our principles about fostering competition. And this is going to be a coordinated effort with the Federal Trade Commission, which is going to be going through its own enforcement procedures as it relates to Facebook, so sort of all hands on deck against the social media giant. 

Ben Yelin: So they're seeking a bunch of different, pretty drastic remedies. Some of them would be forcing Facebook to sell off some of its business assets to address competition concerns. Whether that would be them letting go of Instagram or WhatsApp, it all depends on the particulars of the case. There's some rumor that, as part of this lawsuit, state attorneys general are going to petition a federal judge to require Facebook to inform them before proceeding with any significant future transactions. So this would be basically giving a judge a veto point over business transactions by forcing Facebook to disclose these potential transactions to state attorneys general. 

Dave Bittner: Now, by transactions, do you mean buying up other companies? 

Ben Yelin: Buying up and destroying other companies, yes. 

Dave Bittner: (Laughter) Do you think destroying is perhaps a strong term, Ben? 

Ben Yelin: Yes, it is a strong term. Let's be fair here. 

Dave Bittner: Maybe absorbing (laughter)? 

Ben Yelin: Yeah. So let's look at this from Facebook's perspective because I think that's - you know, that's the only fair way to do it. So Mark Zuckerberg has said that when Facebook purchases, like, you know, the Instagrams and WhatsApps of the world, he, Zuckerberg, and Facebook are allowing these companies to grow into viable services in a larger market. And he said that there are other competitors in the market, such as TikTok, which Facebook has not purchased, that are still able to thrive. 

Ben Yelin: The official statement from a Facebook spokesman - so kind of the company line on this - is, quote, "a strong, competitive landscape existed at the time of both acquisitions. This was looked at by regulators, et cetera. They rightly did not see any reason to stop these acquisitions." But I think what these attorneys general are going to allege - and it's backed up by significant evidence - is a wide-ranging pattern of anti-competitive behavior, anti-competitive tactics, et cetera. 

Ben Yelin: As it relates to WhatsApp in particular, Facebook, when it purchased WhatsApp, promised that it would maintain the end-to-end encryption that makes WhatsApp so popular. But, you know, Facebook has kind of abandoned that as a goal. They've sought to integrate their user data with Facebook's other social networking services, which goes against sort of the spirit of WhatsApp, its raison d'être. Pardon my poor French pronunciation there. 

Dave Bittner: (Laughter) The other side of this could be that when Facebook has purchased these companies, it's not like they purchased them in order to shut them down. They kept running. They kept doing the things they were doing. 

Ben Yelin: Right. It's not like Peter Thiel, you know, buying Gawker for the purpose of shutting it down. That's true. 

Dave Bittner: Right. And it's also not like they already had their own existing version of WhatsApp, and then they bought WhatsApp in order to shut it down, to eliminate the competition that way. So, I mean, I suppose this comes down to the broader definition of competition - anything that is a social media platform online. 

Ben Yelin: Yeah, I think that's part of it. I mean, that's going to be Facebook's defense, is we made these purchases as part of a mutual agreement with these companies so that we could increase these companies' prominence, you know, pay them off, use our leverage as the biggest social media giant out there to promote this product and to foster its development. And that's consistent with fair, competitive trade practices. 

Ben Yelin: You know, I think what the government is going to allege is that these are actually anti-competitive practices. You know, sometimes Facebook - because they have this trove of information on all of us and on these companies in particular, they've tried to weaponize it to quash potential rival developers, sometimes using kind of quasi-intelligence methods because, you know, oftentimes their competitors have users log in via their Facebook accounts. And so Facebook has access to that information. 

Ben Yelin: You know, and the lawsuit is also expected to mention the fact that Facebook is so ubiquitous. It occupies all of our eyeballs with its billions of users across the world. And when it purchases these platforms to collect a greater share of advertising dollars, which in turn allows it to augment its own power, it has more money it can invest in more smaller companies. 

Ben Yelin: You know, this is sort of what Teddy Roosevelt tried to do in the early 1900s in the Progressive Era - was to prevent this type of consolidation from happening, this exact scenario where one company becomes too powerful. They can leverage their financial assets and their influence to further erode competition. And eventually, this filters down and negatively impacts the user experience, and I think that's what the allegation is going to be here. Users are losing competition among social media services and also some of the features that they appreciated from some of these companies that Facebook purchased, like end-to-end encryption, that type of thing. 

Ben Yelin: Yeah. This is sort of the opening salvo, you know? And I think for people who are critics of Facebook, this is a day to crack open the champagne but not a day to spray champagne across the locker room as if... 

Dave Bittner: Right. 

Ben Yelin: ...You've just won the World Series. 

Dave Bittner: As you say, this is the opening salvo. Is there talk that this is the first of many steps? - because I would imagine there are folks who say, you know, this is sort of nipping around the edges, where Facebook proper is the real issue here. 

Ben Yelin: Yeah. My inkling here - and I think I'm right about this - is this is the start of what's going to be a 10-year to 15-year project on the part of regulators at both the state and federal level to cut against the anti-competitive practices of Facebook and other social media giants. These types of suits take forever because, you know, the extent of discovery that's going to have to happen to build a case like this is just so vast. There's going to be dueling motions. I mean, we're talking about attorneys general, and we're also talking about Facebook. I mean, they have pretty much unlimited resources to use every legal tool at their disposal to try and get this dismissed or, you know, to try and cut against some of these allegations. So we're going to be in for really a long process without any reasonable prospect for resolution in the short term. 

Ben Yelin: What that means is that this is as much a political statement as it is a legal filing. It's sort of a shot across the bow to Facebook, saying, you might, in the future, have to face some accountability. And in turn, you know, that might inspire Facebook to check itself as it relates to its anti-competitive practices. And that might be sort of some of the motivation here. Even if this particular case is going to go on for 10 years, when, you know, the federal judge who's been assigned to this case has retired and... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...Moved to The Villages in Florida or something, I still think this is - so much of this is symbolic. And... 

Dave Bittner: Second term of Eric Trump's presidency (laughter). 

Ben Yelin: Yeah. I think we might be getting into Barron Trump's presidency... 

Dave Bittner: OK. 

Ben Yelin: ...By the time this case is resolved. 

Dave Bittner: Fair enough. Yep - fair enough. 

Ben Yelin: So, yeah, this is the start. You know, I think it's important and notable that, perhaps for different reasons, this has some bipartisan appeal. It's being led by Tish James, who is the attorney general in New York state. Obviously, she's somebody who comes from the liberal side of the political spectrum. But there are a lot of very conservative AGs - attorneys general - across the country who are joining this lawsuit and are just as skeptical of Facebook and critical of its anti-competitive practices. So giving it a bipartisan veneer, I think, separates it from a lot of other political and legal issues where people kind of retreat into their own ideological corners. And that might give this lawsuit a little bit more staying power. 

Dave Bittner: Now, looking at the big picture here, you know, my recollection from, you know, back when I was a young lad was the breakup of AT&T. Is that the last time we've had the government coming at something this big? - because it seems to me like certainly lately, there's been very little pushback on consolidation of power like this. 

Ben Yelin: Yeah. I mean, I think the biggest case in relative recent history that might measure up to this was U.S. v. Microsoft back in the very early 2000s. That was probably the most high-profile case under our antitrust laws, the Sherman Act, back in 2001 in front of the D.C. Circuit. And that was, you know, probably one of the first antitrust cases dealing in this digital world. 

Ben Yelin: But it has been a long time, and we don't trust-bust the way that it was envisioned some 110 years ago. Particularly since the 1980s, I mean, we've given corporations in all industries a lot of leeway to consolidate, to purchase smaller companies, all in the name of supporting the free market. So you're right that this really isn't something that's very common. We don't see a lot of antitrust cases in the technology industry or, frankly, in any industry. So there is something that's kind of novel about this. 

Ben Yelin: You know, I'd note that if I were a dictator or if - you know, God forbid - or if somebody who was smarter than me were a dictator and I could influence them... 

Dave Bittner: (Laughter). 

Ben Yelin: ...I'd instigate a lot more antitrust lawsuits in a lot of different industries because I think there are a lot of anti-competitive business practices out there. 

Dave Bittner: Yeah. 

Ben Yelin: There are a lot of reasons why that does not happen, but it makes it that much more significant when we do see a case like this. 

Dave Bittner: Yeah. I have to say personally, I'm very cynical about this sort of thing. I mean, I think about - you know, whenever we have these consolidations, I think about, you know, cable TV companies buying up their competitors and broadcast companies and publishing companies. And they all go before the regulators, and they say, this will be good for consumers. And that never happens. It's never good. We always end up with higher fees, terrible customer service. I don't know. It's like Charlie Brown with the football, you know? 

Ben Yelin: Yes, absolutely. 

Dave Bittner: (Laughter). 

Ben Yelin: You're a hundred percent right. My frame of reference - and those are both good frames of reference. For me... 

Dave Bittner: Yeah. 

Ben Yelin: ...It's always the airline industry. 

Dave Bittner: Oh, yeah, yeah, yeah. 

Ben Yelin: I think part of it is that, you know, I'm not that old, but as a kid, air travel was very different because there was a lot more competition for domestic air travel. So there weren't things like checked baggage fees, all the sort of nickel-and-diming... 

Dave Bittner: Right. 

Ben Yelin: ...That happens now. Flights were more comfortable. You weren't packed in like sardines. 

Dave Bittner: People used to dress up to get on an airplane (laughter). 

Ben Yelin: Right. And I think airlines really valued customer service because... 

Dave Bittner: Right. 

Ben Yelin: ...As they used to say - you know, I know on United, the announcement would always be, we know you have a choice of airlines. Turns out we don't really have much of a choice of airlines anymore. 

Dave Bittner: Yeah. 

Ben Yelin: There's, like, four main domestic carriers because they all, you know, purchased up their competition. And, of course, the consumer has suffered. Yes, as a general proposition, you know, flights are probably cheaper than they were... 

Dave Bittner: Yeah. 

Ben Yelin: ...A generation ago. 

Dave Bittner: Yeah. 

Ben Yelin: But in terms of the user experience, it is considerably worse. And that's what happens with industry consolidation. And so I think that's the fear here - is that, particularly in a context of protecting our privacy, once Facebook controls the entire market, they have less incentive to care about the user experience. 

Dave Bittner: Right, right. It's like that old Lily Tomlin comedy routine. 

Ben Yelin: We don't care. Yeah. 

Dave Bittner: We're the phone company. We don't have to care. 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter) All right. 

Ben Yelin: That's exactly what it is. That's exactly what it is. 

Dave Bittner: Yeah. Yeah - good times, good times. All right. Yeah - interesting story. And we will buckle up for the next decade or more to watch how this plays out, right (laughter)? 

Ben Yelin: Yeah, we're going to have to keep this podcast going for a decade so we can follow... 

Dave Bittner: Yeah. 

Ben Yelin: ...The progression of this case. 

Dave Bittner: Oh, boy. Yeah, buckle up, buttercup. Here we go. All right. My story this week comes from the folks over at Foreign Policy. It's titled "Germany's Online Crackdowns Inspire the World's Dictators." This is an interesting case of perhaps unintended consequences, and I'm really looking forward to your thoughts here, Ben. 

Dave Bittner: So the German government was dealing with a bunch of right-wing extremist violence. And one of the ways they pushed back on this was to enact tough new measures against online hate speech, and part of this involves putting restrictions on social media companies like Facebook. And initially, Facebook had agreed to a voluntary deal where they would remove content that Germany had deemed illegal within 24 hours. But - surprise, surprise - Facebook was a little slower than the German officials would have liked (laughter). 

Ben Yelin: Man, we're really ganging up on them today. 

Dave Bittner: I know. Well, gosh. 

Ben Yelin: I have a feeling they're going to delete some of my pictures in retaliation or something. 

Dave Bittner: Right, right, right, exactly - poor Facebook. 

Ben Yelin: Yeah. 

Dave Bittner: Poor Facebook. So the German government enacted a new law which imposed what they call intermediary liability for social media networks. And with this, any content which is manifestly unlawful must be removed within 24 hours. For all other unlawful content, the deadline is seven days. And failure to remove illegal content is punishable by fines of up to 50 million euros, which is about $55 million. 

Dave Bittner: Now, the folks who look out for these sorts of things - they say that this gives the private sector the role of policing online speech without any transparency or due process. And what's interesting about this is that some authoritarians around the world, countries that are not so interested in free speech, have basically taken this example from Germany, copied and pasted it into their own rules for online discourse, and are using it to clamp down on free speech. So, you know, again, the folks who are interested in free speech around the world - they're saying, be careful what you ask for. Germany went at this with all good intentions, of course. But be careful of unintended consequences. What do you think here, Ben? 

Ben Yelin: It's a fascinating story. I mean, I think we have to look at this from a couple of different angles. Germany comes from a unique place in their political history. 

Dave Bittner: Right. 

Ben Yelin: As part of their kind of denazification process, they have - not anti-free speech statutes or practices exactly, but a culture of prohibiting speech that evokes the country's most morally objectionable period. So, you know, you are prohibited from spewing Nazi propaganda in Germany in a way that you are not prohibited from doing in the United States. 

Dave Bittner: Right. 

Ben Yelin: And that's part of their political culture born out of their history. So I really do trust that they had good intentions here. And we talk all the time about the danger of disinformation or the abuse and harassment that comes from online communication. So I think you have to respect - if not the means of achieving this goal, then the goal in and of itself. 

Ben Yelin: But I think the broader lesson here is the slippery slope once you institute these types of content-based restrictions on speech. And that's why our political culture is so gung ho about protecting our First Amendment rights and protecting against these types of content-based restrictions where the government or the private sector are regulating certain categories of speech because that can lead you to a place, like this article mentions, where governments might start censoring criticism of themselves or other information that they think would be harmful for the public to see because, you know, that might cut against the government's own propaganda efforts. 

Ben Yelin: So I think there really is a lesson there. It is a law of unintended consequences. I don't think when Germany passed this law, they were considering that authoritarian countries would use this as a model in their own countries to stifle opposition, to stifle criticism of the government. But that is the logical endpoint of many of these types of laws. And it kind of - despite all of the trouble that comes with it, it kind of makes you appreciate our own political culture, which is generally very skeptical of these types of laws. 

Dave Bittner: Yeah. One of the things this article points out that was interesting to me is that this provides cover for those authoritarian governments. They can say, look; we're doing the same thing that Germany does. We're not doing anything different. Germany is a country that's committed to democracy and the rule of law and human rights, and we've pretty much just copied their policies. So to the rest of the world, you know, what's your beef here? We're just following the pattern of a powerful democracy from Europe. 

Ben Yelin: Right. This is an advanced Western democracy. They're very enlightened. All we did is copy and paste their statute. I mean, so much of it is going to be in the enforcement of it and what kind of speech is considered the type of speech that could be censored on social media platforms. That's really where the leeway comes in. So, you know, that's where Germany's parsing of this language in the statute matters - if it is going to be adopted by authoritarian countries, that's when it becomes dangerous, because I don't think Germany has the intention of enforcing this, or even does enforce this, in a way that stifles legitimate political opposition. 

Ben Yelin: But the plain letter of the law would allow governments to do this, and that's exactly what they have been doing in countries like Venezuela, Vietnam, India, Russia, Malaysia and Kenya - those are the ones mentioned in this article. And you know, the definitions that these governments come up with of the type of speech that can be banned end up being overly broad. They talk about a Russian bill that was signed into law by Vladimir Putin in their purely constitutional process that bans, quote, "unreliable information." And that definition, you know, is one of the broadest legal terms of art that I've ever heard. It's socially significant information disseminated under the guise of reliable messages which creates a threat to life or the health of citizens or property, the threat of mass disturbance of public order and/or public safety or the threat of creating or impairing the proper operation of vital elements of transport or social infrastructure, credit institutions, energy facilities, industry and communications. That can pretty much encompass any type of speech that the government might want to regulate. 

Dave Bittner: Yeah. 

Ben Yelin: So it really is overly broad. And again, I think the point of this article is they can say, well, we're not doing anything different than this enlightened first-world democracy in Germany. So, you know, even though Germany came to this with the best of intentions, as you say, it really is the law of unintended consequences here. 

Dave Bittner: Yeah. All right, well, as always, we'll have links to the sources of our stories in the show notes, so we encourage you to check that out. It's time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING, RINGING) 

Dave Bittner: Our note this week comes from someone who has requested anonymity, so we will respect that. And they write, I live in South Africa, and they're in the process of passing a new cybercrimes bill. Essentially, what they are proposing, if I understand it correctly, is that they want to hold people liable for the messages they send on messaging services. This is all good and well. But what baffles me a little about this process is how they would get access to these messages. They state that the bill also imposes obligations on electronic communication service providers and financial institutions to assist in the investigation of cybercrimes. Surely this is not feasible because if WhatsApp is really making use of end-to-end encryption services, they shouldn't be able to access your messages. I'm not asking for a critique of our regulations, as you wouldn't be familiar with them, but rather, I'm using it as an introduction to my question below. 

Dave Bittner: Question - regulations regarding the privacy of instant messaging platforms are probably highly specialized to a country or state. So I just want to know what these processes look like in the U.S. If I'm a victim of verbal abuse via WhatsApp, I can surrender those messages willingly. However, if I am in a group and get approached by law enforcement asking for screenshots of chats, what recourse would one have to deny the request? What about other institutions, such as your employer, making such a request for an internal investigation? The question Ben would pose is, what does a reasonable expectation of privacy look like on messaging services - not public forums like Twitter, but private ones? I suppose one would need to be very careful when denying a request because if someone else obliges such a request, you may be seen as complicit in whatever act was being perpetrated, possibly indicating that we don't have a reasonable expectation of privacy. 

Dave Bittner: All right. Ben, there's a lot here, but I think it's a very interesting question. What do you make of this? 

Ben Yelin: Oh, it's a great question. I'm very glad this listener wrote in. So a couple of things here - the great thing about WhatsApp and its end-to-end encryption is WhatsApp itself doesn't have access to these communications... 

Dave Bittner: Right. 

Ben Yelin: ...Which means the government can't get it from them. You know, you have no obligation to disclose those messages in any legal sense in most circumstances. And the third party doesn't have access to these communications themselves, so the government is completely out of luck as it relates to those end-to-end encrypted applications. 
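
(For the technically curious, here is a minimal sketch of the idea Ben describes, in Python using the PyNaCl library. It illustrates end-to-end encryption generally - it is not WhatsApp's actual Signal-protocol implementation, and the names and message are made up.)

```python
# Minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Illustrative only - not WhatsApp's actual Signal protocol. The point: only
# the endpoints hold private keys, so the relaying service (and anyone who
# subpoenas it) ever sees only ciphertext.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts to Bob with her private key and Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# The server in the middle relays `ciphertext` but holds no key to open it.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```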

Ben Yelin: Even on other services where you don't have end-to-end encryption - you know, just your email transactions - the prevailing wisdom in our legal system is that you do have a reasonable expectation of privacy in the content of your communications. I think the most prominent court case was a circuit court case back in 2010, United States v. Warshak, which generally held for the proposition that you do have Fourth Amendment rights in the content of your emails and your other stored online communications, and the government generally needs a warrant to access those. 

Ben Yelin: I should mention there has been a legislative effort as it relates to both just your standard stored communications and, frankly, encrypted applications to cut against these rights and to give the government a mandatory backdoor to access these communications for law enforcement purposes. We've talked about legislative efforts in the United States Congress to do that. I know that's something that's been supported by the current attorney general of the United States, William Barr. So this is a dynamic area of the law that's subject to change. But for practical purposes, I mean, you do have a reasonable expectation of privacy in the content of your communications. That's something that our court system has emphasized repeatedly. 

Ben Yelin: I mean, a case I always think of in the context of these questions is Riley v. California, where the Supreme Court said the government needs a warrant to access a cellphone incident to arrest because, you know, cellphones are basically part of our bodies. They contain all of our deepest and darkest secrets. And so we certainly have an expectation of privacy in the contents of our cellphone. And I think that's a principle that's been pretty strongly adopted by the Supreme Court. 

Dave Bittner: Well, there's a couple other details here that this listener is asking about. First of all, what about in a group chat? Is there anything different that comes into play in a situation like that? 

Ben Yelin: It's not different in a group chat in terms of access by law enforcement. I mean, the risk you run in a group chat is that members of the group have access to those encrypted communications. 

Dave Bittner: Right. Somebody is going to rat you out. 

Ben Yelin: Someone's going to rat you out, yeah. 

Dave Bittner: (Laughter). 

Ben Yelin: And you have no reasonable - you've lost a reasonable expectation of privacy. In other words, you should not trust your friends in the group chat. 

Dave Bittner: Right, right (laughter). 

Ben Yelin: We've talked about these types of cases a million times. But I mean, this is a principle that developed out of the Mafia cases back in the 1960s and 1970s, where, you know, people would divulge incriminating information to their buddy who was wearing a wire. They'd say, well, you know, this is a violation of my Fourth Amendment rights. But you have no control over the person that you tell information to. Once you've told them that information, you know, they're going to do what they're going to do. If they want to rat you out to the government, you're basically out of luck in those circumstances. If every member of the group, though, wants to protect those private, encrypted communications, then those same constitutional rights that I discussed certainly still apply. 

Dave Bittner: How about for the employer for an internal investigation? 

Ben Yelin: That gets, you know, a little bit more complicated. I mean, you generally have fewer rights as it relates to your employer because, you know, there could be some term of your employment that grants them access to all of your online communications in the course of your employment. And so when we're talking about end-to-end encrypted applications, I don't think you have any obligation to retain those if you don't want to. I don't think your employer would have a cause of action, you know. But when you're talking about one's individual employer, you don't have as much of an expectation of privacy - particularly for things you do in the course of your employment, they might have access to those communications. You have to be a little bit more careful in that context. 

Dave Bittner: Right. Certainly if you're using a device that belongs to them, if you're using, you know, an email account that is provided by them, then all bets are off. 

Ben Yelin: Yes. Yeah. And it's a little bit different when they're trying to get information from a device that they don't own - you know, your own personal device that maybe you were using for work purposes. So much of that gets into, you know, the terms of employment, which brings in elements of contract law. So that gets a little bit more complicated when we're talking about protecting information from your employer. But you know, generally, you have a lesser expectation of privacy relative to your employer than you do relative to the government, certainly. 

Dave Bittner: Yeah. All right. Well, our thanks to our listener for sending in those thoughtful questions. Good stuff. We would love to hear from you. You can call us and leave us a message. It's 410-618-3720. We might use it on the show. You can also send us a message to caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had a great conversation with Jennifer Strong. She is the host of the MIT Technology Review podcast "In Machines We Trust." Our conversation focused on AI and some of the things that she's tracking with the folks on that podcast. Here's my conversation with Jennifer Strong. 

Jennifer Strong: I'd wanted to make a show about AI, or artificial intelligence, for everybody for a few years, and then I met my editor-in-chief, Gideon Lichfield, at a conference. And I told him - I said, I want to make the "Planet Money" of AI. And he said, OK, fine. So you know, it was a hard sell - right? - because AI is one of those topics where people - it's like their ears just close, and they kind of tune out. The assumption is, it's not for me. It doesn't really apply to me. And you and I both know that's not true. So that's where this came from - just making a show for everybody about AI. 

Dave Bittner: One of the things that strikes me about AI - and I notice, like, you all did a special episode to just describe what AI is. And I think in the general public's mind it's a very fuzzy concept. Is that a bit of an uphill battle for you, putting guardrails on what we're talking about here? 

Jennifer Strong: Yes and no. I think if you think about it for a minute, like, AI is now being used to automate decisions about all these areas of our lives. So I think with, like, the general public or when I call and explain this to my dad, it's like, no, no, this decides who goes to jail on some level, right? It helps inform who we arrest and who we give bail to and whether you get a mortgage or, you know, whether you look like you're cheating on an exam, whether you get into college. So databases that look at you as you're shopping for groceries and decide whether you might be a shoplifter from some database somewhere - like, I don't think it's that heavy of a lift if you just talk about what it means to you and talk about people who've been impacted, right? I think that's all these stories. 

Jennifer Strong: But that's my background - I'm not a tech person; I covered policy for public radio. And, you know, you get sent out on these stories like budgets - right? - or policies that may mean a lot to one person and nothing to another. These are all people stories. So if you make anything a people story, then it's an everybody story because we're all people. 

Dave Bittner: Yeah. One of the things that you point out on the show is this reality that so much of what's going on behind the scenes with AI is happening in private. It's not in public view, that these are largely private companies who are cooking up these algorithms in secret. And that may be at our own peril. 

Jennifer Strong: Yeah. I mean, I don't think any of it's malicious, right? I just think that we're in an interdisciplinary world in ways we never have been. Like, you go into a lab at pretty much any university, and you see all these different folks working together in ways that, you know, not that long ago just would have been unheard of. And so now it's like - it's not really - you know, judges don't build algorithms, and the folks who do aren't judges. But yet, these things - who's in charge, I guess? And whose job is it to regulate these things or talk about these things or disclose these things? And I think that's what we're going to be struggling with. We have been for a decade, right? You go back to, like, Eric Schmidt in, like, 2011 - right? - when he was saying, oh, in democracies, this will all be regulated really quickly. And here we are in 2020, all saying, hey, we should really regulate this stuff. 

Jennifer Strong: I think once we decide, like, who - like, where the buck stops - right? - and how to approach it, it's just a question of when that happens. 

Dave Bittner: You know, I think back that there was a time here in the U.S. before we had a Food and Drug Administration. And you know, anybody could mix up their elixirs and distribute them. And because of that, some people died, and now we have a Food and Drug Administration. And I don't think it's too far off to wonder if we need something like that for some of these social media platforms to say, before you release this to the general public, we need to test it. We need it verified. We need someone with authority to look over it and make sure it's not going to do some damage. 

Jennifer Strong: Yeah. I make a point a few times in the show that, like, cars had to exist before we knew we needed seatbelts. And I think that's going to be the way here, too. There's a lot of things we don't know the outcome until they're created, right? We don't know what we need until we do. A lot of this stuff raises questions about consent - a lot of it, to me anyway. Like, I think as a tech reporter who writes about face ID and who lives here with kids and drives around, I didn't know about face ID being used at the toll booths that we drive through all the time until I did, right? And now that I do, I can't unsee it. And I just - I think there's going to be an aha moment for probably most folks where they realize that they need to participate, ask questions. And that's really all I'm trying to do, is just let people know it exists, find out what exists and start a conversation. 

Dave Bittner: Well, and several of your episodes go through facial recognition and how it's being used. Can you take us through some of the things you learned there? 

Jennifer Strong: Well, we did four parts on facial recognition and policing. And it's truly a meaty topic for us. We've gone and visited other topics, like emotion AI and COVID tracking. We've worked on other topics as well, but we're returning to do four more on face ID just because, again, it's a meaty topic for us - it's easy to show people where it is and what it might, you know, mean to them personally, like where they'll find it. 

Jennifer Strong: Anyway, in terms of policing, what we learned is that police departments are struggling with this topic in towns big and small. And they're all coming to different conclusions, and there's not any, like, overarching policy or set of plans in place. Some folks are fine using Clearview, which operates by comparing their images against our social media images that, you know, we don't keep private. Others don't think that's OK, and so they won't use Clearview. Some just build their own databases. Like, one of the oldest databases is in Florida. 

Jennifer Strong: You've got some folks who will use celebrity lookalikes, meaning - there was a beer thief in New York City, and they couldn't get a match when they ran photos from the cameras. I mean, you think also with face ID - like, it really only works if you have a face-level, straight-on, well-lit image. And we all know that's just not how, like, the camera up on the ceiling works, right? Or if somebody is blinking - I don't know if you ever use the software in your phone that looks through your images and points out the faces; sometimes it doesn't see the face. Well, if somebody is blinking, yawning, in profile, it may not see the face at all. 

Dave Bittner: Right. 

Jennifer Strong: So some folks doctor the images. Some hold up celebrity lookalikes. So in the case of this beer thief in New York City, the police were like, hey, he looks a little bit like Woody Allen. They couldn't get a match with the face, but when they held up Woody Allen's face, they found a match, and it eventually led them to this guy. So some places are fine with that. Some places are really not fine with that. And it just gets back to that kind of Wild West situation right now - who, in the end, gets to decide what is fair play? 
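
(For the technically curious, the point Jennifer makes about straight-on, well-lit images is easy to see with an off-the-shelf detector. Below is a minimal sketch, assuming Python with the opencv-python package installed; the image filename is a placeholder. OpenCV's default Haar cascade is trained on frontal faces, which is one reason a profile or ceiling-camera shot often produces no detection at all.)

```python
# Minimal sketch: frontal face detection with OpenCV (pip install opencv-python).
# The bundled Haar cascade is trained only on frontal faces, so off-angle,
# blinking, or poorly lit faces frequently go undetected - the failure mode
# described above.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("snapshot.jpg")  # placeholder: e.g., a frame from a ceiling camera
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Scan the image at multiple scales for frontal-face patterns.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) == 0:
    print("No face found - likely off-angle, occluded, or poorly lit.")
for (x, y, w, h) in faces:
    print(f"Face detected at ({x}, {y}), size {w}x{h}")
```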

Dave Bittner: And there are real stakes here. I mean, you go into the story about the gentleman who was falsely accused based on facial recognition. He didn't do it. 

Jennifer Strong: No. No, not only did he not do it - this was a man outside Detroit who gets a call at work. You know, he's now fairly well-known; this made some pretty big headlines this summer. But anyway, back in January of this year, he's at work, and he gets a call from police saying, come turn yourself in. And he thought it was a crank call. But it wasn't a crank call. And when he got home from work, there were police in his driveway. And they blocked him in, and they arrested him in front of his kids, and they took him to jail. And he had no idea what was going on. It was for a crime - stealing watches - a crime he didn't commit. He was there overnight before they even told him what this was about. And it was kind of an accident that they let him know that they'd used face ID to name him. 

Jennifer Strong: And you know, Detroit has since decided, we're only going to use this for violent crimes. Also, it sounds like the officers didn't use it the way it was intended. Like, you know, folks have said to me several times, well, just because I call and say I think my neighbor did this thing, it doesn't mean you go arrest this person's neighbor. You're supposed to take some more steps here, folks, to be sure - before you go put cuffs on someone - that you believe they committed something based on more than the computer said so. 

Jennifer Strong: There are stakes, too, I think, if you're in a store and something goes off saying that you've previously stolen - you know, like, a shoplifting database suggests that you might be a match to somebody they have - or, you know, if you're unable to get benefits. And something else that's of interest to me, now that I'm only starting to dig a little deeper into it - you know, homeless services. You need to identify someone coming into a shelter just for safety, but also before you can start any other type of process with these folks. They're unlikely to have a bunch of documents on them or ID. So face ID has been used for a while in Calgary, in Canada, in particular. OK, well, what does that mean? What kind of rules can we put around this so that things are done fairly? 

Dave Bittner: Well, and even basic privacy, the ability to be anonymous - you know, I could think of, you know, folks going to a political meeting or going to - you know, someone going to a gay bar or a medical clinic, something like that. You know, between Point A and Point B, who knows how many times they've walked by some sort of camera that IDs them and puts a pin in the map that they were in this location? 

Jennifer Strong: Well, we talk about that, too, in the first four parts in the series. Or what if there is a camera outside of an Alcoholics Anonymous meeting door, right? Also with the protests, there are a number of examples as well. You think, too, about the difference between public property and private property - like, when you're in a park that's adjacent to a mall, right? As I walked through one in London with one of my sources, we were talking about London's trials of live face ID. And what happens with data when it's public data versus private data, you know, can be completely different things. So just crossing the street can have different ramifications for what happens with, you know, data from your face, with your biometric data. And that's in a place that has GDPR. So what does it mean in the U.S.? Nobody knows. 

Dave Bittner: Is there a basic mismatch between the pace of technology and the rate at which policy is able to keep up? It seems to me like - we say this, you know - technology is getting faster, and the rate of change seems to be increasing also. But I don't think we necessarily see that tracking the same way on the policy side of things. I mean, we make jokes about how our representatives are generally older people who may not be in touch with the bleeding edge of technology. Do you track that issue as well? 

Jennifer Strong: I think that's always been an issue, right? Like, going back to that whole seatbelt example - cars had to exist, and we had to know that we needed seatbelts, before they could be invented. So I don't think that's new. Is it a problem? Yeah. But getting back to that - who is supposed to be the one to know what it is we need? Is it folks who call it the Facebook, or are we letting the people creating the technologies decide how best to regulate what it is they make? Folks at Georgetown Law have pointed out, rightfully, you know, the push comes from industry. But is it really a push for regulation or just a push to know how it is they will be regulated so that they can build it into their business plan? Like, I think the reason that we don't see more policy is we don't yet know who it is who's going to be responsible - right? - for coming up with it. 

Jennifer Strong: And so these interviews that I've done with these tech founders and CEOs who've created a number of these products that are now being used at scale - I think of, like, the NtechLab founder in Moscow, who has created one of the largest live facial recognition systems in the world. You've got 100,000 cameras looking at a lot of different kinds of data at once, right? You're not just running face ID, but you're also scanning license plates, you're looking at how close people are standing, whether they're wearing a mask. Like, there's a ton of different things you can pull through the same video feed. And there's, you know, a billion faces read in a month there. 

Jennifer Strong: So anyway, his line was, well, you know, it's up to people to decide what kind of world they want to live in. And that's kind of the same from Clearview. It's like, you know, we're just here. We'll do what everybody says. It should be regulated. Hey, maybe somebody will regulate it. OK, well, is it really up to you and me? Maybe. 

Dave Bittner: Yeah. I mean, I guess it's easy to be cynical and compare it to someone, you know, dumping chemicals into a river or a lake or something and saying, you know, well, maybe somebody should regulate this (laughter). Do you - are you not aware that you're harming the river here? That is a cynical outlook, but I don't think it's completely off base. 

Jennifer Strong: Well, yes. And the dreamers and techno-optimists who build things - you know, maybe it really shouldn't be their job to also regulate themselves. 

Dave Bittner: What has this experience been like for you as you've been on this exploration and talking to the people you've been talking to? Are there any things that really stuck out to you in terms of the things you've learned or maybe changed the way that you look at things? 

Jennifer Strong: All right. Well, I'm an inherently curious person, so we'll just start from there - and I'm new to a number of these things. Like, I was not covering AI specifically five years ago. The tape from Governor Cuomo talking about ear recognition - that was definitely one of those moments where I had to pause and rewind and say, wait, did he just say that we're trying to read people's ears? And yes - and apparently this is not new. You know, people have been trying to work on this for quite some time. But if you're driving through a tunnel and the camera catches you through the windshield in profile, apparently reading ears could be useful. 

Jennifer Strong: There have been a number of moments where maybe it's captured the imagination. Maybe it's a little creepy. Maybe it's just the fact that I'm kind of nerdy and interested in everything. But mostly, again, it was things like the cameras in stores, or crossing from one sidewalk, across the street, to a different sidewalk, and suddenly your data is going somewhere totally different. 

Jennifer Strong: Thinking about what this could mean for anything, like your social media profile impacting your insurance rates, like, what it means for your kids - all of our kids on Zoom school right now, like mine in the other room here, and realizing that when their tests are being proctored - right? - there's a decent chance that AI is deciding whether or not they're being honest. Thinking about ways in which it could be super useful for our cars to know that we might be drowsier, that - you know, if they could engage with our Fitbits or our smart watches or whatever else and know that maybe we're a little short on sleep or that we're in a hurry and help us out. The flip side is like, OK, five years from now, what does our data trail look like, and what does it mean? 

Dave Bittner: Right. And who's in control of it? 

Jennifer Strong: Right. And I think, at least the path we're on, the answer is nobody and nothing, maybe, and everything. It sort of depends. That's what gets back to the heart of why I want us to have all these conversations, not because I claim to have any answers or suggestions for the way forward. I just want us to all be having the conversations so that maybe we can come up with something. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: You know, some of these interviews are uplifting... 

Dave Bittner: (Laughter). 

Ben Yelin: ...This one, not so much. 

Dave Bittner: (Laughter). 

Ben Yelin: It kind of got into a dark place. That was an excellent interview, and she gives some great insight. She's obviously smart in her field, highly recommend her podcast. But it kind of left me in a place where I worry about where we're going to be vis-a-vis our private information in five to 10 years. 

Ben Yelin: I thought the metaphor she gave about how we needed to drive cars before we knew we needed seatbelts was just very apt as it relates to tech companies collecting our private information. We need to use them before we know how much privacy we need. And that leaves us all vulnerable, particularly when we have policymakers that are not very dynamic, that don't adapt along with technology, where there's often this significant lag time. So you know, I don't know about you, but it certainly didn't leave me in a happy place. 

Dave Bittner: (Laughter) No, I mean, a cautionary tale - but as you say, great to have Jennifer on our show. I highly recommend the "In Machines We Trust" podcast. Really well-done, really interesting stuff over there. So worth a listen; check it out. And we appreciate Jennifer taking the time to join us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the start-up studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.