Caveat 7.7.22
Ep 132 | 7.7.22

Is the American Innovation and Choice Online Act beneficial?

Transcript

Matt Kent: So it's really, like, an instance, at least from a competition standpoint, of the laws not keeping up with sort of where we are in terms of the digital ecosystem right now.

Dave Bittner: Hello everyone and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a new algorithmic model that claims to predict crimes before they happen. I've got the story of a Japanese court ruling that could force big tech to open up its algorithms. And later in the show, Ben's conversation with Matt Kent. He's a competition policy advocate at Public Citizen, and they're discussing the American Innovation and Choice Online Act. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump into our stories this week. Why don't you start things off for us? 

Ben Yelin: So mine comes from Bloomberg News by Carrington York and is entitled "Algorithm Claims to Predict Crime in U.S. Cities Before It Happens." 

Dave Bittner: Oh, that old chestnut (laughter). 

Ben Yelin: I know. And you can already sense my - the skepticism in my voice. 

Dave Bittner: Right (laughter). 

Ben Yelin: So we've previously talked about these predictive policing algorithmic models. 

Dave Bittner: Yeah. 

Ben Yelin: The most famous was one developed in 2012 by the Chicago Police Department together with academic researchers. It was called the Crime and Victimization Risk Model. And it produced what were called strategic subjects - potential victims or potential criminals - based on a bunch of factors, things like age, arrest history, location. And that determined a score of how urgently certain people needed to be monitored or protected, as either a victim or as a perpetrator. An investigation into that model about five years later, in 2017, found that a majority of the people identified through the algorithmic process had no connection to crime at all. There was racial bias in how the formula worked. It was basically completely ineffective. 

Dave Bittner: Yeah. 

Ben Yelin: So there are some entrepreneurial social scientists at the University of Chicago who wanted to try and do it again. 

Dave Bittner: (Laughter). 

Ben Yelin: And I don't mean to disparage social scientists at the University of Chicago. My dad is a social scientist who went to the University of Chicago. 

Dave Bittner: Oh, wow. OK. 

Ben Yelin: So I hope he doesn't take this personally if he listens. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: But this is a new type of predictive policing model, and they tout 90% accuracy, which, I'll note, seems kind of dubious to me. 

Dave Bittner: All right. 

Ben Yelin: But we'll get to that in a moment. So this algorithm, instead of looking at past arrest history or quantifying certain characteristics of individuals, divides cities into 1,000-square-foot tiles. And the researchers used historical data on violent crimes and property crimes to test the model. It detects patterns over time to predict future events. 

Dave Bittner: OK. 
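To make the tiling step concrete, here is a minimal sketch of the kind of preprocessing Ben describes - binning point-level crime events into fixed-size tiles and building per-tile event streams. The study's actual pipeline isn't detailed in the episode, so the data layout and names below are illustrative assumptions:

```python
from collections import defaultdict

# Tile size parameter - illustrative; the episode describes "1,000-square-foot tiles".
TILE_FEET = 1000

def tile_id(x_feet, y_feet):
    """Map a point (in feet on a local planar grid) to integer tile coordinates."""
    return (int(x_feet // TILE_FEET), int(y_feet // TILE_FEET))

def build_tile_series(events):
    """Aggregate (x, y, day, kind) crime events into per-tile daily counts.

    Returns {tile: {day: {kind: count}}} - the per-tile event streams a
    temporal model would then look for patterns in.
    """
    series = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
    for x, y, day, kind in events:
        series[tile_id(x, y)][day][kind] += 1
    return series

# Hypothetical events: (x_feet, y_feet, day_index, crime_type)
events = [
    (120, 950, 0, "property"),
    (480, 700, 0, "violent"),
    (1500, 200, 1, "property"),
]
print(dict(build_tile_series(events)))
```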

Ben Yelin: And they tested their model using kind of back-dated analysis - looking at how it would have performed if they had been using it to predict property crimes or violent crimes in various cities. And it worked relatively well, the study claimed. So they tried it in Atlanta, Los Angeles, Philadelphia - all about 90% effective. I have a couple problems with this. First of all, if this is something that's going to be used by law enforcement, then the government is getting involved and could be introducing tactics similar to the ones we saw in 2017, where certain neighborhoods are targeted based on their characteristics. It's hard to separate a 1,000-square-foot tiled neighborhood from the demographic characteristics of that neighborhood. So you kind of run into the same problem, where you are maybe introducing an additional level of policing based on either racial characteristics or socioeconomic characteristics... 

Dave Bittner: OK. 

Ben Yelin: ...Without actually delving in to solve those societal problems. 

Dave Bittner: OK. 

Ben Yelin: So that's certainly one potential concern. The other thing is, when designing any study like this, there is some human involvement in designing the algorithm. So it's just impossible to design something that doesn't have a bias when a bunch of social scientists sit in a room and determine, what are the key factors that determine where violent crime and property crimes are happening? So, you know, this algorithm might be better because it introduces some more complicated calculations, like, for example, they mention, what happens to the rate of violent crime if property crimes go up? That is interesting. All of this is interesting from an academic perspective. Where we've seen this fail in the past is, you know, we have a lot of cities that are dealing with a very serious violent crime problem. 

Dave Bittner: Yeah. 
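Ben's example just above - what happens to the rate of violent crime when property crimes go up - is essentially a lagged-correlation question. A toy illustration of that calculation, with hypothetical daily counts for a single tile (not the study's actual model):

```python
def lagged_correlation(property_counts, violent_counts, lag):
    """Pearson correlation between property-crime counts and violent-crime
    counts `lag` days later. Toy illustration only."""
    x = property_counts[:len(property_counts) - lag] if lag else property_counts
    y = violent_counts[lag:]
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y) if sd_x and sd_y else 0.0

# Hypothetical two weeks of daily counts for one tile.
prop = [3, 4, 2, 5, 6, 4, 3, 7, 8, 6, 5, 9, 7, 6]
viol = [1, 1, 0, 1, 2, 1, 1, 2, 3, 2, 2, 3, 2, 2]
print(lagged_correlation(prop, viol, lag=2))  # does property crime lead violent crime?
```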

Ben Yelin: And they are going to be looking for any tools to help with things like predictive policing. And we've seen how, from a policy perspective, that's failed in the past. And I don't see anything in this algorithm that would make it immune from those types of failures. That was the nature of the critique from Emily Bender. She's a professor of linguistics at the University of Washington, and they mention her in the story. She wrote a series of tweets critiquing this algorithm, basically saying that you should target the underlying inequities instead of doing this type of predictive policing. And also, they're really focusing on a specific set of crimes that are more common among certain socioeconomic or demographic groups. And they're not focusing on things like white-collar crime, securities fraud, environmental crimes, you know? So the people... 

Dave Bittner: All that broken windows kind of stuff. 

Ben Yelin: Exactly. Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: Which - that stuff certainly matters, and it certainly has implications for quality of life. I also think an algorithm like this, if it's going to be used by actual policymakers, could subject jurisdictions to legal problems, because there were lawsuits against the previous algorithm within the Chicago Police Department - basically, that it was biased against people who were not actually perpetrators of crime. But they were subject to additional searches and surveillance simply because they had been identified by this faulty tool. So it's certainly worthy of reading for academic purposes, if you know anything about building these types of algorithms. The study was published in the journal Nature Human Behaviour on June 30, but I certainly have my doubts as to the efficacy of something like this. 

Dave Bittner: Now, OK, so couple things. 

Ben Yelin: Yes. 

Dave Bittner: To what degree is this coming at the problem differently because they're going after blocks of real estate versus individuals? 

Ben Yelin: So they claim that location is more accurate than going after the profile of individuals. That kind of - and I don't know how to speak like a true statistician, but that kind of presents a correlation-causation problem because certain people live in certain areas. 

Dave Bittner: Right. 

Ben Yelin: And that's not always just an accident of who buys a house where. It's sometimes the result of very deliberate policies over the past 100 years that have forced certain individuals and demographic groups to live in particular areas that have become dangerous. 

Dave Bittner: Yeah. 

Ben Yelin: And, you know, these past couple of weeks, I had the opportunity to drive around different parts of Baltimore that I had not been to. You can see that neighborhood blight just kind of creates this never-ending cycle of violence and poverty - that you can't really separate the 1,000-square-foot area from the actual people who live there. So I'm not sure how differently an algorithm based on place is going to do from an algorithm based on personal characteristics, just because I think it's hard to separate those two. 

Dave Bittner: Yeah. Let me be the bad guy here, and... 

Ben Yelin: Easy role for you to play. 

Dave Bittner: (Laughter) It's just - it's the part I was born to play. I think every police officer, certainly, and probably you and I, just as citizens, know where the good parts of town are and the bad parts of town are, right? Like you said, you were driving through parts of Baltimore. There are places in Baltimore - there are places here where we live - where you'd be less likely to go take your kids on a walk because you just don't feel comfortable there, right? You don't feel like it's safe. 

Ben Yelin: Right. 

Dave Bittner: So given that that is a reality - and I think it's in our best interest to acknowledge it exists - in the old days, I would imagine this sort of data was just gathered either by word of mouth or, you know, whatever. If someone's trying to automate that and increase the granularity of it, you know... 

Ben Yelin: Right. 

Dave Bittner: ...How small a block we're talking about, is the downside that there are all kinds of innocent people who are just, you know, getting on with their lives and doing their best that they can who would get caught up in this algorithm? 

Ben Yelin: Because they just happened to be in the wrong area. I mean, that's where this runs into the same problem as some of these previous algorithms. There, you just happened to have characteristics that were suspicious to social scientists and law enforcement. And now you have spatial areas that are suspicious to social scientists and law enforcement. 

Dave Bittner: Let me ask you this. Would you be OK with this algorithm if it were being used to provide funding for social programs? 

Ben Yelin: That is a great question. 

Dave Bittner: Smarty-pants. 

Ben Yelin: I know. You might have gotten me there. 

Dave Bittner: (Laughter). 

Ben Yelin: Well, yeah. First of all, I think that's different because... 

Dave Bittner: OK. 

Ben Yelin: ...And there really is a true distinction here. You're not depriving people of their personal freedom or... 

Dave Bittner: Right. Good point. 

Ben Yelin: ...Overpolicing them and harassing them... 

Dave Bittner: Right. 

Ben Yelin: ...On the basis of social services. 

Dave Bittner: Sure. 

Ben Yelin: And all of the models that we use to dole out social services to localities and states are based on some type of spatial analysis. Really, that's just the United States census - how many people live there. But... 

Dave Bittner: Yeah. 

Ben Yelin: ...I mean, we do all different types of economic surveys to figure out where the greatest amount of economic need is. And so that's kind of already done to an extent. But I think that's different than predictive policing because there are just a lot of implications that come with predictive policing. It's not just the tangible effects of, you're falsely arrested because you happen to be in the wrong area. It's also the intangible effects of always feeling like you're under some type of increased suspicion - where even if your civil liberties aren't directly being curtailed, you kind of feel like you're always under a watchful eye just because of where you happen to live. 

Dave Bittner: Yeah. 

Ben Yelin: So again, I think that's a problem that's inherent to any predictive policing algorithm. With this one, to its credit, the researchers are trying a different model by recognizing that previous intelligence-based algorithms produce limited insight into the social system of crime. While they might help with criminal surveillance, they also introduce certain types of systemic biases. And it's good that they're recognizing that. I'm just not sure that this solution is the panacea to the problem of predictive policing algorithms. I don't think there is a panacea to the problem of predictive policing algorithms. I think it's kind of a doomed enterprise. Until proven otherwise, I will always cast a skeptical eye on these types of studies, just because you really can't predict individual crimes before they happen with any degree of certainty without putting an undue, watchful eye of suspicion on either a geographic area or a demographic group. And I think that's kind of fundamentally unfair and not the role of social scientists who are nerdy enough to develop an algorithm, in my view. 

Dave Bittner: But you can predict, you know, neighborhoods that are likely to have more crime than others - right? - based on history. 

Ben Yelin: Yes, you can. But the implications of that, and whether that's going to be used by law enforcement - we've already shown how that can introduce biases that unfairly ensnare individuals. Maybe that's not going to happen when you're doing it by 1,000-square-foot segments of a city. But I can certainly envision a situation where it would end up capturing some of those people just by being in the wrong place at the wrong time. 

Dave Bittner: So is this a case of tread lightly with extreme care? In other words, don't go in willy-nilly with something like - I hesitate to throw out a tool like this at all because I think, you know, using history to predict what may happen somewhere is a useful method, right? But I also recognize the point you're making that we have a history of being unfair to certain groups - right? - in our policing. So (laughter)... 

Ben Yelin: Yes. I would say a couple of things. 

Dave Bittner: So maybe acknowledge that as well. 

Ben Yelin: Maybe, one, I'm being too harsh on the social scientists involved here... 

Dave Bittner: Yeah. 

Ben Yelin: ...Because I think simply doing a social science study to develop this algorithm might be useful if they can actually prove that, with 90% accuracy, they can predict property and violent crimes. I am dubious that that could actually be something replicated in a majority of American cities... 

Dave Bittner: Yeah. 

Ben Yelin: ...Particularly areas where things aren't so neatly divided into 1,000-square-foot geographic spatial segments. But I think there's nothing wrong in trying to do this as kind of an experiment in social science tools that can help solve crimes. We have a major violent crime problem in this country right now. And I think any idea that's introduced to try to alleviate that problem is welcome. I'm just wary of law enforcement making the same mistake they did in 2012 - purchasing this algorithm and using it through some sort of software to actually do predictive policing. That's where the negative consequences come in. 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: So I think it's useful as an academic tool if it can be replicated, if it's something that's peer reviewed and it's not just they stumbled upon a 90% accuracy rate because of some type of correlation-causation mishap. But when it's actually used by police departments, that's where you face this potential for harassment, abuse and over-surveillance. And that's something that I would be wary of. 
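One concrete reason a headline accuracy figure deserves the scrutiny Ben gives it: when the predicted events are rare, a trivial baseline can score very well on raw accuracy. A small sketch with made-up numbers (nothing here comes from the study):

```python
def accuracy(predictions, actuals):
    """Fraction of predictions that match what actually happened."""
    return sum(p == a for p, a in zip(predictions, actuals)) / len(actuals)

# Hypothetical week-ahead outcomes for 1,000 tiles: crime occurs in only 10% of them.
actuals = [True] * 100 + [False] * 900

# A "model" that just says "no crime anywhere" is 90% accurate - which is why
# accuracy alone says little without precision/recall and peer-reviewed replication.
always_no = [False] * 1000
print(accuracy(always_no, actuals))  # 0.9
```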

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes. My story this week comes from the Financial Times, and it's titled "Japanese Court Ruling Poised to Make Big Tech Open Up on Algorithms." I thought this was a fascinating story, both on its own and for its international implications - I think, sometimes, it's easy for us to forget that there are a whole lot of other countries out there (laughter). 

Ben Yelin: There are? I wasn't aware of that. 

Dave Bittner: (Laughter) Right. And the things that they do, and just the way that their laws and policies are set up, can affect us here. And this is, I think, an example of that. At issue here: a court in Tokyo ruled in favor of a restaurant chain. It's Hanryumura - I think I got that right. 

Ben Yelin: Korean barbecue, I believe. 

Dave Bittner: Yep. Yep. 

Ben Yelin: Yeah. 

Dave Bittner: And the owner of this chain brought an antitrust case against kakaku.com, which is a restaurant review platform - evidently, the largest restaurant review platform in Japan. 

Ben Yelin: The Japanese Yelp, basically. 

Dave Bittner: There you go. 

Ben Yelin: Yeah. 

Dave Bittner: And this restaurant chain argued that the site had altered the way user scores were tallied and that it was hurting sales in their restaurants. And sure enough, the Japanese court ruled against the website, and it had to pay just over a quarter million dollars in damages for abuse of superior bargaining position. Not a term of art I've heard in the U.S. legal system, no? 

Ben Yelin: Not one I've heard either, but... 

Dave Bittner: Which is another thing I find fascinating. 

Ben Yelin: Right. 

Dave Bittner: You know, when you sort of dip your toe into, what are some of the things that other nations have in place in their legal system? 

Ben Yelin: Right. 

Dave Bittner: So what's at issue here is that this may make some of these big tech companies open up their algorithms and not be able to consider them to be trade secrets anymore. 

Ben Yelin: There are a lot of nervous people in Silicon Valley right now. 

Dave Bittner: Right. Right. And so Japan has its own Fair Trade Commission, which is its antitrust regulator. And they're saying that this could do just that - it could make these companies open up their algorithms to make sure that they're treating different organizations fairly. And I think it's interesting to consider the broad effects that could have, because if they open them up for viewing in Japan, that means everybody else is going to have a look at them, too. What do you make of this, Ben? 

Ben Yelin: Right. So these would be public proceedings in Japan. And theoretically, if some of the big guys are targeted - and they mention in this article Facebook, Amazon and Google - if they lose in Japanese court, they might be forced to reveal some of their trade secrets. Our courts are far more protective of economic information. Protecting critical proprietary information, like trade secrets, is just more a part of our legal culture. So that's generally only included in a legal proceeding if it's absolutely necessary to adjudicate that proceeding. And even then, there are ways of keeping trade secrets out of court. I don't know enough about the Japanese legal system to know how different it is in terms of this type of data making it into a legal proceeding. 

Ben Yelin: But it seems like with a statute that passed there recently - the Act on Improving Transparency and Fairness of Digital Platforms - and a similar statute in the European Union, the Platform to Business Regulation, which came into effect in 2020, we now have a couple legal regimes set up that might force these Big Tech companies to reveal at least portions of their algorithms. And that's really terrifying to them because they develop the same algorithms for services that are used all across the world. 

Dave Bittner: Right. 

Ben Yelin: While they might be protected in U.S. courts, once this is revealed in a public proceeding in a Japanese court or a European court of justice, that information gets out there, even though it is a trade secret. For most normal people, it's insignificant. You know, if I saw a sheet of paper that had a portion of an algorithm on it, it would look like Japanese to me, literally. 

Dave Bittner: Might as well be - (laughing) exactly. Right, right. 

Ben Yelin: But, you know, imagine if you're working for a competitor, and that gives you an idea into how to improve your own algorithm. And it all came out of a legal proceeding. So I think our U.S. courts are just far more careful about protecting that proprietary information. But this is a groundbreaking case from that perspective. And it might - maybe this is the intended effect - force these companies to shore up algorithms to protect themselves from being sued in a European court, in a Japanese court, lest they have to reveal some of their extremely valuable trade secrets. 

Dave Bittner: Yeah. 

Ben Yelin: And maybe that can be a valuable incentive structure. 

Dave Bittner: Could be. We've heard some stories - I think it was Facebook - of some of their engineers saying they weren't 100% sure how their algorithms worked. (Laughter) You know, like... 

Ben Yelin: Right. 

Dave Bittner: ...As things get grafted on over the years - and they're complicated things - we've certainly seen plenty of examples. Think about when Microsoft released their Tay artificial intelligence on the world, and it spun off in a direction they had not anticipated. 

Ben Yelin: Right. 

Dave Bittner: So, you know, algorithms can go in funny places and funny directions. Is it possible that these companies could either spin up a custom version of the algorithm just for Japan and alter it for the rest of the world, or claim that they're doing so? 

Ben Yelin: From a legal perspective, that might help. You would probably have a better idea than me how difficult that would be from a technological perspective. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, you probably wouldn't be able to have the same type of functionality for whatever app or software you're selling if you were to have a watered down algorithm that didn't have the greatest trade secrets. I mean, I assume that applications work as well as they do because the algorithms are complicated... 

Dave Bittner: Yeah. 

Ben Yelin: ...Extensive. And coming up with a shell of an algorithm to please a certain market - that would cost your bottom line, as well. So you'd have to weigh the costs and benefits of doing that. I don't think it would be an easy decision, but it's certainly a huge deal to have to reveal trade secrets in court. 

Dave Bittner: Could we imagine them pulling out of these markets? 

Ben Yelin: I mean, that would be an even bigger hit to your bottom line. The Japanese market, particularly for anything in the world of electronics or smartphone applications, I mean, that's one of the world's largest markets. 

Dave Bittner: Yeah. 

Ben Yelin: So it just would not be worth abandoning an entire market just because one company in one case thus far has been forced to reveal a portion of its algorithm. If they use this statute to start going after the big guys, and Google starts to get exposed both in terms of its liability and its being forced to reveal some of its trade secrets, then pulling the plug on that market might be a last resort. I presume if that were to happen, then Japanese lawmakers and legislators might try to ameliorate that problem. 

Dave Bittner: Oh, right, right. 

Ben Yelin: Certainly, we've seen, in other instances, big tech companies threaten to leave countries because of some type of regulatory regime. Very rarely do they actually do that, because you're just giving up a large portion of your potential business. So oftentimes, the threats are not exactly followed through. But yeah, I mean, it's certainly possible that they could threaten to leave the market and force some type of change among Japanese regulators to decrease their exposure to revealing potential trade secrets. 

Dave Bittner: All right. Well, it's certainly one to keep an eye on and kind of fascinating, you know, this global village in which we find ourselves, right? 

Ben Yelin: It's a really fascinating story. That was a good find. 

Dave Bittner: Yeah. Yeah. All right. Well, we will have a link to that in the show notes. Of course, we would love to hear from you. If you have a topic you'd like us to discuss here on our show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, you recently had a conversation with Matt Kent. He is the competition policy advocate at Public Citizen. And you all were discussing the American Innovation and Choice Online Act. Here's Ben and Matt Kent. 

Matt Kent: A lot of the work I do is focused on competition in the digital marketplace. And then there's a lot of sort of overlap with how that affects folks' privacy and security online. You know, I think, looking at the big picture, we have a situation where online digital platforms have become extremely concentrated, right? Like, there have always been big companies. But in the past decade, things have really narrowed down. We're basically looking at four major companies dominating parts of, like, the everyday digital experience - right? - search, shopping. Not only do they dominate those experiences, but they actively undercut rival firms through various anti-competitive tactics. So the result is really an internet where you don't have realistic choices or the opportunity to walk away from a platform whose business model is basically premised on data surveillance, surveillance advertising, you being the product, right? 

Matt Kent: Like, we're now in a situation where companies that have different business models - maybe a subscription-based model, something based on privacy - those companies exist, but they can't reach consumers because they're caught in this series of anti-competitive traps. So I think, directly to your question, the lack of competition in the marketplace leads to, you know, abuses from the dominant platforms, but also, like, we can't vote with our feet - right? - as... 

Ben Yelin: Right. 

Matt Kent: ...Digital consumers. Like, I could move over to some of the privacy-first platforms - there's a few search engines out there, or social media platforms - but, like, nobody else is doing it. There's no interoperability. The network effects don't work in my favor, so I can't do it. I'm trapped as a user. 

Ben Yelin: Right. You're sitting on an island with a bunch of other nerds who care about privacy and security, while millions of users, the people you actually want to interact with, are on one of the four major platforms. What are - just based on your research and knowledge, like, what are the specific provisions that lead to these anti-competitive practices? So are there loopholes in statutes? Like, why does this problem exist? 

Matt Kent: So it's really, like, an instance, at least from a competition standpoint, of the laws not keeping up with sort of where we are in terms of the digital ecosystem right now, right? You know, if you want to pursue a competition lawsuit or enforcement action against any of the big four - Amazon, Apple, Meta, Google - you're going to have to be using the Sherman Act or the Clayton Act. The text of those statutes is very vague, and the jurisprudence over the years isn't fit for addressing the harms that I talked about - the domination of the online user experience and the lack of competition. There's also this problematic aspect of the consumer welfare standard. And we can talk more about it, but essentially, the too-long-didn't-read of it is, you know, since the '70s - since the Chicago School, the Bork law-and-economics movement - antitrust litigation is mostly premised, not all, but very much premised, on the consumer welfare standard, which essentially means, what's the cost to consumers, right? So if you're trying to litigate an antitrust action against Big Tech, I mean, they have a very good defense, right? Like, if you look at the consumer welfare standard, these products don't cost anybody anything. 

Ben Yelin: Right. 

Matt Kent: Right. 

Ben Yelin: Facebook is awesome. I can communicate with all my high school friends for free. Yeah. 

Matt Kent: Right. So there's a holdup in the sense that Congress hasn't acted to update the laws, which is something we're working on very intensely right now at Public Citizen and other organizations. But also the courts have really gone in the wrong direction. 

Ben Yelin: So on that point, can you describe efforts ongoing in Congress? I know there are a couple of specific pieces of legislation that have been proposed. Can you just give us an overview of those? 

Matt Kent: So, essentially, about two years ago, there was a big push for Big Tech accountability using the antitrust laws. And this is a bipartisan thing. Democrats and Republicans have slightly different reasons, I think, for opposing the big tech companies, but they coalesced around a series of bills in the House Judiciary Committee. This was two years ago - there were wall-to-wall hearings, all kinds of legislative activity. It was really cool. Like, it was actually what congressional committees are supposed to do, right? They hauled in the tech executives. Like, Jeff Bezos had to answer questions in front of everybody. Like, it was great. And they produced a very thorough, over-1,000-page report on the anti-competitive practices of the big tech companies. 

Matt Kent: So out of that effort came a package of six bills aimed at Big Tech accountability through antitrust and competition law. A lot has happened since those bills passed out of House Judiciary, ups and downs. But where we are now is two of those bills have really sort of taken the momentum, and everything has largely settled on the Senate versions. So the two bills are the American Innovation and Choice Online Act - that's the Klobuchar-Grassley bill; you'll hear it referred to as the self-preferencing bill - but there's also the Open App Markets Act, and that's from Blumenthal and Blackburn. I just want to pause and say that the pairings on these co-sponsorships are just wild stuff. 

Ben Yelin: Yeah. I mean, if you don't know these senators the way we do, they are ideological opposites in each case. 

Matt Kent: Yes. Thank you. Thank you - important context. But, I mean, there's a lot of that throughout, sort of this mashup of unlikely bedfellows throughout the push on these bills. So anyway, I'm going to call them AICO and OAMA or the self-preferencing bill and the app store bill interchangeably. But those are the ones that really we're working on right now that are - like, it's do-or-die time before August. Both bills have passed Senate judiciary by really large margins, and they're basically ready for the floor. So the battle right now is to get them there before August. You know, it's been wild. There's, like, a crazy asymmetry between sort of the spending and lobbying capacity of Big Tech, the four companies combined, and then sort of on the other side, public interest groups, civil society groups, and the medium to small businesses that are really, like, taking the hit in the current model. So that - yeah, I mean, that's where we stand. 

Ben Yelin: What is at least the ostensible root of the opposition among big tech companies? Like, obviously they don't want to be broken up, but they can't directly say that to members of Congress. What's their argument that these pieces of legislation would be detrimental to the public? 

Matt Kent: It's a little depressing in the sense that, at least from my perspective, these bills aren't even breakup bills, right? Out of that bigger package that I mentioned before that came out of House Judiciary, there were some really ideal bills - ones that would presumptively halt any mergers by big tech companies, others that would give the FTC power - although they arguably have the power now - to break up these tech giants, to order spinoffs. You know, think of Facebook, Instagram, sort of the big picture - like, whoa, shock-the-world, Sherman Act stuff. All of that is not in the conversation. Right now, we're talking about bills that are super important and, I think, would go a long way to chip away at Big Tech's sort of dominant stranglehold, because self-preferencing really is at the foundation of that. Just to put it in context, the ones we're dealing with now aren't even, like, top of the class, total destruction of Big Tech's model, although they would very much affect it. 

Matt Kent: So their arguments against it - and at the risk of repeating opponents' arguments, they've been tough to pin down over, you know, the months at this point - center largely around a bunch of, like, hypotheticals that could possibly spin out of the litigation that would come from these bills. Another small point that I hope listeners of your podcast, as big-time legal-heads, would appreciate: these bills don't establish, like, a regulatory compliance regime. They establish a series of legal standards that would be enforced by the courts. You know, DOJ, FTC, state AGs would be able to institute cases, and then courts would decide, and that's how enforcement would work. There's no private right of action in the Klobuchar Senate bill. So we're really talking about government actors enforcing this bill. So anyway, they... 

Ben Yelin: So for the non-legal people, that means you or I, as consumers, wouldn't have the ability to sue these tech companies directly for their anti-competitive practices. It would have to be instituted by the AGs or the DOJ. 

Matt Kent: That's right. That's right. But anyway, from that point, a lot of the arguments against the bill are, well, we're concerned that a sort of wild-eyed state AG would pick up a case that touches on content moderation or privacy and security, and through a series of bad decisions - that part is sort of murky in Big Tech's argument, exactly how the legal arguments would bear out - they're saying the whole thing would whiplash, and, you know, we'd no longer be able to moderate content. You know, we would be scared because of litigation - which is sort of a laughable argument when you think about the resources that these companies have at their disposal. 

Matt Kent: There are also arguments that the bills would affect national security negatively. That has died down a little bit. You know, if you look at the text of the self-preferencing bill, there are many, many carve-outs regarding China and companies owned by China. I would say that it is well-covered, in both the text of the bill and the affirmative defenses available to the companies, that they won't have to, like, give over sensitive data to China or Chinese-owned companies. That was, like, a big part, I think, of the concern at committee, which is why a lot of these changes were made. That has died away a little bit since it became pretty clear that TikTok would be a covered entity under these bills. So they'd essentially be prohibited from doing the same practices as the Big Four ostensibly American companies. Although the question... 

Ben Yelin: Right. 

Matt Kent: ...As to whether they act in American interests all the time is an open one. 

Ben Yelin: Right. Sure is. I know you to be a good prognosticator of what happens in Congress. What do you see as the major obstacles on the Senate floor and then going back to the House side? And where do you see this going over the next several months? 

Matt Kent: Oh. Oh, Ben, if I knew, I'd be a much happier person right now. But - so what... 

Ben Yelin: (Laughter) This keeps you up at night. 

Matt Kent: I mean, this - yeah, this is, like, the No. 1 thing, you know, I'm working on right now, and I would say the issue - it's sort of interesting. The issue is not whether they'd pass if put to a vote, because they would. They have, you know, at least 20 Republicans who would go and a bulk of Democrats. Like, I don't think there's any question. If forced to vote on this bill, like, looking at the polls and where Big Tech accountability stands, I think any sane chief of staff or member of Congress would understand that they need to support these bills if the vote is there. 

Matt Kent: Now, the big question is convincing leadership to put these votes on the floor, because there are some in the Democratic caucus who are concerned that, in their words, being forced to vote on the bills would endanger their chances at the midterms. Now, you know, we argue that this would help your midterm chances by showing voters that you're actually doing something about Big Tech accountability. I think, without naming names, some members of Congress are concerned that if they're forced to take this vote and vote in favor, they would lose significant fundraising support from big tech companies or consulting firms or just the whole sort of ecosystem. 

Ben Yelin: Right. And many of these, especially in the House, you have a lot of powerful representatives who represent maybe Silicon Valley districts. And these are their constituents. 

Matt Kent: Yeah. That's right. 

Ben Yelin: Certainly, we know that on House judiciary. So let's say these two pieces of legislation pass. In terms of your work, what do you see as the next frontier in fighting Big Tech monopolies? Is it revisiting those House judiciary bills or is there something else entirely that's on the frontier but that hasn't really been taken up by policymakers yet? 

Matt Kent: I think things shift a little bit once we figure out - I mean, a lot depends on what happens. But either way, things shift a little bit to enforcement and an area of the federal government that I'm very much interested in - like, agency enforcement, right? Part of the bills is that the FTC and the DOJ would then have to create guidelines as to how this is enforced. There's a lot of work around that, and that has to be right. And Big Tech, if put in that fight, will be lobbying those agencies just as hard as they're lobbying Congress right now. So I think the fight is one of implementation if these pass. If they don't pass, man, I don't know. We'll, I guess, go back to the drawing board. And, you know, there are lawsuits going now. Like, DOJ and FTC are moving on Big Tech accountability. 

Matt Kent: You know, antitrust litigation is notoriously long, so it could be a long slog. But I think some of the focus, the advocacy focus, moves back to the agency side. Although - and here I'm assuming that the Republicans take the House, which just sort of scrambles everything - indications are that they will not prioritize these issues, and that's coming from Republicans themselves. So the legislative outlook gets a lot dimmer after August. 

Dave Bittner: All right. Interesting conversation - really, really good stuff. 

Ben Yelin: I love when I get to ask you, what did you think of the conversation? 

Dave Bittner: (Laughter) Right. 

Ben Yelin: It's rare I get that opportunity. 

Dave Bittner: That's right. Ben, you should be ashamed of yourself. 

Ben Yelin: I know. I'm pulling a Dave Bittner. 

Dave Bittner: This will be your last week on "Caveat." 

(LAUGHTER) 

Dave Bittner: No. I really thought it was an interesting conversation. And, boy, Matt was a great guest. You know, it's fascinating to me just the power that these big platforms wield. And how do you rein that in while still allowing for the things that make this nation great, right? - the innovation and all that sort of thing - which we want. 

Ben Yelin: Yeah. 

Dave Bittner: But something, you know, you and I have talked about many times - it troubles me the amount of consolidation that we have now, where anything good or desirable gets bought up and sucked into a big national company, you know? And the thing I always think about are cable companies, right? 

Ben Yelin: Right. 

Dave Bittner: We used to have local cable companies. Then they got bought up by regional companies. Then they got bought up by big national companies. Now there's a handful of big, national... 

Ben Yelin: You have one option. 

Dave Bittner: ...Companies. Right. So you're lucky if you live in a cable duopoly, you know, to get your internet and your cable or whatever. And I just don't think ultimately that's good for us. 

Ben Yelin: What's interesting to me is I think we know instinctively that that's not good for the consumer economically. It's going to decrease our chances to get the best product for the most competitive price because big tech companies, if they're in a monopoly - you know, they don't have to worry about competition. 

Dave Bittner: Right. 

Ben Yelin: So they can screw us over. I hadn't really thought about it - and Matt introduced this both in his work and in the interview - about the impact that would have on privacy and security just because without the threat of reasonable competition, there's less of an incentive for these companies to focus on security for their users and privacy for their users. There are certainly reasons for them to focus on it. Bad PR is one of them. And there is certainly a minimal level of competition. But when we have two companies that operate app stores, for example, and you don't have to worry about an enterprising company that's going to come in and revolutionize security, you're not going to have as much incentive to secure your application, secure your software as you would if you were in a truly competitive market. 

Dave Bittner: Yeah. 

Ben Yelin: And that ends up being bad for consumers. 

Dave Bittner: Yeah. 

Ben Yelin: That's true in every industry. I mean, the more monopolization there is, the worse off the average consumer ends up. And we're seeing how that manifests itself in the tech world. And it seems like Congress has taken notice, and we have this rare opportunity over the next couple of months to actually get something done - not to solve the problem, we're still going to have this problem of Big Tech monopolization, but at least to start to eat away at it and protect consumers in certain circumstances. 

Dave Bittner: Yeah. Boy, Ben, it's so easy to be cynical, isn't it... 

Ben Yelin: It sure is. 

Dave Bittner: ...(Laughter) About anything making its way through Congress. I'm also reminded of the classic Lily Tomlin sketch where she was the telephone operator, and she said, I'm sorry, ma'am, we're the telephone company; we don't have to care (laughter). 

Ben Yelin: Yeah. We don't care. We don't have to. 

Dave Bittner: Yeah (laughter). 

Ben Yelin: Yank. Just lost Peoria. Yeah. 

Dave Bittner: Right. Right. 

Ben Yelin: A good YouTube clip of an old "SNL" skit, if you're interested. 

Dave Bittner: All right. Well, again, our thanks to Matt Kent for joining us. He is from Public Citizen, and we do appreciate him taking the time for us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.