Caveat 11.18.21
Ep 102 | 11.18.21

Facebook whistleblower testimony reveals gaps in legal system.

Transcript

Jenny Lee: And, I think, until we understand the root cause of the issue and also define what is the issue we're trying to address, we're not going to really be able to do a good job of crafting a solution.

Dave Bittner: Hello, everyone. And welcome to Caveat, the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben looks at the Justice Department's attempts to combat ransomware. I've got the story of pending legislation targeting social media algorithms. And later in the show, my conversation with Jenny Lee from Arent Fox LLP to discuss the recent Facebook whistleblower testimony. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please, contact your attorney. All right. Ben, let's jump into some stories here. Why don't you start things off for us. 

Ben Yelin: So we have big news coming from the Department of Justice by way of The New York Times. In an article posted by Katie Benner and Nicole Perlroth at The New York Times, we find out that the Justice Department filed charges against two alleged perpetrators of ransomware attacks. So one of them is against a Russian individual named Yevgeniy Polyanin, who is accused of deploying the ransomware known as REvil - or REvil, depending on how you pronounce it. REvil, we say. OK. 

Dave Bittner: Yeah. (Laughter) But that doesn't mean it's correct. 

Ben Yelin: Yeah. 

Dave Bittner: (Laughter) Everyone has their own version, you know? So... 

Ben Yelin: It's kind of like gif or gif. 

Dave Bittner: Yes. Yeah. 

Ben Yelin: Yeah. It's about your personality which one you use. 

Dave Bittner: Right. Right. It's a Rorschach test (laughter). 

Ben Yelin: So this was for a ransomware attack that took place largely in businesses and government institutions in Texas in 2019. And then the second indictment was against a Ukrainian national, Yaroslav Vasinskyi, who is alleged to have conducted multiple ransomware attacks, including the one on the technology company Kaseya. That one I know particularly well because Kaseya runs the software for a contractor that works closely with a small city in Maryland called Leonardtown. Because of that ransomware attack, the small city of Leonardtown suffered significant kinetic effects, and it cost a lot of money to rebuild their systems. 

Dave Bittner: Oh, wow. 

Ben Yelin: And that was just one tiny example of the impacts of that ransomware attack. So these are two, you know, pretty high-profile indictments and a sign that the Justice Department is taking ransomware seriously. They're doing everything they can to try and discourage perpetrators around the globe from participating in these types of ransomware attacks. 

Dave Bittner: OK. 

Ben Yelin: Here's the drawback, though. The question is, how do we actually get to these people and punish them? 

Dave Bittner: Right. 

Ben Yelin: A major issue is that one of these perpetrators is a Russian national. Let's just say a pretty large percentage of people known to have perpetrated ransomware attacks are Russian nationals. 

Dave Bittner: Yep. 

Ben Yelin: And unfortunately for us, we don't have a working extradition treaty with Russia. I was doing a little half-assed internet research, as we say... 

Dave Bittner: (Laughter) 

Ben Yelin: ...On this subject. There was actually a U.S.-Russian extradition treaty signed in 1893 with Czarist Russia. But then at some point in the 1970s, through a Justice Department memo - you know, this was mid-Cold War - it seems like we decided that that extradition treaty wasn't actually in place. And we know that now because we've had these high-profile instances of people who have been charged with crimes in the United States hiding out in Russia - the most notorious example being Edward Snowden, but then also, you know, some of the people who were charged as part of the Mueller investigation for misinformation against U.S. nationals. So that's a significant problem. It's hard for us to gain jurisdiction and bring these cybercriminals to justice. For this Ukrainian guy, luckily for us, he set foot in Poland. 

Dave Bittner: Yeah. 

Ben Yelin: We do have an extradition treaty with Poland. So he was arrested there. He's going to be extradited to the United States and will face trial in our U.S. court system. But the problem with a lot of these other perpetrators is, you know, we may have all the evidence in the world to charge them - I'm sure we could put together a really compelling trial, talk about the impact of ransomware on, you know, our local governments, on our businesses, on the meatpacking industry. You know, I'm sure we could put together a very good case. But it's very difficult to bring these alleged criminals under our jurisdiction. 

Dave Bittner: Yeah. And, you know, it seems as though, kind of like you mentioned with that one gentleman, they get a little lazy. They decide they want to vacation somewhere outside of their mother country. And that's where we... 

Ben Yelin: Bad idea jeans, as they say. 

Dave Bittner: (Laughter) And that tends to be where we nab them. So what other tools does the Department of Justice have available to them to put some pressure on them? Are there any other places that they can come at them? 

Ben Yelin: So there are, really, two things that can be done here. The first is try and attack the financial angle, usually through the Department of Treasury, and seize the money gained via ransomware attacks. So in this Justice Department announcement, they announced that from this alleged Russian perpetrator, they seized $6.1 million of assets allegedly gained through this ransomware attack and through other cybercrimes. That sounds well and good. And I think that is a very effective tool. You know, you can track the financial transactions and, you know, put the big hand of the U.S. government in some of these bank accounts and extract this money. 

Dave Bittner: Right. 

Ben Yelin: That's not always going to work. You know, that might not be enough of a disincentive if you're in Russia because let's say they're only able to seize the assets of, you know, 50% of people. That leaves, you know, the other 50% who are getting off scot-free after... 

Dave Bittner: Yeah. 

Ben Yelin: ...Committing a ransomware attack. 

Dave Bittner: And what if I grab 6 million of the 10 million that you've stolen? You're still doing pretty well. 

(LAUGHTER) 

Ben Yelin: Yeah, exactly. 

Dave Bittner: Right. 

Ben Yelin: You're still making out like a bandit. That's not... 

Dave Bittner: Yeah. 

Ben Yelin: ...Enough of a disincentive. 

Dave Bittner: Right, right. 

Ben Yelin: The other avenue is through diplomacy. And we've tried to do this. And quite frankly, I'm not exactly thrilled or excited about the prospects of some sort of new extradition treaty with Russia that specifically covers ransomware attacks or, you know, some type of bilateral agreement where the Russian government would actually crack down on these malicious actors. 

Ben Yelin: We've now seen through two U.S. presidential administrations that the Russian government under Vladimir Putin has been quite resistant to going after cybercriminals. President Biden said in a recent virtual meeting with Vladimir Putin that the U.S. is taking ransomware attacks very seriously. We're going to go after your criminals. You know, so it's kind of a shot across the bow. And, you know, that so far has not brought Russia to the table to come up with some sort of agreement. 

Dave Bittner: Yeah. 

Ben Yelin: And, you know, until they start taking these attacks seriously or until it becomes enough of a diplomatic concern, maybe because of sanctions that are imposed or other potential punishments, I don't see why they would, you know, change their trajectory of not really doing anything about ransomware attacks. 

Dave Bittner: Yeah. I mean, you know, there was hope after the Colonial Pipeline incident, when we sent some strong diplomatic signals that it was time to knock it off, and we saw some of these ransomware gangs say, we're closing shop; we're shutting down. And I think there was hope that maybe that was in response to that, that maybe they got the message from the powers that be in Russia to knock it off or at least lay low for a while. 

Dave Bittner: Seems now like they were just laying low for a while. We've seen some of these groups go quiet, and then they'll spin up again, or they'll come back under a different name. So, yeah, I think to your point, don't get too optimistic that these folks are going away. I mean, what other things can we do diplomatically - do we threaten sanctions? What other things do we have to hold against Russia proper? 

Ben Yelin: I mean, the more powerful the country, the less powerful our diplomatic tools are going to be. Sanctions are the most powerful tool. You could see, you know, some offensive cyber operations against Russia, potentially, although that, you know, can have some second-order effects that aren't going to be great for the United States, and that would also be an escalation. 

Dave Bittner: Right. 

Ben Yelin: Otherwise, it's isolating them diplomatically and imposing sanctions. We have imposed sanctions on Russia for a variety of things. So far, you know, at least as it relates to this particular issue, it doesn't seem to be enough of a disincentive. 

Ben Yelin: You know, so I think because there's this lack of hope about diplomacy with the Russian government, we have to look at all these other potential avenues, including seizing assets and broadening enforcement actions so you have people on the ground in a bunch of different countries where we do have extradition treaties working with law enforcement in those countries and trying to catch cybercriminals who happen to cross into their jurisdiction. And they note in this article that we've actually had a good deal of success in doing that. We've made arrests in a number of countries recently. I mentioned Poland, but also in South Korea and a couple of other countries where we do have strong diplomatic relationships. 

Dave Bittner: Yeah. 

Ben Yelin: One of the intelligence analysts at a cybersecurity firm who was quoted in this said we need, quote, "a sustained cooperative law enforcement operation" to make it more expensive to conduct ransomware attacks. That's what it's about. It's about scaring potential cybercriminals. 

Ben Yelin: As of right now, there are preliminary indications that, at least as it relates to these particular cybercriminals, they've gone quiet. People who monitor these forums, who are scouring the dark web looking for activity from REvil and DarkSide and other bad actors, have noticed that they've gone dark. So, you know, they'll probably, as you say, pop back up in different forms. But at least in the short term, you know, that has some positive effect. 

Dave Bittner: Yeah, yeah. You know, I would love to be more optimistic about this, but it's hard to be given just what we see. There doesn't seem to be much stemming the tide. I guess it's - I don't know - you know, bailing water on the Titanic or something like that. I don't know. 

Ben Yelin: Yeah, we need a good Titanic metaphor for this. 

Dave Bittner: (Laughter) Yeah, yeah. 

Ben Yelin: I mean, the real tragedy of it all is that, you know, these are relatively sophisticated actors, and the people they end up hurting are small businesses, mom and pop shops, people who rely on their local governments for services. 

Dave Bittner: Yeah. 

Ben Yelin: You know, they - there was a ransomware attack that hit a police department in Fulton, N.Y. That's a small town. I've been there. My in-laws are from that area. You know, they're not going to be particularly well-equipped to deal with these types of attacks. 

Dave Bittner: Right. 

Ben Yelin: So it does have a real-world impact. And it's good the Justice Department is taking this seriously. I hope that there's some mechanism where in the long term we can hold these cybercriminals accountable. 

Dave Bittner: Yeah. I mean, it's a good point, too, that, you know, it's the high-dollar-value ones that make headlines, but the, you know, the ones that are - I don't want to say nuisance attacks, but those ones targeting small businesses for smaller dollar amounts, those have continued. Those have not fallen off 'cause it's easy pickings. It's low-hanging fruit for many of these operators. And so we don't hear about those, but they're still happening and extremely costly for those small and medium-sized businesses as well. 

Ben Yelin: Yeah. I mean, it's like the difference between trying to rob, you know, a bank in Midtown Manhattan and trying to rob a, you know, exurban convenience store. 

Dave Bittner: Right. 

Ben Yelin: The potential winnings are going to be different. But it is easier to attack those smaller entities that don't have the same kind of robust defenses. 

Dave Bittner: Yeah, absolutely. All right, well, we'll have a link to that story in the show notes, as always. 

Dave Bittner: My story this week links to a story from Mashable. This is written by Jack Morse, but there's been a lot of coverage of this all over the place. This one's titled "Lawmakers Come for Facebook Algorithm with 'Filter Bubble' Bill." So we have a bipartisan group of lawmakers. They have put forward a bill that they're calling the Filter Bubble Transparency Act. 

Ben Yelin: Tragic lack of an acronym there. That's a missed opportunity. 

Dave Bittner: (Laughter) That's right. It's the (mumbling). 

Ben Yelin: Yeah, that just... 

Dave Bittner: Right. 

Ben Yelin: ...Does not come off the tongue well. 

Dave Bittner: Not on their game here. 

Ben Yelin: No. 

Dave Bittner: These lawmakers, yeah. And basically, what this comes down to is you've got a group of lawmakers - like we said, a bipartisan group - who are putting forward legislation that would require the big social media companies to give users the option of having their interactions with these platforms happen in a non-algorithmic way. In other words, have the option to switch off the algorithmic ranking system that they use to put things in front of you. 

Dave Bittner: So for example, Twitter already has what they refer to - I believe they call it a reverse chronological timeline. So basically, you can see stuff as it happens rather than what Twitter thinks will be interesting to you. 

Ben Yelin: It's crucial for live sporting events, by the way. 

Dave Bittner: Oh, is that right (laughter)? 

Ben Yelin: Yeah. I don't want to hear about what happened an hour ago. I want analysis of the last play. 

Dave Bittner: Right, OK. I'll take your word for it (laughter). 

Dave Bittner: But, you know, obviously, this is targeting Facebook. Or should I say Meta? 

Ben Yelin: (Laughter) Yeah. 

Dave Bittner: It's targeting them because I think they're the poster child for this sort of algorithmic channeling of information to people. So what they're trying to do here is just give users the option to throw a switch and not have that information fed to them in that way. Of course, you know, for the providers, that's not a good thing because that's how they make their money... 

Ben Yelin: Right. 

Dave Bittner: ...By putting things in front of you and being able to place ads in front of you based on things that you find interesting. What do you make of this, Ben? First of all, the fact that this is coming from a truly bipartisan group of legislators - that's interesting. 

Ben Yelin: Yeah, it is interesting. 

Dave Bittner: It's noteworthy, right? 

Ben Yelin: Absolutely noteworthy. And it's - you know, knowing these members of Congress, this is not a group of individuals that agree on much. You're talking about a couple of very conservative members and a couple of very liberal, progressive members who put this bill together. 

Ben Yelin: You know, this proposal is limited in a number of ways. The main way is that it only applies to the big guys. So any company that has 500 or fewer employees or possesses data on fewer than 1 million people will not be covered under this bill. But, you know, I think that makes sense 'cause the targets are the Facebooks/Metas of the world. 

Dave Bittner: Yeah. 

Ben Yelin: And Twitter. You know, I'm wondering how much opposition Facebook/Meta will put up to this. I mean, they've signaled a willingness to consider regulation when it comes to things like Section 230. Obviously this would be a big hit to their business model. That's true. 

Dave Bittner: Yeah. 

Ben Yelin: What I'm wondering is - you're giving people an opt-out. How many people are going to miss the algorithmic content when it's gone? I sort of think... 

Dave Bittner: Yeah. 

Ben Yelin: ...You know, there are going to be some very privacy-conscious people, the type of people who disable location services on their phone, you know, the type of people, like our listeners and us, who are cautious about this stuff. And then there are going to be other people who are like - you know what? - I kind of miss seeing that targeted advertisement. 

Dave Bittner: Well, yeah. This is a lot less fun. This is like, you know, going out to dinner and not being able to have dessert. 

Ben Yelin: Right. 

Dave Bittner: Right? 

Ben Yelin: I mean, we have become addicted to content that's very narrowly tailored to our interests and our buying habits. You know, have you ever tried to browse a website where - maybe you're using somebody else's computer, it doesn't have your cookies - and it just seems different? You know, I think many users, despite what they say or how they would respond to an opinion poll, might actually not opt out of this service. 

Dave Bittner: Yeah. I think that's right. I think, you know, when you look at Twitter, who has something like this already built in, I've seen many people say that when they switch over to the reverse-chronological version, magically and mystically, it switches itself back over time. 

Ben Yelin: Right. 

Dave Bittner: You know? 

Ben Yelin: You always have to re-check that box. 

Dave Bittner: Right. Right. And, you know, I would say, based on Facebook's past history of whether or not they tend to operate in good faith when it comes to these sorts of things, I would expect the same from them, unless there was an actual rule here that would prohibit them from doing so. 

Ben Yelin: Right. 

Dave Bittner: I think you're right. I think the thing that people like about Facebook, the itch that it scratches for people - part of that is the algorithmic stuff, so to turn it completely off, to have just an on/off switch, to not be able to dial it down or have any sort of granularity in dialing it in - I think you're right. A lot of people will try it out, and they'll say, this is no fun. And they'll turn it back on, and that's that. But I guess for the folks who want it, it's good to have it, right? 

Ben Yelin: Right. You know, that leads us to the question of how likely is it that this bill will actually become law? 

Dave Bittner: Yeah. 

Ben Yelin: As you know, and I've said this many times, the likelihood of any proposed piece of legislation becoming law in our polarized and largely dysfunctional Congress is pretty low, even though this does have bipartisan support. You know, this is something that is pretty ambitious, and with the lobbying power of Facebook/Meta, this seems to me like the type of thing that's going to get bottled up in committee. Get those Silicon Valley legislators in there, saying, let's hold our horses a little bit. So, you know, if I were a betting man, I would most likely say that this does not stand a realistic chance of passing. 

Dave Bittner: Yeah. 

Ben Yelin: Now, that's just for the short term. In the long run, you know, as Congress tries to take action to rein in these algorithms, this is the low-hanging fruit 'cause it would be voluntary, you know? People would still have to opt out of it. So Facebook and Instagram and Twitter could still have algorithmic content, you know, with most people, as we've said, maybe not opting out. 

Dave Bittner: Right. 

Ben Yelin: And they can still make their money. You know, one other reason that the tech companies might not be as resistant to something like this is if the federal government regulates algorithmic content like this, that would preempt state action. That would prevent companies like Facebook from having to deal with 50 separate state regimes trying to regulate algorithmic content. If you're going to do it - you know, from their perspective, it's just better to have one standard that comes from the federal government. 

Dave Bittner: Yeah. Yeah. You know, another thing I've seen in some of the commentary on this is some folks - oh, dare I say pedantically pooh-poohing the definition of algorithm... 

Ben Yelin: Ah, yes. 

Dave Bittner: ...That these out-of-touch legislators don't know what algorithm means, and so this is meaningless, and they're using the term improperly, and everything online is an algorithm. And I just sigh ruefully and say, let's not miss the forest for the trees. 

Ben Yelin: Right, exactly. 

Dave Bittner: (Laughter) Like, some people can't help themselves. 

Ben Yelin: Yeah. I mean, we know what the target of this legislation is. 

Dave Bittner: Right. 

Ben Yelin: And we don't have to nail down a precise dictionary definition of algorithm to understand what these lawmakers are trying to do. 

Dave Bittner: Yeah. Yeah. Still, it gets my goat (laughter). 

Ben Yelin: I can see that. Yeah. 

Dave Bittner: All right, well, that would be a good opportunity for us to move on before my dander gets too... 

Ben Yelin: Let's cut it out. Yeah. 

Dave Bittner: Yeah, yeah. All right, well, we will have a link to all the stories in our show notes, of course, and we would love to hear from you. If you have something you would like us to cover, you can send us an email. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Jenny Lee. She is an attorney at Arent Fox LLP, and we discussed the recent Facebook whistleblower testimony - really interesting stuff. Here's my conversation with Jenny Lee. 

Jenny Lee: One of the most striking aspects about the issues in Haugen's complaints to the SEC - speaking from my perspective as a lawyer, given my chosen field here - is what those issues reveal to be lacking about the U.S. legal system. At the end of the day, you know, this was a significant public assertion about the practices of Facebook, and the protocols at the SEC are an indirect mechanism at a high level, if you think about it, to confront the concerns at hand because the SEC regulates the societal problem of investor harm. And they do not necessarily directly regulate harm to consumers or children or users of social media. And so at the heart of the SEC complaints are essentially what misrepresentations may allegedly have been made to investors. But what's missing in our U.S. legal system right now is arguably a single federal regulator whose process or tip line can be triggered to help oversee corporate conduct that affects these issues of social media-based consumer protection, children's protection or online content users. 

Dave Bittner: Are there other federal agencies that could step in here? Would this be under their categories? 

Jenny Lee: Absolutely. There's the Federal Trade Commission, the Consumer Financial Protection Bureau and the Federal Communications Commission. And, you know, I think that the existence of this patchwork quilt of agencies is itself, you know, emblematic of a larger issue in the legal system. You know, there's this conflict between the pace at which tech is evolving versus how quickly we can update our laws. So right now at the federal level, these regulations apply to businesses, in terms of the activity in question. And so that is why, you know, any number of different agencies could potentially find a hook in some of the issues. It just depends on what you define to be the activity in issue. 

Dave Bittner: Yeah, I mean, it strikes me that one of the issues here is that when it comes to something like social media, that's different than selling a product in a brick-and-mortar store. If I sell a dangerous product and a consumer gets harmed - a child gets hurt or someone - you know, people compare this a lot of times to Big Tobacco. You know, if I have smokers who get - you know, have health issues, well, that's a pretty easy line to draw. But it seems like online organizations, like Facebook - as you, I think, very well point out - it's a little bit more fuzzy there. It's harder to say who's directly responsible for regulating them. 

Jenny Lee: Right. And I think one of the difficulties, as attorneys and policymakers and other stakeholders might have seen thus far, is that we're also trying to operate in this world where we have inherited legacy legal systems. So I fully agree with your point, that these issues kind of ring true in the consumer protection realm. And I do think that there will be activities that occur, you know, in the major players in D.C. on that front, in terms of consumer protection. 

Jenny Lee: But there's also this other rule, Section 230 of the Communications Decency Act. And that one technically belongs to the FCC, the Federal Communications Commission. So it's almost like the online, you know, economy of communication arguably also belongs in our federal communications regulations. Really, what Section 230 did, you know, decades ago, was initially allow for there to be a safe harbor for tech companies. 

Jenny Lee: You know, back in the '90s, the world was a very different place. And many of these tech companies were much smaller. And pioneering technologies were being put forth. And the Congress at that time wanted, in the CDA, to allow the internet economy to flourish. And so there was a need, according to the policy underlying that law, to allow there to be a safe harbor, so that if you're running, say, a platform - whether that be social media or home improvement apps to find contractors or, you know, online reviews about hotels, or whatever it is - if you're running that platform, you are not to be liable for whatever other people publish on your platform - similar to the idea that if you are a library, you know, you're not liable for any offenses that are, you know, kind of conveyed in the books that might be written by authors that place their books in the library. So at that time, the sort of incentive was - you know, there were two parts to the CDA. And forgive me for getting pretty (laughter)... 

Dave Bittner: (Laughter). 

Jenny Lee: I nerd out on these legal issues because I look at this stuff all day. 

Dave Bittner: Yeah. Yeah. 

Jenny Lee: You know, it was both - it was two things. It was that the provider of that platform is not to be liable for the actions or statements of the third parties who go on that platform. And the second part was - and by the way, you know, there are some good reasons why the platform may want to regulate itself so that people aren't doing harmful or illegal things on their platform. So as long as the platform is operating in good faith and following their own internal rules that they've put together on how to regulate themselves, then they also are, you know, sort of insulated from liability as well. And if you fast-forward now from there to, you know, 30 (laughter) years later... 

Dave Bittner: Right. 

Jenny Lee: ...We're in a very different world now. And a lot of these issues that were raised in the whistleblower testimony but also are raised by others in Silicon Valley, like Tristan Harris, for example - you have these issues, whether they be harm to young girls or misinformation or whether it be, you know, conservative voices that are being so-called sanctioned or censored or whether it be - you know, to my reference to Tristan's work and his team, you know, this idea about the attention economy and, like, the downgrading of the human experience when people are addicted to screens or are kind of living in a prolonged state of conflict and antagonism or destruction, isolation, you know, fake news - that as a society that is a negative thing to be happening. 

Jenny Lee: So all of these issues - you know, whatever the beef might be - I think really squarely collide with the Section 230 thing because (laughter) the Section 230 liability exemption is really what incentivizes all companies and, you know, sort of establishes who's going to be accountable, or what the consequences are of the actions of different people that use social media. And also, you know, to the whistleblower's point, what the consequences are of creating or promoting or enhancing algorithms that put forward a specific intended effect. 

Dave Bittner: Well, on the other side of this testimony, in your estimation, what are the methods that we have available to us that are likely to actually see meaningful change? Do we have tools? Can the regulators, you know, turn the dials to put us in a better place here when it comes to, for example, Facebook? 

Jenny Lee: What is interesting on the consumer protection front is that there's this Section 5 of the FTC Act, which also exists in a similar form in Sections 1031 and 1036 of the Dodd-Frank Act, and that is kind of this ban on any practice that is unfair, deceptive or abusive. And, you know, you could take a page out of the playbook of the FTC and the CFPB, where the regulators have for a very long time identified specific conduct that seems unfair or abusive, and as long as they can assemble evidence, if you will, that they've met the legal elements of those claims, then they can declare a practice to be, you know, a violation of that. It's called UDAAP - unfair, deceptive, or abusive acts or practices. And I think there are decades of precedent of the FTC using the UDAAP authority to go after privacy issues. 

Dave Bittner: You know, listeners of our show have heard me wonder repeatedly if we need to have some sort of equivalent of the FDA for algorithms online, you know, to put these social media companies in the position of first having to demonstrate that their algorithms do no harm, you know, in the same way that pharmaceuticals have to go through testing procedures. Admittedly, it's a bit of a pie-in-the-sky idea of mine. But since you are an expert in this area, I thought I would run it by you and see. Is it an idea that has any merit whatsoever? 

Jenny Lee: Well, I think that even the proposal, I believe, that was put forth after the hearings in Congress this spring by Facebook was one where - you know, between the three companies that testified, Facebook, Twitter and Google, Facebook was the only one at that time that actually made a suggestion to set up a protocol for, you know, this issue relating to the adequacy of their self-monitoring efforts. And I think there's a lot of merit to your idea in the sense that when we're still trying to understand the technology and we're still trying to define what we think the harm is (laughter), it is really great to not jump too fast to write rules that will not make sense or that will be antiquated in just a few months or years. 

Jenny Lee: So, you know, what's really nice about your analogy with the FDA is that I understand that, in the FDA, there are also panels of experts that will be, you know, congregated and opine on different potential therapies and drug solutions. And in general, I think I hesitate whenever we're creating a new bureaucracy because I do think that it's much more elegant to be as efficient as possible. Having myself been a former CFPB official and now working with the regulators and the government on a daily basis, there certainly are challenges - government processes sometimes make things slower in pace than what the private sector can accomplish. But with that caveat in mind - right? 

Dave Bittner: Yeah. 

Jenny Lee: ...I mean, I do think it's great to have, like, a data-driven approach where we can try to, you know, assign this work to a particular body. And maybe it's not an agency, but maybe it could be even a - kind of like a quasi public-private partnership, or maybe it's like a private agency first for self-regulation, where the government has oversight on top of that or whatever it might be. 

Dave Bittner: Yeah. 

Jenny Lee: I think it does make a lot of sense because right now there are so many different concerns from both, you know, sides of the aisle. And, you know, we're seeing in this political environment that these issues are creating some strange bedfellows. It would be great if we could just first define what is bothering us and what we want the legislators to actually act upon in this kind of holistic, data-driven way. 

Dave Bittner: Yeah. Yeah, I mean, that's a great point. It's hard to say what you want when you can't get people to agree on what the ideal outcome might be. 

Jenny Lee: I certainly think there's low-hanging fruit. Like, for example, I recently learned about Ashton Kutcher's nonprofit and the work that their team is doing to combat online sex abuse or human trafficking. I mean, there is just some low-hanging fruit where I don't think we'll disagree that these are things that need to be addressed. But the more nuanced issue that you raised about kind of the effects, more specifically, of algorithms and some of these things about, you know, like, who should decide (laughter)... 

Dave Bittner: Right. 

Jenny Lee: ...Which, you know - like, which attractive photograph of a female celebrity should be taken down because young girls - I mean, I have a whole lot of young girls in my family and a daughter myself. And these issues are very important, but it's sort of like, how do you implement a solution to that concern? And who decides what the algorithm should do or say and what the thresholds would be? I mean, those issues are a lot more, I think, nuanced than some of the ones where we can just all agree are horrendous problems in society that need to be addressed right away. 

Dave Bittner: Yeah. Yeah. Where do you suppose a good place is to begin, then? I mean, there are - obviously, there are places where people are in agreement and places where there is wide disagreement. Where do you suppose a good place to start would be? 

Jenny Lee: Well, I think it would be good to take that task in two parts. So probably the first part would be that at least we can make incremental progress on issues that are low-hanging fruit, things that most people would agree on, like online child sex abuse or human trafficking issues or things that we can perceive are occurring on the internet that are already defined to be clearly illegal, if not criminal. And so we can at least try to prioritize those things to begin getting something done right away. 

Jenny Lee: The second part of it, I think, is just, really, we need to also get some input and feedback from the tech companies and, based upon their expertise, try to better understand what can be done with regard to these algorithms and identify the root of the issue. How much of this is due to the fact that maybe there's foreign language translation issues, given that these are global companies versus that we can blame the algorithm itself? And I think until we understand the root cause of the issue and also define what is the issue we're trying to address, we're not going to really be able to do a good job of crafting the solution. But at least those are a couple of immediate steps where we can begin to make progress. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: Really interesting conversation. She's super knowledgeable. 

Dave Bittner: Yeah. 

Ben Yelin: And I was really interested, particularly, when she talked about how we don't have, you know, a regulatory regime that makes enforcement against companies like Facebook easy. They're subject to regulation from a bunch of different federal agencies, and sometimes that kind of ends up shielding them because there are these jurisdictional disputes between the FTC and the SEC. So... 

Dave Bittner: Right. They all say, not it (laughter). 

Ben Yelin: Yeah, exactly. So it ends up not being, you know, a very effective or valuable system for the consumers, and that's kind of a weakness of our system. So I thought that aspect was really interesting. And yeah, it was a great interview. 

Dave Bittner: Yeah. Our thanks to Jenny Lee for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.