Ameesh Divatia: Privacy has become almost a fundamental right.
Ameesh Divatia: If you don't pay for a product, you do become the product.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello there, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, Ben looks at a plan congressional Democrats are cooking up for Section 230 of the Communications Decency Act. I've got the story of a clever technique police officers are using to prevent being livestreamed. And later in the show, my interview with Ameesh Divatia from Baffle on whether it's time for a national privacy referendum.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's dig into some stories. What do you got going on this week?
Ben Yelin: So my story comes from the website protocol.com, and it is about congressional Democrats' plan to limit Section 230 of the Communications Decency Act.
Ben Yelin: So this was a bill that was introduced in the Senate by Senators Warner, Hirono and Klobuchar, sort of spanning the ideological spectrum of congressional Democrats. Senator Hirono is one of the more progressive members, Klobuchar sort of in the middle. And Senator Warner, among Democrats, is on the more conservative side. And they introduced a comprehensive bill called the SAFE TECH Act, which I assume is an acronym.
Dave Bittner: (Laughter) Count on it.
Ben Yelin: Yes. I was not able to figure out what that acronym stood for...
Dave Bittner: Right, right.
Ben Yelin: ...By press time today.
Dave Bittner: (Laughter).
Ben Yelin: So this bill makes a number of small but significant changes to Section 230. Just by way of background - I know we've talked about this on this podcast and other podcasts, but Section 230 is the provision that shields publishers of content from legal liability for what users post on their platforms.
Ben Yelin: So what this bill would do is provide that online platforms wouldn't be able to claim that immunity for violations of federal or state civil rights laws, antitrust laws, cyberstalking laws, human rights laws or civil actions regarding a wrongful death. So those are a bunch of very interesting carve-outs.
Ben Yelin: And perhaps the most interesting provision in this bill, it changes the type of content that would be subject to this immunity. So what the language of the act says now is that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information - emphasizing that word - provided by another information content provider.
Ben Yelin: This bill would replace that word information with speech. And what the purpose of that is, is to avoid shielding these companies when, you know, people, for example, use social media sites to facilitate the sale of guns. That's not speech, per se. I mean, that's facilitating...
Dave Bittner: It's commerce.
Ben Yelin: It is, yeah, and potentially facilitating an illegal sale.
Dave Bittner: OK.
Ben Yelin: So they would be held liable for that under this revised piece of legislation.
Ben Yelin: What are the prospects of this bill? My general caveat, no pun intended, when we're talking about anything related to Congress is always bet on inaction as opposed to action.
Dave Bittner: (Laughter) Right.
Ben Yelin: I think this is the starting point for what could be the negotiation around a more comprehensive piece of Section 230 reform. You know, I think one of the things we've talked about is how each political party has its own problems with Section 230. So the progressive side, represented by the authors of this bill, think that these platforms don't do enough to restrict content. They think that they're facilitating harmful conduct. We talked about the illegal sale of firearms, but all other types of illegal content.
Ben Yelin: They believe that advertisers aren't facing proper restrictions. There was an incident where one advertisement that was posted on Facebook had racist undertones that potentially could've violated - the way it was posted could've violated federal civil rights laws. That would no longer be subject to a liability shield if this bill were to pass. So that's sort of the progressive, Democratic critique.
Ben Yelin: The Republican critique is that these companies have too much power to restrict content. They are biased against political conservatives. They're shutting down free speech, and they should be held liable for these content moderation decisions. And so there's this conservative pushback against Section 230 as well.
Ben Yelin: So I think the best way to look at this piece of legislation is this is sort of the opening bid on the part of the Democrats. This is what we would like to do to curtail the scope of Section 230, and, you know, maybe we're willing to hear what the Republicans want to do, and maybe we can meet somewhere in the middle. So that's my general perspective on this.
Dave Bittner: Help me understand here. To grossly oversimplify this, is it that the Democrats have their dander up over what folks can leave up and the Republicans are upset about what the platforms might take down?
Ben Yelin: In an extremely general sense, I think that's absolutely true. And, you know, we can see that reflected in the political dialogue over the past several years. I mean, Democrats have talked about how social media sites haven't done a sufficient job policing false information, abusive information, abusive practices. And Republicans have complained about deplatforming, cancel culture, taking President Trump off of Twitter, silencing conservative speech, you know, using algorithms to suppress conservative stories like the one during the election season about Hunter Biden. So I think it is reflective of their general critiques of the law.
Dave Bittner: Does that have anything to do with Section 230, though? I mean, a platform deciding to pull something off - that's different than what - my understanding of 230, which is that it shields them from the things that might be posted.
Ben Yelin: Yeah. It's a critique that, to be honest, doesn't really make much sense. And, in fact, if Section 230 were curtailed, then these platforms would certainly take down far more content than they already do because they'd be incessantly concerned about being sued.
Ben Yelin: I think that critique is more kind of a gut Trumpian critique saying, it's not fair that these people are being shielded from liability when they're also silencing our voices.
Dave Bittner: I see.
Ben Yelin: And they're upsetting us, so we want them to be able to be sued. I mean, I think that's kind of the nature of it.
Dave Bittner: Sort of a both sides-ism sort of thing.
Ben Yelin: Yeah, yeah. I think there are - certainly a legitimate critique you could make about content moderation practices and whether it is good policy on the part of these platforms to limit certain types of speech. But I agree with you that it doesn't, at least in my understanding, actually relate too much to Section 230, whereas this bill really would make substantial changes to Section 230, for better or worse.
Dave Bittner: Where does it go from here?
Ben Yelin: So my guess is that this will lead to a series of committee hearings. The Senate is now controlled by Democrats. You know, Senator Warner is now the chair of the Senate Select Committee on Intelligence. So, you know, this is the type of thing where I would expect to see at least informational hearings on this legislation, testimony from experts, including representatives of the tech companies. You know, we know that particularly Twitter and Facebook have been more open about potentially reforming Section 230. So I'd be interested to see if they drag Zuckerberg back up on Capitol Hill to get his take on this.
Ben Yelin: And then we're still a long way from this type of thing being enacted into law. It has to go through the sausage-making of legislation. And, you know, if I put on my crystal ball, I still doubt that this bill, even, you know, in a slightly modified form, would become law, at least in this session of Congress. But I suppose you never know. But this is just the first step.
Dave Bittner: (Laughter) Right, right.
Ben Yelin: And it's sort of the opening salvo in laying out what the Democratic priorities are as it relates to Section 230.
Dave Bittner: I appreciate your learned cynicism when it comes to this (laughter).
Ben Yelin: I know. Sometimes I feel like I'm too cynical, and then I watch, you know, five minutes of C-SPAN and realize maybe I'm not cynical enough.
Dave Bittner: Right, right.
Ben Yelin: Yeah.
Dave Bittner: All right. Well, it's an interesting story. We'll have a link to that in the show notes, as always.
Dave Bittner: My story this week is weird.
Ben Yelin: Great is what it is.
Dave Bittner: Kind of delightfully weird. This particular telling of the story comes from VICE News, written by Dexter Thomas. Let me lay out what happened here. A gentleman by the name of Sennett Devermont, who is a Los Angeles-area activist - seems like perhaps a bit of a gadfly when it comes to covering the police, and I mean that in a, you know, descriptive way...
Ben Yelin: Yeah.
Dave Bittner: ...As opposed to a judgmental way. But he was - went into the police station to request some information about some interaction that he had with a police officer. And while he was interacting with the police officer who was helping him with his requests, the police officer noticed that Mr. Devermont was livestreaming their interaction.
Dave Bittner: Now, Ben, as we know, it is well within a citizen's right to film the police while they're working, while they're doing - in the course of their interactions. That is something that we're allowed to do. It's protected, right?
Ben Yelin: Yes, absolutely.
Dave Bittner: So this gentleman is livestreaming the police officer, and the police officer is not happy about that. Officer Billy Fair - Sergeant Billy Fair is the officer. And he asks how many people are watching this, and Devermont replies, enough. Devermont has about 300,000 followers on Instagram, so a sizable audience. So Sergeant Fair pulls out his personal cellphone, plays with it a little bit and starts playing some music.
Ben Yelin: Yeah, it seems like he's searching through his iTunes library and trying to find the perfect song for the situation.
Dave Bittner: (Laughter).
Ben Yelin: And he comes up with "Santeria" by Sublime. So first of all, shame on you for getting that song in my head. That's going to last for several days. And I apologize to our listeners 'cause now we've just done it to you.
Dave Bittner: Right.
Ben Yelin: But the reason he's playing this music, as the article says, is not just his love of '90s stoner music. He is trying to get Instagram to take this video down because of a copyright violation.
Dave Bittner: Right.
Ben Yelin: So his theory is that if he plays copyrighted music in the background, that will trigger Instagram's content filters to take this entire video down.
Dave Bittner: Right.
Ben Yelin: He's trying to kind of subvert the system. And, you know, by playing a song that's subject to copyright restrictions, then the video by this activist would not make it on Instagram.
Dave Bittner: Right.
Ben Yelin: There are a couple of problems with his theory. First and foremost, Instagram's terms of service changed last year, and I think the revised policies allow playing of songs to a greater extent. As long as there's some video element to it and the purpose of the video is not to simply play copyrighted music, Instagram is not going to take it down. And I'm not sure that this law enforcement officer was aware of that policy 'cause he kind of seemed to think, with a little smirk on his face, that he had found the secret solution to this activist taking videos.
Dave Bittner: Right.
Ben Yelin: Yeah. So that's one problem. The other problem is this video has now been up for - let's see. When was this posted? It's been up for a while, meaning Instagram content moderation, you know, whatever artificial intelligence they use for that didn't take it down. And now this article has been out there. You know, it's been public. I guess this was taken February 5, so about five days ago as we're recording this, and the music still hasn't been taken down.
Dave Bittner: Yeah.
Ben Yelin: So just because he plays licensed music, that didn't actually cause Instagram to take down the video. But it was a nice try. I'll give him credit for that.
Dave Bittner: Well, and evidently, he wasn't the only one to do it. Mr. Devermont, the activist here, had another case that he had filmed an officer, and the officer was blasting the song "In My Life" by the Beatles. And, of course, the Beatles are famous for being right on top of things when it comes to having their music taken down.
Ben Yelin: Yes, although somehow they let their entire catalog get purchased by Michael Jackson. But I guess that's a story for a different podcast.
Dave Bittner: (Laughter) So I guess this tactic is making its way around police officers, and they're trying this. And if nothing else, it's annoying to try to have an interaction with a police officer who's on duty, and the police officer just starts blasting loud music because they don't appreciate what you're doing. That seems - I don't know.
Ben Yelin: Not nice.
Dave Bittner: Rude (laughter).
Ben Yelin: Yeah. I mean, the law enforcement officer was being kind of intentionally dismissive. And when he started playing the music in the video, the activist kept saying, like, hey, I want to talk to you about this, and the guy wouldn't respond. And he was like, I can't hear you...
Dave Bittner: Right.
Ben Yelin: ...Of course, because this loud copyrighted music was playing.
Dave Bittner: Yeah.
Ben Yelin: But I don't think, you know, besides potentially annoying this videographer who seems to film a lot of interactions with law enforcement, I don't think this will have any actual impact because of Instagram's actual content moderation procedures, which would probably not lead to this video being taken down. So I think law enforcement in this case is kind of being too clever by half in trying to use this tactic.
Dave Bittner: Yeah. I do remember a case a few years ago where someone had posted a video of their - I think it was a toddler who was dancing to a song by Prince that was playing in the background. And Prince had the song - had the video taken down off of YouTube, I think it was, and, you know, got some blowback from that, people saying, really, Prince? Really? I mean, you know, it's a kid - it's a toddler who's moved to dance by your music, and you're - you can't - you know (laughter), so it seems as though the platforms are kind of moderating their view on this.
Dave Bittner: And I think, also, as time goes on, a lot of the rights holders are moderating their view, and they're seeing that, you know, there's promotional value to this. And you don't want to just be seen as the bad guy who, you know, takes everything down no matter what.
Ben Yelin: Right.
Dave Bittner: It's sort of shooting yourself in the foot.
Dave Bittner: It's interesting here that the Beverly Hills Police Department did email a statement to the folks at VICE, and they said, the playing of music while accepting a complaint or answering questions is not a procedure that has been recommended by Beverly Hills police command staff. And the videos are currently under review, as well they should be (laughter).
Ben Yelin: That is such a funny statement 'cause it's saying absolutely nothing and just, like, using this passive voice, whereas...
Dave Bittner: Right, yeah.
Ben Yelin: This is not allowed by whomever, and we're looking into it.
Dave Bittner: Mistakes were made.
Ben Yelin: Yeah. Mistakes were made by some. We're not going to say who, but - yeah. You know, so they are trying to keep a good PR angle here, yeah.
Ben Yelin: You know, I think you're right that the - and this story obviously doesn't really relate to the artists themselves. I have no idea if Sublime even still exists or how carefully they protect their copyrighted music material. But, yeah, I mean, I think from - to make a general point from the artists' perspective, it's not worth the bad publicity to go crazy in trying to take their music down from platforms when they're used incidentally. And, you know, so I think they are careful about that.
Ben Yelin: But in terms of this being a law enforcement tactic, I don't think it's going to prove to be an especially effective tactic in preventing people from obtaining body camera footage or inhibiting people's First Amendment rights to film law enforcement interactions.
Dave Bittner: Yeah, yeah. It's like I said at the outset, it's a weird one.
Ben Yelin: But such a great story. Cracks me up.
Dave Bittner: All right. Well, again, that's from the folks over at VICE, and we'll have a link to that in the show notes.
Dave Bittner: We would love to hear from you. If you have a question for us, you can call in and leave a message. Our number is 410-618-3720. You can also send us an email to email@example.com.
Dave Bittner: All right, Ben, I recently had the pleasure of speaking with Ameesh Divatia from Baffle. And he has the notion that it may be time for a national privacy referendum. Here's my conversation with Ameesh Divatia.
Ameesh Divatia: It's amazing how privacy has suddenly become almost a fundamental right. I'm sure you saw the outrage very recently about WhatsApp changing its policies or attempting to change its policies and the absolute backlash that they received. So just a simple example of how everybody thinks that privacy is their God-given right and it has to be protected at all times.
Dave Bittner: I mean, do you think that this is sort of people catching up, waking up? I mean, it seems like certainly, for the past few years, we've seen, when I think about social networks, that they've been running rampant, sort of gathering up all of our information willy-nilly.
Ameesh Divatia: Yes, that's true. And I think, you know, there's various adages used for this, but I think one of them that actually rings true in this case is, if you don't pay for a product, you do become the product. So the fact that you don't pay for a social networking service means that whatever you are doing is going to be monetized by that social network or even a commercial entity if it's just even one of your vendors, somebody you buy things from. So it is definitely something that is becoming very central to how people do business. And it's also becoming very important for them to realize that they are being used as an asset. Nothing is free.
Dave Bittner: Yeah. We've had GDPR come to the European Union. And here in the U.S., California has had the CPRA. Where do you think we're headed next? Do you think we might see something on a national level here in the U.S.?
Ameesh Divatia: Absolutely. I think we are going to see something at a national level. The timing of that is really dependent on a lot of other things that are going on, as we all know, from a political perspective. But as I'm sure you know well, there's more than 30 bills that have been introduced in Congress, and they're being actively debated or they will be actively debated over the course of this year.
Ameesh Divatia: If anything, they're a little bit late in getting there. I think the expectation was that this would happen a lot earlier.
Dave Bittner: What about the big players in the space? I'm thinking about organizations like Facebook and Google, again, some of these big social networks and platforms that are in the business of gathering up this data. It seems to me like, you know, they're not going to change their ways easily. They're going to - there's going to be a lot of pushback from them, and they're large companies who can pay for a lot of good lawyers.
Ameesh Divatia: That is true. But I think they also understand that there are two sides to this equation, right? While the fact that they have all of that data is a wonderful asset, they understand the liability aspect of it. And, of course, compliance regulations are making it more of a monetary issue as far as fining them and all of that. But I think they understand that there is a moral responsibility. And more importantly, if they do the wrong thing - case in point, you know, all WhatsApp was trying to do was share information with Facebook, and there was massive backlash. They understand that they have that responsibility. So I'm not sure if they're going to push back as hard.
Ameesh Divatia: I think what they will push on is to make it more homogenous across the country so that they don't have to worry about every state trying to create their own regulation. So I think they'll be supportive of that effort to make it more at a national level.
Dave Bittner: Now, do you suppose we could see something on a global scale? I mean, I'm thinking about how, you know, we have international standards for things like human rights, and many nations sign on to them and agree with them. Not everyone does. But, you know, those are things that have been worked on over the years, and they're standards when it comes to those sorts of things. Is there a possibility that we could see global standards when it comes to online privacy?
Ameesh Divatia: I think we will, but one step at a time, right? I think every nation still is grappling with this. There's only a few that have - obviously, GDPR is probably the furthest along as far as geographical impact is concerned. It's all across the EU. Brazil has a regulation. So do Singapore and Australia and a couple other countries.
Ameesh Divatia: There's a lot of them that are still underway, right? India is still underway. The U.S. is absolutely still underway. So I think it will start happening, and it will start becoming pervasive. But the global standard is definitely a ways away in my mind.
Dave Bittner: What about this notion of perhaps having some sort of national privacy referendum? And is that a possibility? Is that a practical thing that could happen?
Ameesh Divatia: I think it is absolutely practical because if the laws are more homogenized and the laws are easier to understand, it becomes easier to enforce them. And if it becomes easier to enforce them, it also becomes easier to adopt them from a technology perspective, right?
Ameesh Divatia: Technology vendors are looking to do as much as possible to protect the data, but they also want to make sure they don't have to do custom implementation based on the geography. So I think it does make it easier for them to provide the support that's needed to technology companies trying to adopt these regulations, and it also makes it easier for enforcement if it is at a national level.
Dave Bittner: How do you respond to the argument that regulations like these may be easier for the larger companies to implement - again, they have the resources to do so - but it could be a barrier for smaller companies who want to enter this space? These regulatory standards could be burdensome to them.
Ameesh Divatia: I think on the face of it, it will appear like that, but that's exactly where the process of actually enacting them will play out. I mean, you've just looked at what's happening with California, right? CPRA has been enacted, but it won't go into effect until 2023. So it is still two years out. So it gives enough time for the - both the legislative process and the judicial process to negotiate it so that they can come up with the right set of requirements that can be very - I wouldn't say easily implemented, but implemented without a lot of pain.
Ameesh Divatia: Also, we have an entire ecosystem of vendors - right? - security vendors, privacy vendors - who are absolutely in the middle of innovating to make sure that they can make it easier.
Dave Bittner: Yeah. You know, that's a really interesting aspect of this. Not long ago, I saw a story about someone who was putting together some sort of system. I believe it was using artificial intelligence or machine learning - you know, something like that - to basically go through a lot of the EULAs - the end user license agreements - for these different platforms to try to pull out what they really meant and put it in terms that people could understand. And I thought that was a fascinating use of technology to try to streamline this to give people a better understanding so they know what they're up against.
Dave Bittner: What sort of ways do you think technology could do that, could help with implementation and streamlining and enforcement of all of these privacy standards that we may see coming?
Ameesh Divatia: Sure. So it always starts with data discovery first, right? What is sensitive data? And, again, the regulations are doing a good job of defining what is considered sensitive data from an individual's perspective. But the tools have to come in and actually detect where that sensitive data is sitting because most companies don't even realize where it is.
Ameesh Divatia: So once you discover that particular data and you classify it as sensitive, that's when the policy infrastructure kicks in as to what do you do with that. So, for example, in some cases, you may want to just completely mask that data to be never used again, like a Social Security number. Or you may want to protect it in such a way that it can be processed downstream.
Ameesh Divatia: So once that has happened - think of it as a pipeline, so discovery, classification and then storage - so we have to store it in a protected manner - and then, finally, processing. So the last stage - it is possible now with technologies known as privacy-enhanced computation or privacy-preserving analytics to actually process that data downstream without revealing the contents of that actual record or actual information.
Ameesh Divatia: So that's how the technology industry or the ecosystem is actually addressing that - to be able to do this, to be able to collect data but still be responsible about it. And this is what every company will have access to, especially when they go into cloud infrastructure. So it will really level the playing field, and it will not be unfairly tilted towards the larger companies who can adopt these regulations and the smaller companies who cannot.
Dave Bittner: Yeah, that was actually - you lead to my next question, which is, you know, is this an opportunity for companies to provide these sorts of services so that folks can have turnkey ways to make sure that they are meeting the requirements of the regulations and also doing right by their customers?
Ameesh Divatia: I think there is. And I think this is where cloud providers are going to play a very big role - right? - because of the fact that the - again, much like the laws have to homogenize, the infrastructure needs to be homogenized as well so that these tools can work seamlessly. And that is well underway.
Ameesh Divatia: There will be some burden to that. I don't think a small doctor's office will be able to automatically comply just because they can buy some software and put it on their server. They may have to go to a service provider's cloud, which makes it easier for them to adopt these tools.
Ameesh Divatia: The good news is, given the pandemic, a lot of these - call it data transformation and cloud migration has actually been accelerated by it. And that is an opportunity to actually adopt these new tools and comply in the process to make sure they can win their customers' trust. That's really what it's all about at the end, right? Privacy is about winning the customers' trust.
Dave Bittner: Yeah, it seems to me like, by prompting people to move to things like cloud services, I mean, you know, you may get them to come to cloud services to beat these regulatory standards and so forth, but at the same time, in doing so, a whole lot of other things about their systems are going to be better. You know, they're going to be able to take advantage of some of the security things and the backup things - all those sorts of things that come with moving things to the cloud. And this gives them the impetus to do so, to stop lagging, to stop putting it off.
Ameesh Divatia: Exactly. It is all part of that overall digital transformation theme, which was already underway. The pandemic just accelerated it.
Dave Bittner: Yeah, absolutely.
Ameesh Divatia: Again, I just want to emphasize the fact that, you know, privacy has suddenly become very, very personal. Every individual considers that a right more than anything else. And it is time for the government or the legislation to actually catch up and make sure that they enact the right kind of security laws. Security laws are best enacted at a centralized level because it makes - you know, it makes it easier to enact them. But enforcement will have to be local. So the states will have to step up to enforce them.
Ameesh Divatia: And I think that's where we'll achieve this happy medium where consumers are willing to share their data, knowing fully well that the data collectors will be very, very responsible about them because if they're not, their business will be adversely affected. It's no longer - security and, as a consequence, privacy is not just a necessary evil, right? It's a competitive differentiator. If you take care of your data, you will get more business from your customers.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: Can I get on my soapbox for just a minute?
Dave Bittner: Absolutely.
Ben Yelin: So, first of all, I certainly thank the guest. It was a provocative interview. I thought it was really interesting.
Ben Yelin: I am firmly against national referenda. I'm largely against state referenda. But a particular...
Dave Bittner: Well, you're a California boy, so you know of what you speak, right?
Ben Yelin: I do know of what I speak, and it is certainly not a pleasant voting experience to get a, you know, 500-page handbook explaining what these propositions mean...
Dave Bittner: (Laughter).
Ben Yelin: ...Where no layperson could possibly understand what they're voting on.
Ben Yelin: You know, national referenda - first of all, we don't have a national voting system in this country. There are no national elections. They're all administered at the state level. The only national office is the office of the president, and that's determined through the Electoral College, which is done at the state level. So we just don't have that infrastructure in place to even have that referendum.
Ben Yelin: More broadly, the reason we elect people to make decisions for us is we can't all be experts on the policy issues of the time. And we elect representatives to gain expertise for us and make these decisions on our behalf. And if we're not happy with their decisions, we can replace them.
Ben Yelin: What we've seen in California is they try and get the public to vote on things where the public might not have preconceived views on the subject, but it's very difficult for the public to accurately understand what they're voting on. And I think the same danger would be present if we had a national referendum on privacy legislation.
Ben Yelin: Your interviewee tried to say, well, you know, this would simplify things because companies wouldn't have to go state by state to figure out what they're - what kind of laws they'd have to comply with. But leaving it up to the public just creates all of these new dangers.
Ben Yelin: We saw it in California last year when there was a referendum making technical corrections to the California Consumer Privacy Act, and it was a disaster. I mean, I think we talked about it on this podcast. The experts in the field, the activists on either side had a hard time wrapping their head around what the proposition would actually do, let alone what that meant to the layperson who doesn't know how to operate a smartphone.
Ben Yelin: So that's my soapbox. I'm against national referenda. It's an interesting idea. I've seen it proposed in other contexts in the past. But not my cup of tea.
Dave Bittner: Yeah. So the way you would propose coming at this is contact your representative, let them know that some sort of national legislation is something you'd support and pressure them that way?
Ben Yelin: Absolutely. Or, you know, part of activism is organizing. It's not just contacting your legislator, but, you know, getting a group of interest groups together, a group of similarly minded activists and coming up with pressure campaigns, you know, testifying in front of your members of Congress, doing what you can at the state legislature. That's the kind of tough work that makes democracies tick. I think that's preferable to having a national referendum on an issue that, even if it were simplified, is still relatively complex.
Dave Bittner: Right. All right. Well, again, our thanks to Ameesh Divatia from Baffle for joining us. As Ben said, certainly a provocative idea and food for thought, absolutely.
Ben Yelin: Absolutely, yes.
Dave Bittner: That is our show. We want to thank all of you for listening.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.