Caveat 3.2.23
Ep 163 | 3.2.23

Cybersecurity and SMEs.


Melanie Teplinsky: It's like you are asking someone to go out and buy a car and saying, but what you need to do is buy the wheels and the engine and the steering wheel and figure out how to put the car together yourself. And that's what these small companies are trying to do when they're trying to craft a cybersecurity program.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben and I delve into what happened at Supreme Court oral arguments for Gonzalez v. Google, a major Section 230 case. And later in the show, Ben speaks with Melanie Teplinsky, senior fellow at American University's Washington College of Law. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we are going a little bit off form this week - the conversational part of today's show is going to be all about Gonzalez v. Google and the Supreme Court oral arguments, which I know you sat next to your computer with a big bucket full of popcorn listening to while they were being broadcast, right? 

Ben Yelin: Yeah. If you remember, like, the longest movie you've ever watched... 

Dave Bittner: It's like watching the entire "Lord of the Rings"... 

Ben Yelin: Trilogy? Yeah. 

Dave Bittner: ...Trilogy in one sitting (laughter)? 

Ben Yelin: It certainly felt like it was that long. I mean, I believe oral arguments lasted something like an hour and 45 minutes... 

Dave Bittner: OK. 

Ben Yelin: ...Which is unusual. There were three attorneys who argued in front of the Supreme Court because it's both the parties, and then the government took Gonzalez's side and kind of had their own separate argument. But yeah, it was definitely a marathon. I went through both the transcript and the audio of the oral argument, just so I could kind of fully understand it in all of my senses, and I got a lot of thoughts. I got a lot of thoughts. 

Dave Bittner: All right. Well, let's go through it together here. How do you want to come at this? 

Ben Yelin: So just a very quick overview. There were actually two cases that were heard last week. One is Gonzalez v. Google, and the other is Twitter v. Taamneh. The Taamneh case is more - well, actually, to give some background first, just in case you haven't been following these stories - the facts in each of these cases, for legal purposes, are basically identical. It's a family who's been affected by terrorism. In the Gonzalez case, this was a young woman who was killed in the 2015 terrorist attacks in Paris, and they are suing social media companies. So in the Gonzalez case, it's Google, the parent company of YouTube. And in the Taamneh case, it's Twitter. The Twitter case turned more on the exactitudes of the anti-terrorism statute - the justices are trying to discern exactly what the statute means by aiding and abetting terrorists, which is very interesting, but not as interesting for our purposes. 

Dave Bittner: OK. 

Ben Yelin: For our purposes, Gonzalez v. Google is really the marquee show in town because that's getting at the intricacies of Section 230 of the Communications Decency Act, which deals with immunity for publishers of content. So this is a bill that was drafted in 1996. It is a bill that predates the era of social media algorithms, although some of the oral argument - the justices were contesting whether that was actually the case. But in Gonzalez, the justices are wrestling with how to interpret this provision where companies are given immunity for their content moderation decisions on their platforms - how to interpret that in light of modern algorithmic technology. 

Dave Bittner: Yeah. 

Ben Yelin: So the specific allegation here is that YouTube is recommending videos, and that is - it's not simply publishing content. That is the creation, if you will, of YouTube and its parent company, Google. And therefore, Google should not be immune under Section 230. And that was essentially the argument of the attorney for Gonzalez. He focused pretty intently on the thumbnails. So you search Google for an ISIS video, and it's going to recommend content based on the fact that you searched for that ISIS video. 

Dave Bittner: Right. 

Ben Yelin: And it does that in the form of these thumbnails, which are obviously just little pictures of the next video, with perhaps some accompanying text. 

Dave Bittner: And these days, more likely than not, someone making a weird or funny face. 

Ben Yelin: Yeah, exactly. 

Dave Bittner: Right. 

Ben Yelin: They always got the weirdest screenshot in those thumbnails. The argument of the Gonzalez attorney is that these thumbnails are suggestions. Those suggestions are the product of a conscious, creative action on the part of YouTube. And therefore, they should not be immune under Section 230. Tech companies were freaking out about this case - I think for good reason - because, with the status quo, the immunity extended by Section 230, at least as the lower courts have interpreted it, is quite broad. Unless these companies are explicitly acting as creators - unless it's coming from the mouths of Twitter, from the mouths of YouTube - unless they themselves made the video, they are immune from lawsuit for whatever else happens on their platform. 

Ben Yelin: And the concern among these big tech companies is that the Supreme Court was going to basically rip Section 230 to shreds and say that even an algorithm - a content-neutral algorithm that makes recommendations - that doesn't count as the acts of a simple publisher. That, in and of itself, is content. And the justices were very skeptical of that argument. And frankly, I think the Gonzalez attorney did a poor job of explaining himself. You can kind of think of the parade of horribles if the simple act of having recommendations and these thumbnails led to liability on behalf of Google. And they made a lot of different analogies to kind of illustrate that point. One of them is a simple search engine. 

Dave Bittner: Right. 

Ben Yelin: When you type anything into Google, it makes some type of algorithmic decision of - to determine which results to show first. So one of the attorneys in the case gave a great example of searching the word football. That's going to bring you different results whether you're in the United Kingdom or in the United States... 

Dave Bittner: Ah, OK. 

Ben Yelin: ...Because football means something very different overseas than it does here. So they are making recommendations, even though they're not as explicit as, here are some videos that we think you should like. 

Ben Yelin: But you can see why this would freak out the big tech companies because, basically, this would pull Section 230 out by the roots. Unless results were displayed literally randomly or in alphabetical order, they would be making some type of decision as to which content to display - they would potentially be liable for those decisions. Now, the attorney for Gonzalez said, well, you know, they might be liable, but are people actually going to sue for, you know, the fact that somebody comes up at the top of a search engine? And what I think the justices were arguing is, yes, potentially, they could be sued for a whole bunch of reasons - maybe just negligence. Let's say Google decided to sort all of its results in alphabetical order. My last name is Yelin. I might suffer some economic harm because my - you know, my results would never make it to the top of the pile... 

Dave Bittner: Right. 

Ben Yelin: ...If that makes sense. 

Dave Bittner: I'd be sitting pretty here over on Planet Bittner (laughter). 

Ben Yelin: Exactly. Yeah. You are definitely sitting pretty. 

Dave Bittner: Right. Right. 

Ben Yelin: So I think there were actually really intelligent, constructive questions from all of the justices of all ideological stripes, kind of questioning where the line should be drawn as to what counts as the creative content on the part of the platform. And they seemed very skeptical that recommendations for future videos would count as creative content beyond the normal organizing function of a publisher. A publisher, just by the nature of publishing something, is going to have to make some type of organizational decisions. If you're a website, you have to choose what's on the home page of your website. If you're a search engine, you have to optimize to figure out, you know, which results should make it to the top. And if companies are held liable for those decisions, then that - I think in the words of the big tech companies - is going to break the internet as we know it. The internet is just going to be flooded with lawsuits. 

Dave Bittner: (Laughter). 

Ben Yelin: And it would go against the spirit of the Section 230 statute, which was designed to let the internet flourish so that we're not punishing companies for their content moderation decisions. 

Dave Bittner: Can I be a little contrarian and ask the question, why do we think the internet as we know it is so darn good? 

Ben Yelin: That's a very... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Legitimate question. You know, I think - I mean, this is something that the justices said in the argument. You would still have an internet; it would just be very anodyne. Like, it would be very vanilla because people would be so afraid of controversial material being posted on a platform that would potentially lead to liability. If you're constantly afraid of getting sued, you're not going to do anything remotely controversial. 

Dave Bittner: Right. 

Ben Yelin: So YouTube, you know, all of its videos would be the most innocuous - it would basically be a children's channel (laughter). Like, nothing that's remotely controversial could be posted... 

Dave Bittner: But would it? 

Ben Yelin: ...On the platform. 

Dave Bittner: I mean, because - I mean, there's a difference between posting and surfacing, right? 

Ben Yelin: Right. So that's actually a major element to this case that's kind of a catch-22. Neither party is arguing against broad Section 230 immunity. Even Gonzalez would admit that immunity exists for the fact that ISIS videos are on the website to begin with. 

Dave Bittner: OK. 

Ben Yelin: The question is whether the company should be liable for their recommendations and whether having that thumbnail of a video you might like actually counts as a type of recommendation. So it's sort of weird, if you step back and think about it, that the companies bear no responsibility for the fact that ISIS videos are on their site, are being posted on their platform, but they do have some type of responsibility simply by saying, you know, oh, you liked this one video; here's a very similar video about the same topic. So certainly, I think that's something that the justices were kind of prying at and trying to get to the bottom of. And the Gonzalez attorney didn't really have a good answer. So I'm halfway through this argument. I'm thinking, yeah, this Gonzalez party seems like they're in a pretty desperate position. I think they're probably going to lose the case. 

Dave Bittner: OK. 

Ben Yelin: And then the Google attorney comes up, and I think she had some of her own difficulties. So a big question is, does it matter that the algorithm is neutral as it relates to content? So it's not like Google and YouTube designed an algorithm purposely to highlight ISIS videos. 

Dave Bittner: It just seems that way (laughter). 

Ben Yelin: Exactly. It just seems that way to the user. 

Dave Bittner: Right. 

Ben Yelin: They designed an algorithm that, no matter what you search for, they are going to find similar videos that you might find interesting. 

Dave Bittner: Yeah. 

Ben Yelin: So the natural next question to the Google attorney is about that hypothetical. Under their interpretation of Section 230 immunity, what would happen if Google really did design an algorithm specifically to favor videos from ISIS? And the Google attorney answered, we think that Google should have immunity even in those circumstances - even if they have jury-rigged a recommendation algorithm to promote terrorist content - because under their view of the four corners of the statute, they didn't create that content. They weren't the ones who created those videos. A third party created those videos, even though they're the ones promoting it through their algorithm. And I think that just really shocked the conscience of these justices, who were extremely skeptical of extending immunity that far. So you have the Gonzalez attorney who really struggled because I think the justices were worried that too much activity on the part of these platforms would subject them to liability. And then you have Google coming on, and it's the opposite problem. The justices think that we might be too permissive of some of the decisions that these companies are making. So where does all of that leave us, is the question. It is always a fool's errand to try to predict the outcome of cases... 

Dave Bittner: (Laughter) But nothing is foolproof to a talented fool. 

Ben Yelin: Exactly. Everybody tries to predict outcomes based on oral arguments. It's successful maybe 50% of the time. 

Dave Bittner: OK. 

Ben Yelin: Sometimes justices are asking questions just to augment their own understanding. They're not revealing their own preferences. But I think we have a little bit of a hint as to how this might turn out. I think what we can rule out is sort of two extreme outcomes that were in consideration prior to this oral argument. One extreme outcome was that Section 230 was basically going to be rendered moot by a Supreme Court decision. Justice Thomas had written a concurrence in a previous case suggesting that Section 230 was outdated, and there was the thought that this case might be the vehicle where the Supreme Court decides that Section 230 is outdated - that it wasn't intended to protect, or to immunize, a lot of the decisions that these large platforms are making at all. And therefore, Section 230 is largely rendered meaningless. 

Dave Bittner: And in that case, if they did that, would then - then it gets tossed back to Congress to make a new law? 

Ben Yelin: It would. And some of what the justices were arguing - and we'll get to that in a second - is it really should be up to Congress to determine what the statute means and exactly what type of activity it protects on behalf of these platforms. 

Dave Bittner: OK. 

Ben Yelin: But that in and of itself is a difficult proposition because Congress has a hard time doing anything. 

Dave Bittner: Right. 

Ben Yelin: And members of Congress disagree on the reasons that they dislike 230, so it'd just be hard to foster any type of meaningful agreement. 

Dave Bittner: OK. 

Ben Yelin: So I think the justices - I think we can rule out that outcome where Section 230 is just destroyed with an ax, right? 

Dave Bittner: Yeah. 

Ben Yelin: The other potential outcome that I think people were worried about, and that I think we might be able to rule out based on oral arguments, is that big tech companies can get away with literally anything. There is this concern - and this kind of reflects the status quo - that there are no lines, no restrictions on any decision that these big tech companies make as long as they are not the ones creating the content. As long as they're just posting the third-party videos, no matter what decisions they make in terms of promoting those videos, in terms of their algorithms, they can, per se, never be held accountable in a court of law. It seems to me that that formulation is unlikely to be the conclusion in this case, just because the justices seemed very interested in this line-drawing exercise and in not letting these companies get away with, for example, designing an algorithm to promote terrorist content, right? 

Ben Yelin: So I think we're going to land somewhere in the middle. There are a few justices - and this was represented, I think, mostly by Justice Kavanaugh at the argument - who were concerned about the practical effects of cutting away at Section 230. They kind of believe that by destroying 230, you might destroy the internet as we know it. And these are kind of your big-business conservatives who are concerned about whether the free market of the internet can actually prosper if these companies are furiously concerned about liability. 

Dave Bittner: Right. 

Ben Yelin: So I think the solution for these justices might be to largely maintain the status quo of Section 230 - keep it a wide liability shield - while perhaps not extending it as far as shielding big tech companies for promoting terrorist content through algorithms. 

Dave Bittner: And how could they do that? What does the Supreme Court have the power to do in terms of enabling that kind of nuance? 

Ben Yelin: Well, I mean, they are the ones who are going to potentially draw the line here, and it depends on how they define the provisions of Section 230. What's really interesting to me is there is a textualist argument about Section 230 which really looks at the words in the statute and what the statute was specifically designed to do. The statute was designed so that if companies made content moderation decisions - you know, let's say you're taking certain smut off your website... 

Dave Bittner: Right. 

Ben Yelin: ...They want to incentivize those companies to do that. So they don't want to subject companies to liability for not taking other smut off their website, because by taking some smut off the website, that's at least a hint to law enforcement or whomever that you have the ability to take other smut off the website. 

Dave Bittner: OK. 

Ben Yelin: And they don't want these big businesses to be put in that position. That's why there is that liability shield. 

Dave Bittner: I see. 

Ben Yelin: And the textualist argument was really made most acutely by Justice Jackson, the newest justice, who seemed to be saying this whole conversation is largely out of bounds with the original purpose of the statute, which was about - was specifically about content moderation decisions. And I think in her view, Section 230 wouldn't cover any of the activity that these companies engage in in terms of recommendations and algorithms because that's outside the original purpose of the statute, which was about taking down third-party content. So she might be the textualist vote that sides with Gonzalez. 

Ben Yelin: And then I think there's kind of a murky middle here represented largely by the chief justice and then Justice Kagan, who I think in good faith were trying to find where that line is between simply being a publisher and being the type of organization that's promoting certain content through these thumbnails or through these recommendations. And it's possible that they draw the line themselves based on their own reading of the statute. It's possible that they remand the case to a lower court to have some significant factual finding based on the relevant state common law as to where that line should be, or they leave the status quo where it is, but kind of push the issue over to Congress, saying it's not our job to draw that line. But, you know, we're in kind of a danger zone, so Congress should really step in and come up with some type of equitable conclusion here so that we know what is simple publishing and what is creative content. So I think we're going to end up somewhere in that murky middle, where Justice Jackson is the vote for reading the statute literally and having it only apply to decisions to remove content - third-party content - and then the other justices kind of looking for where a proper line would be in determining these cases. 

Dave Bittner: How did the justices come across in this sort of thing? I mean, was it Justice Sotomayor who joked about that they're not experts on the internet? 

Ben Yelin: It was Justice Kagan. 

Dave Bittner: Kagan, all right. 

Ben Yelin: She got a huge laugh... 

Dave Bittner: That's right. All right (laughter). 

Ben Yelin: ...When she said basically, like, look at the nine of us - we're the last people you want to listen to on the internet. 

Dave Bittner: Right. So how did they - to what degree did their questions indicate knowledge and competence here? 

Ben Yelin: I was pleasantly surprised. 

Dave Bittner: OK, good. 

Ben Yelin: They know the basics. I don't think you have to be an expert in the internet and cybersecurity and data privacy - whatever - to know the basics of the legal issues involved here. And all of the justices were asking really probing questions because this is a really difficult issue. And there wasn't a lot of grandstanding. This wasn't an ideological battle about Section 230. You didn't hear complaints from the conservative justices about, you know, we should remove Section 230 immunity because these companies are biased against conservatives. 

Dave Bittner: I see. 

Ben Yelin: You didn't hear questions from the liberal justices about - or to any large extent about, well, we need to remove Section 230 immunity because not enough of the content on these sites is moderated for misinformation or calls for violence, et cetera. 

Dave Bittner: Yeah. 

Ben Yelin: It really was a good-faith effort to work out this really difficult issue, where it's hard to know exactly where the line is. You know, one thing I can say is I don't think the attorney for Gonzalez was able to properly describe what that line should be. I think he did a poor job of doing so, which is why I thought, before I got to the second part of the argument, that Gonzalez was doomed in this decision. But I also don't think the attorney for Google offered the type of justiciable test that the Supreme Court is ready to adopt because I think their argument would lead to too much immunity for some very explicit decisions that these platforms would potentially make. 

Dave Bittner: So what's our timeline here? Where do we go next? 

Ben Yelin: So this decision - we had oral arguments this week, and the term for the Supreme Court ends at the end of June. My guess is, because this is such a complicated case, that this is the type of thing that'll come out at the very end of this term. So if I had to pinpoint a date, I would say the last couple of weeks of June is when we would get a decision here. And I'm sure, when we get it, you and I will cover it in great depth. 

Dave Bittner: We will swing back around. 

Ben Yelin: Yep. 

Dave Bittner: All right. Well, it's interesting stuff for sure. And stay tuned, right? 

Ben Yelin: Yeah. It was really fascinating. I know most people aren't going to listen to all one hour and 45 minutes-plus of the oral argument, but it was illuminating, and it really was our Supreme Court at its best. I mean, I think all of the questions were really in good faith. They were teasing out different hypotheticals. There were three mentions of rice pilaf, the food dish. 

Dave Bittner: (Laughter). 

Ben Yelin: I'm not going to explain why there were three mentions of rice pilaf in the transcript. I'll let you discover that on your own. 

Dave Bittner: Just a fun Easter egg for our listeners? 

Ben Yelin: Exactly. 

Dave Bittner: OK, fair enough. 

Dave Bittner: Ben, you recently had a conversation with Melanie Teplinsky. She is senior fellow at American University's Washington College of Law. Really interesting conversation - let's have a listen. 

Melanie Teplinsky: I got this - my start in the field - my dad and I used to do the cryptogram every weekend in The Washington Post, and I got interested in puzzles and cryptography at a very early age. I think I was 8 when I did my first cryptogram. And I started reading about the history of cryptography, and I learned that it played a very important role during World War II. In particular, William Friedman, who was a mathematician, figured out how to read the Japanese codes that were being used in the Pacific, and he did that without knowing any Japanese. And I thought that was fascinating, so I tried to understand how they did this. And I learned that a predecessor to the National Security Agency at the time had a group of code-breakers, all of whom were working to use math to figure out how to break codes. And I was a total math geek. I went through school, loved math, favorite subject all the way through. And I decided this was the job for me. And that's eventually how I went to the NSA at age 16 as an analyst. 

Ben Yelin: It's really a fascinating story and I think probably an unusual origin story for those of us who are interested in cybersecurity, so it caught my eye. Before we go further on your biography, though, I do want to talk about this op-ed that you co-wrote in The Hill. It's entitled "We Need a Cybersecurity Paradigm Change." Can we start just very high level? What is the policy problem that you see in the cybersecurity landscape? What's the problem that you're trying to solve with your paradigm shift here? 

Melanie Teplinsky: Sure. So this article was trying to address a long-standing problem. It's the private sector's vulnerability to cyber intrusions, particularly from well-resourced threat actors such as China and Russia. China has been, some folks say, stealing us blind. Russia has also been in our networks. And this problem was brought into stark relief by a variety of headline-grabbing cyber incidents over the last few years. Colonial Pipeline, certainly, in 2021, when we all felt the real-world effects of a ransomware attack on our critical infrastructure. DarkSide, the ransomware group that effectuated this attack, locked Colonial's computers and demanded a ransom to unlock them. And Colonial, which supplies about half of the East Coast's fuel, temporarily shut down its operational technology. And this resulted, as we all know, in panic buying, fuel shortages and price hikes. So this really highlighted the vulnerability of U.S. critical infrastructure to ransomware attacks. 

Ben Yelin: And in terms of your approach to a policy solution, how do you foresee policymakers trying to address the shortage of subject matter expertise in the cybersecurity field, in the private sector? 

Melanie Teplinsky: Right. So what's been going on with SMEs - small and medium-sized enterprises? They're basically small businesses. And the problem is that, of course, they are not in a position to defend themselves against these kinds of nation-state threats on their own. So there's the Colonial attack, the Microsoft Exchange attack, the SolarWinds attack. None of these are the kinds of things that our small and medium-sized businesses can themselves protect against. And we've seen this particularly in the defense industrial base, where even the Navy's own Readiness Review found that the Navy is hemorrhaging critical data as a result of cyber theft. 

Melanie Teplinsky: And so what we were looking for was a way to address the problem of small companies that are extremely important, whether to our national security because they're in the defense industrial base or whether they're important to our national innovation base because they're working on emerging and foundational technologies, like artificial intelligence or biotech or pharma. And so our proposal was to try to spur the development of an entire industry of cybersecurity providers that would have expertise and would be able to provide the kinds of services that these small and medium-sized businesses and defense industrial base companies would need in order to protect themselves successfully. 

Ben Yelin: So I'm going to come up with an analogy here, and you can tell me if it's way off base. Is this sort of like a cybersecurity first responder? So the way that employees at small and medium-sized businesses don't know - might not know how to perform CPR or might not know how to fight fires or stop intruders, is it the same sort of vision for these subject matter experts? 

Melanie Teplinsky: So I would say, while first responders are necessary, this is more of a left-of-boom approach, meaning it's before the attack. It's what kinds of cybersecurity protections do we have to have in place in order to protect our companies. Does that make sense? 

Ben Yelin: Yeah. So, extending what's admittedly a poor metaphor, it's almost more the role of an emergency planner rather than the first responder. 

Melanie Teplinsky: Exactly. Exactly. And it's recognizing in part that right now there are a lot of cybersecurity capabilities available, but there aren't integrated solutions. So it's like you are asking someone to go out and buy a car and saying, but what you need to do is buy the wheels and the engine and the steering wheel and figure out how to put the car together yourself. And that's what these small companies are trying to do when they're trying to craft a cybersecurity program. So our approach has been to say we need to figure out how to spur the development of integrated solutions that would allow companies to buy what they need. 

Ben Yelin: And could you expand a little bit on the role that the zero-trust approach has played in the issues that you've identified in this new paradigm? 

Melanie Teplinsky: Certainly. So the traditional approach to security is perimeter security. And this is a model that's all about keeping the bad guys out of your network. The idea is you post a guard at the entrance to your building. And then, once someone is let into the building, they're trusted, and they're allowed to go anywhere in the building that they want. And zero-trust is essentially an alternative to that traditional model. The idea behind zero-trust is that you would - instead of posting just a guard at the gate, you'd post a guard at the entrance to the building and then at every door, hallway and elevator. And this model assumes the bad guys have breached the network, and it takes a deny-by-default approach to protecting the critical assets that are inside our companies. 

Ben Yelin: So is one of the problem areas that most small- and medium-sized businesses just don't have the institutional expertise to implement this type of zero-trust approach? 

Melanie Teplinsky: Precisely. They lack the expertise. Some companies are not aware that it's a problem. Those that are aware don't have the time, the resources or the expertise to go ahead and implement these kinds of programs. Zero-trust is not something you can buy off the shelf. It's not something that you can purchase. You need to develop a program. It takes time and effort and investment. 

Ben Yelin: So on that question of investment, what could you see in terms of state policy changes or federal policy changes that would encourage the development of this new paradigm that you've identified? 

Melanie Teplinsky: So what we were proposing in our op-ed was a new paradigm which would be based on transferable investment tax credits. The idea would be that Congress would establish a tax credit for qualified companies that are relying on expert cybersecurity providers. The credits, like any other credit, would reduce your taxes on a dollar-for-dollar basis. So the idea would be, just as in the energy industry, where Congress relies on energy investment tax credits to incentivize investment in fuel cells, small wind turbines and these other kinds of technologies that it wants to support - similarly, it would incentivize investment in expert cybersecurity services. 

Ben Yelin: That's sort of the carrot approach. Have you at least considered or thought about if investment was not as much of an incentive? From an ideal policy development perspective, have you thought about what a sticks approach might be, and do you think that would be workable? And if not, why not? 

Melanie Teplinsky: Certainly. So if you go back to about 2011, 2012, there was a bill called the Lieberman-Collins bill. And that bill had introduced the concept of mandatory cybersecurity standards in this country. They would have been federal standards and set a floor for cybersecurity across industries. That bill, despite efforts by, at that time, the Obama administration for passage, including President Obama actually penning a rare op-ed in The Wall Street Journal in support of it - that bill did not go through Congress. It was unsuccessful. And that bill was written in response to the thought that we had a market failure and that cybersecurity companies on their own were not providing enough cybersecurity to protect against the kinds of threats that we were seeing. 

Melanie Teplinsky: Fast forward now - we're expecting - imminently, actually - the White House to release the National Cyber Strategy. And certainly, in the strategy, we do expect that there will be a shift. In particular, there's going to be a shift, we expect, away from market reliance in this space and towards some kind of federal regulatory role. So there has been already work in the administration. The administration has developed some sector-specific requirements applicable in rail transportation, in pipelines post-Colonial Pipeline, et cetera. So there has been work and movement toward a more federal regulatory approach. But as of yet, we don't have congressional legislation that requires federal cybersecurity standards across the board. And even on a sector-specific basis, we only have a handful of requirements. 

Ben Yelin: So one of the themes of our podcast is we have a lot of people who call for congressional action. We spoke a lot last year, for example, about data privacy. There was a series of promising pieces of legislation to establish national data privacy standards. It seemed to be on the verge of passing, and then it died in the lame-duck session, as many pieces of legislation do. So what would be an effective message to get this type of policy change to the top of the pile, where you can actually effectuate change and not have it get lost in the morass of a polarized and dysfunctional Congress? 

Melanie Teplinsky: Right. So one of the interesting things this week was certainly we saw in the news that China was flying a surveillance spy balloon above the United States. And in the community of which I am a part, the reaction was, why is everyone getting so exercised about this when, in fact, China's been... 

Ben Yelin: All your kids are on TikTok, yeah. 

Melanie Teplinsky: Exactly. Right? There's been a long campaign of rampant cyber espionage against the United States. And so I think right now there's an increasing understanding that this rampant espionage really reflects a recognition that economic power is key to national power. And you're seeing authoritarian governments like China using their intelligence services to support their privately owned companies because they view those companies as an extension of the Chinese state. So as the administration shifts in its international relations to a greater understanding of what folks are referring to as the great power competition with China, I think there will be more of a push to try to solve these problems. And our proposal to implement an incentive system, a cybersecurity investment tax credit for purposes of encouraging companies - particularly defense industrial-base companies - to try to up their cybersecurity game, I think those proposals will become much more attractive given the administration's current goals. 

Dave Bittner: All right... 

Ben Yelin: What did you think, Dave? I love when I get to ask you that question. 

Dave Bittner: (Laughter) I know, right? I'm usually the one asking you. I thought it was really interesting. And, you know, I love my analogies, just like you do, and part of - in the early part of this conversation, you were kind of, you know, straining to find a good analogy. And I was - as I was listening along, I was doing the same thing. I was thinking, like, could it be the Army Corps of Engineers, you know? I was thinking - because Melanie said, you want to be left of boom; you want to be preventative, not, you know, reparative, I guess. But... 

Ben Yelin: Right, which is why she rejected my firefighters metaphor, I think with good reason. 

Dave Bittner: Well, so I was thinking, like, is this - could the argument be you have a water treatment plant so you don't need so big a hospital? Right? 

Ben Yelin: I see. So the preventative - yeah. Right. 

Dave Bittner: (Laughter) Right? Right. 

Ben Yelin: The preventative action is cleaning up the water so that people aren't dying of dysentery. 

Dave Bittner: Right, exactly. 

Ben Yelin: Yeah, yeah. 

Dave Bittner: Exactly. That sort of thing. But, I mean, that aside, I think the point is well made that particularly our small and medium-sized businesses are kind of on their own in a way here. And I really do see the utility of having support for them. I would love to see this happen in a way that they don't have to think about it. 

Ben Yelin: Right. 

Dave Bittner: Kind of like the way we generally don't think about the safety of our water. 

Ben Yelin: Right. 

Dave Bittner: We don't think about national defense. 

Ben Yelin: Right. 

Dave Bittner: You know, we - these things happen. 

Ben Yelin: You just trust that somebody else out there is taking care of it. 

Dave Bittner: Right. 

Ben Yelin: And we don't yet have that in the cyber world. 

Dave Bittner: Correct. 

Ben Yelin: I hope we get there. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: And I think it'll be interesting to see, because so much of the available defense in cyber so far has come from the private sector. So can you envision a national antivirus tool or something, you know, (laughter) something for the public good? Or would the government simply provide funding to give people who cannot afford one, you know, a private solution... 

Ben Yelin: Right. 

Dave Bittner: ...At no cost to them? 

Ben Yelin: Right. 

Dave Bittner: Which are both - you know, both viable ways of coming at this. 

Ben Yelin: Adds new meaning to a public defender, right? 

Dave Bittner: There you go (laughter). 

Ben Yelin: The government - if you can't afford cyber insurance, the government will appoint one for you. 

Dave Bittner: Yeah. Right. 

Ben Yelin: Watch your head as we get you into this police car. Yeah. 

Dave Bittner: Right. Right, right. But, I mean, you could see there being some sort of regulatory framework where if you are an ISP, if you're providing internet to people - in the same way that if you're providing water or any beverage or any foodstuff to people, you have to meet certain standards - could there be a regulatory framework where, if you're an ISP, you have to meet a certain standard of security for your... 

Ben Yelin: Right. 

Dave Bittner: And it's not merely the free market determining that, but that it is, you know, put in place by the government. 

Ben Yelin: Yeah. 

Dave Bittner: It's an interesting thing to ponder. 

Ben Yelin: Yeah. 

Dave Bittner: So all right, well, our thanks to Melanie Teplinsky for joining us. Again, she is a senior fellow at American University's Washington College of Law. And we do appreciate her taking the time for us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.