Caveat 12.9.20
Ep 57 | 12.9.20

A pandemic is a perfect environment for disinformation to thrive.

Transcript

Nina Jankowicz: We're seeing that especially during the coronavirus pandemic, people are fearful. People are emotional. There's a lot of uncertainty. And that is a perfect environment for disinformation to thrive.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at the U.S. Supreme Court's potential interpretations of the Computer Fraud and Abuse Act. I've got a report out of the U.K. on AI algorithm oversight. And later in the show, my conversation with Nina Jankowicz. She's a disinformation fellow from the Wilson Center. And we're going to be discussing her book, "How To Lose the Information War: Russia, Fake News, and the Future of Conflict." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some good stories this week. Why don't you kick things off for us? 

Ben Yelin: So I did what every good lawyer would do and spent much of my Monday morning listening to oral arguments from the Supreme Court on C-SPAN, where all you can see is the image of the face of the attorneys or the justices. But I did it for the - exciting stuff. 

Dave Bittner: You lead an exciting, sexy life, don't you, Ben? 

Ben Yelin: Absolutely. But I did it for the sake of our loyal listeners. 

Dave Bittner: (Laughter) I see - so they don't have to. 

Ben Yelin: I know I don't have to justify it to you - or them, to be honest. 

Dave Bittner: Right. 

Ben Yelin: I embrace the nerdom. So, yeah, this was a very important Supreme Court argument in a case called Van Buren v. the United States. It's really the first case dealing with the Computer Fraud and Abuse Act that's made its way up to the U.S. Supreme Court. And that issue is the scope of the act. How much activity does that act actually prohibit? 

Ben Yelin: So a little bit of very brief background - most of our listeners are certainly aware of this. But the Computer Fraud and Abuse Act is a federal law that imposes civil and criminal penalties for unauthorized access to computers. So it was drafted in the 1980s to prevent hacking. Where we get into some legally questionable territory is that it also covers access to certain material, even when a person has authorized access to the machine itself. So if you exceed your authorized access to a computer or to a network, even if you're authorized to be on that network in the first place, that could potentially expose you to criminal or civil liability. 
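
[To make the two competing readings concrete: a minimal, hypothetical Python sketch - not anything from the briefs or the statute - contrasting the narrow "gates-up-or-down" reading Van Buren urged with the broader purpose-based reading the government urged. The User fields and the "official" purpose test are illustrative assumptions.]

```python
from dataclasses import dataclass

@dataclass
class User:
    authenticated: bool    # holds valid credentials for the system
    entitled_records: set  # records the account can technically open
    purpose: str           # stated reason for the lookup

def violates_narrow_reading(user: User, record: str) -> bool:
    # Van Buren's "gates-up-or-down" view: liability only when the
    # account could not open the record at all.
    return not (user.authenticated and record in user.entitled_records)

def violates_broad_reading(user: User, record: str) -> bool:
    # The government's view: an otherwise-permitted lookup still
    # "exceeds authorized access" if made for an improper purpose.
    return violates_narrow_reading(user, record) or user.purpose != "official"

officer = User(True, {"plate:ABC123"}, "private favor")
print(violates_narrow_reading(officer, "plate:ABC123"))  # False - the gate was up
print(violates_broad_reading(officer, "plate:ABC123"))   # True - improper purpose
```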

Ben Yelin: And that's what happened here. There was a guy in Georgia, a police officer named Nathan Van Buren, who was authorized to search computerized records about license plates because he's a law enforcement officer. But he was requested by an FBI informant - and by requested, I mean, pretty much bribed - to search those records for private purposes, which is what he did. And the government charged him criminally, and he was convicted. He appealed, saying that this is an overbroad interpretation of the Computer Fraud and Abuse Act, and it made its way all the way up to the Supreme Court. 

Ben Yelin: So the nature of the disagreement here is, how far does this act extend? What Mr. Van Buren's attorney tried to argue is that under the government's broad interpretation of the law, we would be criminalizing all sorts of innocuous, normal behavior, like accessing Facebook on your work computer or potentially putting false information on a dating website. That was actually one of the things that was mentioned at this oral argument. So the attorney in this case, Jeffrey Fisher, started out his argument with what the justices called a parade of horribles, which is... 

Dave Bittner: (Laughter). 

Ben Yelin: ...The slippery slope of all the things that would be criminalized in the internet age if you had this very broad interpretation of unauthorized access. The government, on the other hand, was arguing that if we don't criminalize this type of behavior, that has its own slippery slope, as well, because then you'd have, you know, government officials who have access to personnel files. Maybe they'd go in and check somebody's private health information. And if, you know, we didn't have a looser interpretation of the Computer Fraud and Abuse Act, perhaps that would not be criminal behavior. And so that's something that would also be of concern. So it was sort of both sides trying to present what the slippery slope would be if the other side's interpretation of the statute were to take hold. 

Ben Yelin: So a couple of impressions from the oral arguments - my sense is that most of the justices were more amenable to the arguments of Van Buren and his attorney that we can't have an overbroad definition of the statute because we'd be criminalizing things that all of us do where we exceed our access. You know, maybe we're on our employer's network, and we decide to do some online shopping. Is that going to subject all of us to jail time or massive civil fines? You know, I don't think that would be good for any of us. Particularly Justices Sotomayor and Gorsuch - certainly not ideologically similar - both seemed to buy into that argument to a degree. 

Ben Yelin: I think there are some justices, particularly Justice Alito, who might have been more amenable to the government's arguments. He did say that this was a very difficult case because, well, the attorney for Van Buren was saying there are other ways to criminalize the government employee looking at his ex-girlfriend's health records. Justice Alito was saying, you know, I don't really know what those authorities are. So I'm sure they're out there, but, you know, it probably depends on what the relevant state laws are and how you interpret other federal statutes. 

Ben Yelin: So if I had to give a guess on - as to how this case was going to turn out, which is always a dangerous thing to do, I think Mr. Van Buren is going to win. I think, generally, courts apply what's called a rule of lenity, which means that when a statute is unclear and ambiguous, you try and come down on the side of the criminal defendant because we just don't want everybody going to jail... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Which I think is a very reasonable canon of statutory construction. That's my read of the case. I know I've gone on for a while now. 

Dave Bittner: Now, I have a couple questions for you. So one thing that's come up in conversations about this is this notion of scraping, which is when, for example, a site like Facebook has a lot of information that's publicly facing, that's accessible, and someone comes in and programs their own system, sets up a bot or some kind of automated system to basically vacuum up that information. And there's an argument that that's in violation of the EULA, of the terms and agreements that you agree to when you sign up to be a user of this site, and so that falls under the Computer Fraud and Abuse Act. Did that come up in these arguments? 

Ben Yelin: It did not. I did not hear any discussion about scraping. I mean, there are a lot of discussions - you know, remember, we're talking about Supreme Court justices here. So they're not (laughter) - they're not really technologically advanced. But there was a lot of discussion about just individual users on major social media platforms - so not related to the activities of the platforms themselves, but individuals like Van Buren who are being subject to the full force of the law when they were doing something that many of us would consider to be relatively innocuous. So, yeah, there was less of a focus on regulating the behavior, at least in this case, of big tech companies. I think that's just because, you know, the nature of this case is it was about one individual. 

Dave Bittner: I see. 

Ben Yelin: It wasn't about the behavior of Facebook and Instagram. So I think the Supreme Court is somewhat confined to the particular facts of the case. 

Dave Bittner: So the other question for you - I was reading commentary on this yesterday in The Wall Street Journal, and they said a decision on this case is expected by June. Well, that struck me. I wondered, what's going to take so long? (Laughter). 

Ben Yelin: So that's a good question. I'm not sure it will take all the way until June. June is when the current term of the Supreme Court ends, which means they have to make a decision by then. 

Dave Bittner: OK. 

Ben Yelin: It's hard to game out the timeline of Supreme Court decisions. Generally, ones that are more controversial probably take a little longer because, you know, the justice who's in the majority has to write an opinion, and then the justice in the minority has to write a dissent, and then the opinion has to respond to the dissent. So that generally takes a little longer. So if there is some disagreement here, I think that could move the timeline back a little bit. I would guess that we're going to get a decision well before June. My - if I had to ballpark it, I would say it's probably like a February or March type of thing, which is usually the turnaround time. So we should have some clarity by then. 

Dave Bittner: OK. What do you think the broad reach of this is? Could this retroactively affect things? What sort of thing - and then going forward, what do we see coming out? If they rule for Van Buren, what happens next? 

Ben Yelin: So if they rule for Van Buren, that doesn't mean that everybody who's previously been convicted under a broad interpretation of the statute suddenly gets out of jail (laughter) because we have a good faith exception. If law enforcement, when they did the prosecution, were relying on a good faith interpretation of the law, then that argument is going to stand. Mr. Van Buren himself would have his conviction vacated, and there might be some other circumstances where people who have been convicted either through federal petitions or through appeals try and use the justification of this case to get out of their sentences. 

Ben Yelin: You know, the thing is, not that many people have been criminally punished the way Van Buren has for unauthorized access to a system. That's one of the things that the justices actually mentioned here - and it's something that was invoked by the attorney for the government. It's like - it's not that the government is seeking out all types of federal cases to punish grandma when she goes on Facebook and, you know, exceeds the authorization given by her employer. I don't think that's, you know, what federal prosecutors are out to do. That's not their intention. What Supreme Court justices were saying and what the defense attorney said here - Mr. Van Buren's attorney - is that's not the way you should judge laws. Laws should be judged on their face, and, you know, you should try to deduce the proper interpretation of the law, no matter how it is being enforced, because eventually, there could be a Justice Department that decides - you know what? - I want to make it a priority to bust every single person who exceeds authorized access. So, you know, I think there was a lot of truth in that argument, as well. 

Dave Bittner: Yeah. That's fascinating to me, too, because it strikes me that the way things work, you know, out in the real world, there are some laws that are proactive and some laws that are reactive. You know, like you say, there are some where we go out looking for folks who might be up to no good, but then there's other ones where it's good to have this on the books just in case we need it. But we're not necessarily out there looking for - like you say, to bust people on things. 

Ben Yelin: Right. And I think that's the point that the Department of Justice attorney was making here is it's very rare that we're going to have federal prosecutions based on the type of behavior being described by the defense attorney in this so-called parade of horribles. And, you know, I think to a certain degree, that's a reasonable point. You know, slippery slopes often don't come true, and I think it's important to point that out. 

Ben Yelin: But the other side of that is you would still be giving this weapon to federal law enforcement. And based on what Justice Gorsuch said - and he's a conservative justice - he just thinks there are too many laws on the books that give the U.S. Department of Justice the ability to criminally prosecute people. And this is just another one of those statutes. You know, and in his mind - this is an exact quote from Justice Gorsuch - "This is the latest in a rather long line of cases in which the government has consistently sought to expand federal criminal jurisdiction in pretty significantly contestable ways that this court has rejected." 

Ben Yelin: And I think what he's essentially saying there is if we have federal criminal statutes that get overbroad, we're just going to have a greater number of people going into the criminal justice system, into the federal criminal justice system for crimes that, frankly, aren't worthy of the Department of Justice's time or resources. 

Ben Yelin: So, you know, I think - to me, that point is very well taken. And it perhaps gives some indication as to how the justices are leaning here because if they did take this very broad interpretation of the statute and, you know, the Department of Justice decided to prioritize this, a lot of people potentially could get prosecuted, and we'd start to overcrowd our federal prisons. And when you combine this with a lot of other federal statutes that introduce criminal penalties, you start to kind of get into a situation where you're overwhelming our federal criminal justice system at a time when there's sort of been a bipartisan movement to reverse those trends. So, you know, I thought that was a really interesting point that he made in oral arguments. 

Dave Bittner: Yeah, very much so. All right. Well, time will tell on this one - certainly interesting indeed. 

Dave Bittner: My story this week, this comes from ZDNet. It's actually a story coming out of the U.K. So perhaps in that case it's ZedDNet (ph). 

Ben Yelin: Oh - nicely done, sir. 

Dave Bittner: Thank you. Thank you very much (laughter). So it's an article by Daphne Leprince-Ringuet, I suppose. Again, if she's European, it might be French. I don't know. Daphne, I hope I got that right. Apologies if I did not. And the article is titled "The Algorithms are Watching Us, but Who Is Watching the Algorithms?" And this involves a two-year investigation that the U.K.'s Centre for Data Ethics and Innovation did, looking at artificial intelligence and how it's being used and whether it's being used properly and what sort of guardrails might need to be put on this process. It addresses a lot of the things that we've discussed here. Of course, policing comes up. They talk about how the Met Police over in the U.K. used a tool called the Gangs Matrix to identify folks at risk of engaging with gang violence. But they found that it was based on out-of-date technology and - wait for it - it disproportionately featured young Black men. 

Ben Yelin: I am shocked - shocked. 

Dave Bittner: Right? Yes. (Laughter) So... 

Ben Yelin: How dare they be serving alcohol in this establishment? Yeah. 

Dave Bittner: Right. So activists voiced concerns, and they overhauled the database to reduce the representation of individuals from Black African Caribbean backgrounds. That keeps popping up time and time again - these systems, for whatever reason - and that reason seems to be the data we're putting into them - have these biases. Also, I just think there's something creepy about predictive policing. You know? (Laughter) 

Ben Yelin: Yes, there certainly is, yeah. I mean, it goes against our notions of justice to try and predict criminal behavior before it actually happens. You know, that is sort of counter to our values of things like the presumption of innocence. So yeah, yeah. 

Dave Bittner: Right. 

Ben Yelin: I certainly see that as problematic. 

Dave Bittner: Yeah. One of the points that this article makes is how important it is to build trust with the community as they're deploying these systems that use artificial intelligence. And they said that up to 60% of citizens oppose the use of AI-infused decision making in the criminal justice system. And the vast majority of respondents - they said 83% - were not even certain how these systems are used in the police force in the first place, which seems like there's an information gap there. They need to do a better job of educating the general public as to what's going on here. Very interesting. What's your take on this, Ben? 

Ben Yelin: Yeah, I'll first talk about the public opinion. And you've heard me say this before. I'm always skeptical of polling like this just because maybe 95% of the people who answered this poll really have no idea what AI is or how it's used. 

Dave Bittner: Yeah. 

Ben Yelin: And it's very prone to changes in, like, question wording, how you ask the question. 

Dave Bittner: Right. 

Ben Yelin: Like, I just don't think people have very well-formed opinions on the use of AI-infused decision making. So just a note of caution there. I think the point about earning the public's trust is still incredibly important, no matter what the actual numbers are. If the public starts to feel that this tool is causing arbitrary arrests and particularly that it's racially biased, then there's going to be a public backlash. There's going to be a lack of trust. It could further break down relationships between the public and law enforcement, which in this country are obviously quite strained. So yeah, I mean, I think that really is incumbent upon both the law enforcement agencies that decide to use this technology and the policymakers that are allowing it to be used and, in some cases, providing the funding that allows AI-infused decision making to be used. It can be a very effective tool. I think that's the point this article made. Algorithms, in their view, could even help identify our own biases, which would be extremely useful. But the fact that we're not there yet, and they are still showing these levels of racial bias, means that, I think, we really need to slow down and come up with governance bodies, both in our country and in the United Kingdom, that really evaluate how this is being used. You know, have independent review boards attest to particular algorithms to make sure that they, you know, are not discriminating against particular groups, and really build that public trust. And I think there's going to be a lot of cases that are thrown out because the evidence was obtained using AI-infused decision making. And we now have articles like this and many others that we've discussed that cast doubt on the accuracy of that method. 
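
[One concrete form the algorithm review Ben describes can take: computing a group-level disparity metric over a system's decisions. The sketch below applies the widely used "four-fifths rule" to a disparate-impact ratio; the column names, toy data and 0.8 threshold are illustrative assumptions, not anything from the CDEI report.]

```python
# Sketch of a simple bias check an independent review board might run:
# compare favorable-outcome rates across demographic groups.
import pandas as pd

def disparate_impact_ratio(df: pd.DataFrame, group_col: str, outcome_col: str,
                           privileged: str, unprivileged: str) -> float:
    rate_priv = df.loc[df[group_col] == privileged, outcome_col].mean()
    rate_unpriv = df.loc[df[group_col] == unprivileged, outcome_col].mean()
    return rate_unpriv / rate_priv

decisions = pd.DataFrame({
    "group":   ["A", "A", "A", "B", "B", "B"],
    "flagged": [0,   0,   1,   1,   1,   0],   # 1 = flagged by the algorithm
})
# Here the "favorable" outcome is NOT being flagged.
decisions["favorable"] = 1 - decisions["flagged"]
ratio = disparate_impact_ratio(decisions, "group", "favorable", "A", "B")
# A ratio below 0.8 suggests adverse impact under the four-fifths rule.
print(f"disparate impact ratio: {ratio:.2f}")
```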

Dave Bittner: Yeah. 

Ben Yelin: So it could end up hurting, you know, law enforcement maybe more than it ends up helping law enforcement in the short run, at least. 

Dave Bittner: Well, and I'll get back on my soapbox here and say that... 

Ben Yelin: Get on that soapbox. 

Dave Bittner: (Laughter) ...Say that more and more, I wonder if we need some equivalent to the FDA for algorithms, for social media, for these sorts of things. In other words, before you're allowed to deploy something like this, just like before you're allowed to deploy a prescription medicine, it has to be tested. It has to be vetted. You have to prove to the government, and through the government to the general public, that, first, this will do no harm. 

Ben Yelin: It seems like you're kind of angling to be appointed to this position. 

Dave Bittner: (Laughter) Oh, yeah. That's what they need - is me (laughter). 

Ben Yelin: Yup, they need you. I might have to run this by the president-elect. 

Dave Bittner: Right. Yeah. Yeah. Well, you know... 

Ben Yelin: You're qualified. 

Dave Bittner: (Laughter) If asked, I will gladly serve, but I don't think that'll happen. 

Ben Yelin: Yeah. 

Dave Bittner: (Laughter). Is there anything to that notion? 

Ben Yelin: In terms of whether it's realistic, I don't know. I think that's something that perhaps more broadly would happen at the state and local level. In terms of as an idea, I mean, you've mentioned this sort of FDA-style agency in a number of different contexts. 

Dave Bittner: Yeah. 

Ben Yelin: You mentioned it in the context of social media. I think it's less likely there than it is here because this is something where, you know, you do have some extent of control. It's not like the federal government has that much agency when it comes to state and local police departments, but they do have some because these departments have to follow the U.S. Constitution. 

Dave Bittner: Right. 

Ben Yelin: And, you know, it's just a more limited universe of people who would be using this technology as opposed to social media, which is really hard to regulate. I think that's - this idea in the context of AI is actually more promising. And so I think that's something that would be a really interesting policy goal. I'm always skeptical that the federal government is going to do the right thing. You've probably gleaned that from my commentary. So... 

Dave Bittner: (Laughter) Yes. Yes. 

Ben Yelin: ...I don't - you know, I don't see that as something that's going to be adopted quickly. I think this would be the perfect type of situation where you had an independent review board, people who are experts in technology and then people who are experts in social science, people who have studied racial bias in other contexts. Get on a board and try to review these algorithms. Open up the black box and see what's inside. I think that actually might be the most sustainable solution here. 

Dave Bittner: Yeah. 

Ben Yelin: So for once, I'm supporting you on your soapbox. 

Dave Bittner: (Laughter) That's very diplomatic of you, Ben. I appreciate it. 

Ben Yelin: Absolutely. 

Dave Bittner: We'll be sure to have you back next week on the show. So... 

Ben Yelin: Yes. 

Dave Bittner: (Laughter). All right, well, those are our stories this week. We would love to hear from you. If you have a question for Ben or for me, you can call in and leave a message. It's 410-618-3720. That's our number. You can also send us email. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Nina Jankowicz. She's the disinformation fellow at the Wilson Center. And she has a recently published book. It's titled "How to Lose the Information War: Russia, Fake News and the Future of Conflict." Here's my conversation with Nina Jankowicz. 

Nina Jankowicz: I was living in Ukraine in 2016 and 2017, when the U.S. election was happening and all the revelations about Russian interference in the election came to light. And I was working as a strategic communications advisor to the Foreign Ministry of Ukraine under the auspices of a Fulbright Fellowship. And being there basically on the front lines of the information war - you know, Ukraine has been dealing with this stuff in a more concentrated way since 2013 and 2014, when the Euromaidan Revolution began, and Russia illegally annexed the Crimean Peninsula and invaded eastern Ukraine, the Donbass. So they're very familiar with these tactics, as are a lot of other central and Eastern European nations and the Baltic states, places like Poland, the Czech Republic. And I just felt that, you know, watching the U.S. response, which was really characterized by a lot of hubris - you know, it was a lot of, how could this have happened to us? - when things like this had been happening in central and Eastern Europe for the past 10 to 15 years. I really felt that there was a lot that we, the United States, could learn from our allies in central and Eastern Europe. And that's what the book looks at: five different central and Eastern European countries - Estonia, the Republic of Georgia, Ukraine, the Czech Republic and Poland - and how they responded to the threat of Russian disinformation and, increasingly, to the threat of domestic disinformation, as well. 

Dave Bittner: Well, take us through what you've outlined here. I mean, what were some of the key ways that these nations dealt with this issue of Russian information operations? 

Nina Jankowicz: So one of the most important things is that they all recognize that it's a problem, which I don't think that we can say for the United States, frankly. I mean, I did a hearing a couple weeks ago for the House Intelligence Committee. And only the Democrats showed up. It was a hearing on disinformation and conspiracy theories ahead of the election. And the Republicans just did not deign to make an appearance. And that's very saddening to me because I've briefed Republicans on the Hill before. They care about these issues, but it has become so politicized to even talk about disinformation, particularly in the context of Russia. And that leaves us vulnerable, frankly. Over the past four years, we've done very little to raise the costs for actors like Russia who are using disinformation to achieve their policy goals, to affect and influence our political conversations. And the fact that we're allowing it to be politicized and not even addressing the lowest-hanging fruit in terms of dealing with the problem, like transparency around political ads and mandating that through Congress, just shows how difficult this problem is to solve when you don't recognize that it's a problem. So that's the first thing. All the countries do that. 

Nina Jankowicz: They also invest in people. They recognize that we can't solve this problem just by deleting offending accounts. We have to invest in healing the fissures in our society that make disinformation so palatable, so appealing to many parts of the population in the first place. And they do that through things like media literacy and digital literacy, investing in public media journalism as a public good - which, you know, when we look at our media landscape here in the States, there are very few outlets that are, you know, incredibly highly trusted, particularly in the for-profit media space. There's a little bit better trust in public media, but it's severely underfunded - we spend something like $1.35 per person per year in America on the Corporation for Public Broadcasting, which pales in comparison, of course, to many other developed nations, which are more resilient to disinformation. So those things are really important. And then finally, the third lesson is that, you know, there's homegrown disinformation. And frankly, our foreign adversaries use those homegrown vectors in order to launder their narratives into the legitimate American information ecosystem. And so to really crack down on foreign disinformation, we have to recognize and disincentivize the fact that it's being used domestically by fringe political actors as well. 

Dave Bittner: It strikes me that - I think for a lot of us who have been watching these sorts of things unfold, you know, for those of us who consider ourselves to be taking an evidence-based approach to life and to politics and these sorts of things, I mean, I think personally, like, I find myself hearing some of these things that make their way through the media, you know, things about - well, the famous example of the, you know - the pizza shop and the pedophiles in the basement of a pizza shop. And I'm left scratching my head and thinking to myself, well, how could anyone believe that? And yet, time after time, we see these things that they do gain traction, and they get amplified, and they make their way through certain communities like a forest fire. 

Nina Jankowicz: Yeah. Well, and we're seeing that especially during the coronavirus pandemic. People are fearful. People are emotional. There's a lot of uncertainty. And that is a perfect environment for disinformation to thrive. All of those things contribute to the emotional manipulation or the ease of emotional manipulation that disinformation uses as a hallmark. And I think that's one of the big misconceptions about how the entire phenomenon works, right? People think it's something that can be easily spotted - a troll account with an egg on Twitter and a bunch of, you know, letters and numbers as its handle or a bad Photoshop job or a clearly fake news website. 

Nina Jankowicz: Actually, disinformation is rarely cut-and-dried fakes like that. It is more often than not playing on real grievances in society and our emotions. And that's why things like the QAnon conspiracy theory or Pizzagate, which is a subsidiary of that - or perhaps the seed that started QAnon - have gained so much traction: because people are trying to make sense of this crazy world around them. And frankly, those narratives that are emotional, that appeal to their base instincts, are more appealing than the actual, fairly boring, nuanced explanation of how the world is working right now. 

Nina Jankowicz: And again, we're seeing more of these conspiracy theories being latched onto during the pandemic. And that's partly because of the incentive structures on social media that allow this to happen. The most engaging content is often the most enraging content, and the social media platforms don't really have an incentive to change that. Right? That's what gives them the information that keeps eyeballs on ads and funds their business model. 

Nina Jankowicz: And so I think we really need somebody to step in and change those incentive structures around, No. 1, but also look out for users - not necessarily by, you know, actually practicing content moderation; I don't want the government involved in that. But I do want the government involved in, you know, seeing if the platforms are doing their due diligence, not just taking them at their word that they're taking down foreign influence operations or content that has an effect on public safety, terrorist content. You know, we've just seen an organized online attempt to kidnap the governor of Michigan taken down. That sort of stuff is all thriving on social media right now. And that's why I think we need a little bit more oversight and transparency from an expert, nonpartisan third party. And that's the ideal function for government. 

Dave Bittner: Yeah, I can't help wondering, do we need some sort of, you know, online version of the EPA, you know? I'm thinking of, you know - if a factory is polluting a local river, well, the - it's going to come to a point, we hope, where the government will come in and say, we tested the water here, and you're above the legal levels. Knock it off. 

Nina Jankowicz: Yeah. 

Dave Bittner: And we don't seem to have that when it comes to these online social media companies. 

Nina Jankowicz: Yeah, you're exactly right. And in fact, the governance of these platforms has been kind of fragmented and left, in certain cases, to, you know, the Federal Communications Commission when it's dealing with things that are broadly in the FCC's remit. We've seen the FTC deal with antitrust issues. We've seen the Federal Election Commission deal with election-related issues. But nobody has fully taken the reins. And this set of issues is quite complex. It touches a lot of different things. It touches privacy law. It touches First Amendment rights. It touches national security. And I think you need people who are really expert in the way that the internet operates to understand and make those rules and provide that oversight. 

Nina Jankowicz: We don't even have a congressional tech office. There used to be an Office of Technology Assessment in Congress that was really a great font of knowledge for members and their staffs and that worked in a similar way to the Congressional Research Service. But that was done away with in the '90s. And I would really love to see that return so that we don't have any repeat hearings like we've seen whenever the tech executives are up on the Hill and the representatives are just asking questions that are obviously, you know, not grounded in an understanding of how these platforms work. 

Dave Bittner: Right, right. 

Nina Jankowicz: And also, it would allow them to kind of stay ahead of the trends in terms of technology. Tech moves fast. I'm sure I don't need to tell this audience that. And if we had real experts who were informing our policymakers, I think rather than being behind the curve, we could at least approach the curve. I'm not sure if we can ever get ahead of the curve (laughter). But at least we'd be closer to it. 

Dave Bittner: You know, one of the things that you point out in the book, which is fascinating, is these nations that you describe here - their efforts against Russia - they all failed. And I'm reminded of that line from the movie "The Princess Bride." You know, you've fallen for one of history's great blunders, which is never get involved in a land war in Asia. 

Nina Jankowicz: (Laughter). 

Dave Bittner: I mean, is it one of history's great blunders to get involved in an information operation war against Russia? 

Nina Jankowicz: Well, I think we've let Russia get the upper hand to some extent because we refused to acknowledge the signs that this was becoming a reality of today's information environment. It is what a co-author and I have called enduring information vigilance and perpetual information competition, when we were writing a paper for a military audience - which probably isn't surprising, given those terms. 

Dave Bittner: (Laughter). 

Nina Jankowicz: But we need to understand that this is going to be constant now. And we need to set up our government, our civil society, our media to respond to it. And I'm not saying we need to get in, you know, hand-to-hand combat or the equivalent of that online. But we do need to shore up our defenses in terms of building up societal responses, people's reflexes to the types of, you know, dirty tricks we're seeing online. I think so many individuals are just not aware of the many ways that you can be duped. Not only - you know, we've just seen, recently, a DOJ indictment of Russian hackers who were responsible for hacking operations and, in some cases, hack-and-leak operations on several different continents - in the U.K., in the Republic of Georgia, in Ukraine and here in the United States, as well as the Olympics in Pyeongchang, South Korea. 

Nina Jankowicz: Clearly, this is part of Russia's toolkit now, and it's not just in the cyber domain. It affects the information domain as well. So we really need to think about building those more resilient societies, understanding - or allowing people to understand that this is a reality and they need to protect themselves, not only in the cyber domain - making sure that, you know, their own digital security and IT hygiene is up to snuff - using a password manager, two-factor, et cetera, et cetera, understanding how phishing works - but also understanding the ways that they can be targeted with information, which is not something that I think most people recognize. 
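
[Since two-factor authentication comes up here as basic digital hygiene, a minimal sketch of how time-based one-time codes (TOTP, RFC 6238) are generated - the scheme most authenticator apps implement. The secret is a made-up example value; a real server holds the same shared secret and compares codes.]

```python
# Minimal TOTP (RFC 6238) sketch: derive a 6-digit code from a shared
# secret and the current 30-second time window.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // step)  # time-based counter
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret value only
```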

Nina Jankowicz: You know, we've all developed the reflexes where we know to hang up on somebody who's asking for our Social Security number in a phone scam or to delete an email from someone who claims to be a Nigerian prince and is going to give us a million dollars. But we've not developed those same reflexes for the social media environment. And that's one thing that I think, you know, we can start investing more in. It's going to be a generational response. But the countries that are, you know, making some headway in fighting Russia's information war in my book are all countries that have really started investing in those human responses. 

Dave Bittner: What sorts of price do you think we could make Russia start to pay? If we find ourselves with an administration who decides to come after this aggressively, what sort of tools do they have in their toolbox? 

Nina Jankowicz: I am not of the opinion that any sort of red line that we draw or price that we try to make a bad actor pay is going to act as enough of a deterrent. We've seen those sorts of red lines drawn in the cybersphere with relation to Russia, North Korea, China, Iran even. And I don't think that they have had too much of a deterrent effect because - and this is even more true in the informational sphere - the return on investment for a successful operation is so high compared to the cost that is sunk into these sorts of operations. And with information operations, you know, you really just need the human investment. You might need to pay for a few ads, but that's becoming less and less useful for these groups. It's just about the manpower. 

Nina Jankowicz: And we know, for instance, that the Internet Research Agency spends about $12 million a year. That's not very much when you consider the outsize effect that the amplification of the materials that they put out there in the 2016 election had and certainly the reflective or reflexive image-building that it's done for the Kremlin as well, kind of portraying the Kremlin as this all-powerful information superstar when, in reality, their strategy is a bit more like throwing spaghetti at the wall. 

Nina Jankowicz: At any rate, I think we could have more powerful sanctions against Russia for what it's doing. We could also see, you know, a future president telling Putin, in no uncertain terms, that this is a problem and that we're not going to tolerate it, that we're going to continue calling it out in international circles and not really just take this lying down. What we've seen instead from the Trump administration is kind of a bifurcated policy where on paper, you know, it looks good. We've sanctioned the right people. We've called them out. But in person, instead, we've seen, you know, President Trump joking with President Putin about fake news. He says, you know, don't you meddle in the election? That's a quote that our president... 

Dave Bittner: Right. 

Nina Jankowicz: ...Said to Putin not long ago. So that's a very mixed message and certainly not something that sends the right, you know, signal to Russia that we are serious about this noninterference. So hopefully, with a change of administration or perhaps a change of heart in the Trump administration, we could see a stronger signal sent. But as it is now, that signal is absolutely muddled. 

Dave Bittner: Are you optimistic? Do you think we have a chance at getting control over this to the point where it's, you know, not the issue that it is today? 

Nina Jankowicz: It would be hard for me to get out of bed in the morning if I didn't think we could do something about it. I do think that, you know, there are a lot of things that we haven't even entertained yet. Over the past four years, we really have not seen a good-faith effort by the U.S. government to tackle this problem. We have seen parts of the U.S. government dealing with it - in particular, you know, the folks at the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency have done some really valiant work, but they're a small team, and they're underfunded. There are other similar teams across the government. If we had a united strategy that was bringing together the best brains in, you know, Russia policy, cyber policy, strategic communications in a node in the federal government, I'd feel a lot better. But as it is right now, we don't have that sort of joined-up policy. That's a problem. The politicization of this issue, as I mentioned before, remains an impediment to creating policy at the congressional level. And we've not seen, really, any sort of consensus building in the cross-sector environment - so either between public-private partnership with the social media platforms or bringing in civil society organizations, as well, who are looking out for things like rights to free speech and human rights online. I think there are so many smart people who are working on these issues in the United States that, yes, we can absolutely make a dent. But the reality is that we have been tardy. And our responses have been, in the international realm, tertiary to a lot of what our allies are doing. We are absolutely falling behind and, in some cases, abdicating our responsibility to the rest of the world, as the place that hosts these platforms where so much disinformation spreads, to do something about this. So I think the clock is ticking. And hopefully, we don't tarry too much longer because this is an issue that is getting more concerning by the day. 

Dave Bittner: All right, Ben, interesting conversation, huh? 

Ben Yelin: Yeah, I loved it. She's a great guest on our podcast and I think one of the best interviews we've done, just a very interesting analysis. Obviously, she's one of the top experts in her field. So I really enjoyed hearing her perspective. One thing that stuck out to me in what she said is, generally, as a society, we're pretty good at identifying scam phone calls, you know, people who call us and try and ask for money. We're decently good at the most obvious phishing emails, like the Nigerian prince scenario. But we're just not at this point adept in dealing with social media disinformation. And as she said, this is going to be a generational problem. And to be frank, it's more a problem with people who are more advanced in age. And it is something that we really are going to have to confront because it's having a very detrimental effect on our society. And it's not going to be an easy process because a lot of the same people who trust disinformation on social media sites distrust information, perhaps sometimes with good reason, from the mainstream media. So if you have somebody like Nina coming in and saying you can't trust what, you know, you read on that meme - out-of-focus meme you read on Facebook, that sometimes hardens people into trusting that type of information even more. So yeah, these are the types of things that I stay up late worrying about because I just think this is going to be a really difficult problem to solve. But I'm glad we have experts like Nina who are providing us the information that we need to start to address this problem. 

Dave Bittner: Yeah, yeah. I really enjoyed that conversation with her. And we appreciate her taking the time with us. Again, the book is titled "How to Lose the Information War: Russia, Fake News, and the Future of Conflict." A really good read. I highly recommend it. So again, thanks to Nina Jankowicz for joining us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the start-up studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.