Caveat 9.1.21
Ep 92 | 9.1.21

Taking up the challenge of cyber citizenship.

Transcript

Lisa Guernsey: There were some really big issues to grapple with when it comes to the way students - today's youth, but also adults - are taught how to see and verify what's coming across their screens online.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben shares the story of people in Afghanistan scrubbing their online presence. I've got the story of U.S. government agencies' plans to increase the use of facial recognition technology. And later in the show, my conversation with Peter Singer and Lisa Guernsey on New America's "Teaching Cyber Citizenship: Bridging Education and National Security to Build Resilience to New Online Threats." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump into our stories this week. Why don't you start things off for us? 

Ben Yelin: My story comes from WIRED. It is written by Chris Stokel-Walker. And frankly, it's kind of depressing. But it is an angle into what's going on in Afghanistan, an angle that's relevant to what we talk about on this show. The title of the article is "Afghans are racing to erase their online lives." And the hook for this article is about a translator who has been assisting the United States and allied countries in Kandahar, a city in the southern part of Afghanistan. And he's fearing for his life. As of a couple of weeks ago, the Taliban has taken over the country. The U.S. is evacuating through the Kabul airport. And now people like this individual identified here are under a significant threat because, as we know from news reports, the Taliban is going door to door and frankly could be punishing people who had cooperated with the previous regime and also, you know, people who lived lives that go against the Taliban's core tenets. And one of the key pieces of evidence that the Taliban could use is gleaned from social media profiles and other online presences. 

Ben Yelin: So what this article talks about is how this individual and many others are figuring out ways to scrub their online history - going through all of their social media accounts, cutting out pictures that show them, you know, potentially doing things that would not be looked upon favorably by the Taliban or Taliban militia members, scrubbing any associations they have with non-governmental organizations that are disfavored by the Taliban and anything else that makes these individuals a potential target. So the United States, through USAID, our humanitarian arm, has been sending emails to our partners and allies in Afghanistan, actually giving them instructions on how to scrub their online profile. And this has been, you know, translated into the local language there so that, you know, people who are non-English speakers can understand it as well. 

Dave Bittner: Yeah. 

Ben Yelin: I think it's just one of the saddest pieces of evidence we've had of how your social media profile - even if you've tried to scrub your presence to some degree, it has a way of coming back to haunt you. And here it's coming back to haunt people in some very depressing ways... 

Dave Bittner: Yeah. 

Ben Yelin: ...Especially now that the country has been taken over again by the Taliban. 

Dave Bittner: You know, I think folks of my generation - and I'm a little bit older than you (laughter). 

Ben Yelin: Just a little bit. 

Dave Bittner: By a little, I mean a lot. But, you know, we sort of cavalierly joke about how, oh, I'm glad we didn't have, you know, iPhones when we were kids because of all the things we did, and they'd go on our permanent record. And, you know, it's easy to joke about that sort of thing. But here we have an example of where there are real, potential, horrific consequences just from leading your normal everyday life under a certain set of circumstances. And then those circumstances change, and you're - you could be at risk. 

Ben Yelin: Yeah. I mean, it's so heart-wrenching because, you know, for all of the follies of the United States presence in Afghanistan and everything that's happened over the past 20 years, I mean, it has been 20 years since the totalitarian Taliban regime was last in charge, and individuals there enjoyed a level of freedom and freedom of association, for example, with Western entities. And now all of those associations are becoming a significant danger. And we know - you know, this is not just theoretical. This article talks about how the Taliban has used Facebook data to identify individuals who have longstanding relationships with the U.S. military or non-governmental organizations. This is something that's not just done by the Taliban in Afghanistan, but it's a tool employed by some of our other adversaries, like ISIS in Iraq, where, you know, ISIS leaders have gone onto Facebook and tried to suss out contacts of individuals believed to be major political opponents. 

Dave Bittner: Mm hmm. 

Ben Yelin: They've also warned about facial recognition technology and biometrics. And that's something that's really terrifying because, you know, it's one thing to scrub your own online presence. But when we're talking about biometrics and facial recognition, if the Taliban are able to deploy these tools and recognize you from maybe a political protest or, you know, a previous association with a sworn political enemy... 

Dave Bittner: Right. 

Ben Yelin: ...That would kind of prevent you from being able to leave your house in the first place. 

Dave Bittner: Yeah. 

Ben Yelin: Even if you were, you know, willing to subjugate your previous associations, when we're talking about facial recognition and biometrics, it lives on. You can't change your fingerprints, and you can't change what your face looks like. And once they have that data, it can be deployed in ways that are, frankly, pretty dangerous. 

Dave Bittner: Yeah. 

Ben Yelin: So it's really - it's a disturbing story. It's, you know, obviously not the most important angle to what's going on in Afghanistan, but I think it's a window into some of the extremely heart-wrenching choices that people are having to make, people who've lived under, as you said, certain circumstances for the past 20 years. And suddenly, on a Sunday afternoon, the Taliban takes over Kabul, the Afghan president leaves the country on a plane, and suddenly their lives have changed. 

Dave Bittner: Right, right. And there are rumors that the Taliban has accessed biometrics databases that the U.S. forces had as they made their way through. I think this transition happened so quickly that, you know, there wasn't time for cleaning things out. You know, you hear the stories about the embassies just frantically burning things... 

Ben Yelin: Right. 

Dave Bittner: ...To not have it fall into the Taliban's hands. We hear stories of folks desperately trying to get to the airport, but their mobile devices are being inspected. 

Ben Yelin: Right, right. Yeah. I mean, I think all of this emanates from how quickly this happened. You know, our own intelligence agencies didn't believe that the Taliban could take over Kabul in this period of time, and probably most of the people who are living in Kabul or, frankly, other parts of Afghanistan thought that, you know, this was going to be gradual. I think there was a general acceptance that the Taliban was winning. They had gained footholds in, you know, a lot of different areas of Afghanistan. And I think it was seen as an inevitability that eventually they would take over the country. But I don't think people thought it was going to happen this quickly. So there wasn't sufficient time or preparation to scrub your online presence, to zero out your social media accounts. You didn't have that sufficient warning to give you an impetus to take that action. And one of the most tragic aspects of it is that all of this has to happen extremely quickly, especially when you know that the Taliban are going door to door in some circumstances. We're hearing all of these horror stories in the news reports. 

Dave Bittner: Yeah. I mean, I don't mean to sound, you know, breathless in response to this, but I suppose it is a cautionary tale. It's an extreme cautionary tale as to thinking about what you share online. And it's - again, it's very easy for all of us to sort of cavalierly say, well, this could never happen to us here. But I'll bet a lot of people in Afghanistan had the same thoughts. 

Ben Yelin: Yeah, I think that's absolutely true. I mean, I don't think we're (laughter) in a situation where we're going to be taken over by a totalitarian regime in the near future. I certainly hope not. 

Dave Bittner: No (laughter). No (laughter). 

Ben Yelin: But it does make you kind of stop and think about, what is it in your vast online footprint that you've accumulated for many years that could be used against you in any future circumstance, even if it's not something as extreme as what we're talking about here? 

Dave Bittner: Right. 

Ben Yelin: But if it's an embarrassing association, you know, something that you've said in the past on social media that's going to reflect poorly on you going forward - this is the most extreme example of how your online footprint is forever, but it's not the only example. 

Dave Bittner: Yeah. 

Ben Yelin: And so I think there is a broader lesson here, even beyond these rather tragic circumstances. 

Dave Bittner: Yeah. All right. Well, we will have a link to that story from the folks at WIRED. 

Dave Bittner: My story this week also touches on facial recognition. This is from the folks over at MIT Technology Review, this article written by Tate Ryan-Mosley. And this is covering a 90-page report that the U.S. Government Accountability Office, the GAO, put out. And this was in response to a request from Congress to sort of survey federal agencies' use of facial recognition during fiscal year 2020. 

Ben Yelin: Spoiler alert - they're using it. 

Dave Bittner: (Laughter) It turns out they're using a lot of it (laughter). 

Ben Yelin: Mm hmm. 

Dave Bittner: So, again, a 90-page report - thanks to the folks at Technology Review for doing the reading for us and summarizing what the report said. But 18 of 24 federal agencies surveyed currently use some form of facial recognition. Departments of Agriculture, Commerce, Defense, Homeland Security, HHS, Interior, Justice, State, Treasury, Veterans Affairs - they all plan to expand their use of facial recognition. It was also interesting that two of the agencies surveyed are using Clearview AI... 

Ben Yelin: Yep. 

Dave Bittner: ...Who we've talked about many times here, the controversial company that vacuumed up all sorts of information here. 

Ben Yelin: Your favorite data-scrapers, yep. 

Dave Bittner: (Laughter) That's right. Both the Air Force and the Fish and Wildlife Service are working on projects with Clearview AI. I guess one of the things that I wonder about in this survey is there is facial recognition and then there's facial recognition, right? So there's facial recognition for letting me breeze through the lobby of the HHS headquarters to head to my office - right? - without having to show my ID. 

Ben Yelin: Oh, that's Dave Bittner. Yeah. You can come in. 

Dave Bittner: Right. Off you go. Have a good day, sir. But then there's facial recognition for law enforcement or, you know, stopping me at the border or all those sorts of things. And I think this survey deals with this in a very sort of broad way. Not a whole lot of specifics as to who's doing what in this report. I haven't read the 90-page report. But do you think that it's fair to say that this deserves a little more nuance than just, you know, X number of agencies are using this? 

Ben Yelin: I do, although I think what's notable is that it's X number of agencies, and those agencies aren't just law enforcement or national security-related agencies. Like, I think we'd expect the use of facial recognition by the FBI or the Department of Homeland Security - well, maybe even the Department of the Treasury in some circumstances, if we're talking about, you know, international finance and you're doing an investigation into whether money has illegally changed hands between parties, maybe you deploy facial recognition software. But would I have anticipated it for the Fish and Wildlife Service and the Department of Agriculture? 

Dave Bittner: (Laughter) Right. 

Ben Yelin: That just seems pretty far afield from what we'd expect from facial recognition technology. 

Dave Bittner: Yeah. 

Ben Yelin: And I think what that illustrates is that it's just a very commonly used tool in a way that I think not many people yet appreciate. 

Dave Bittner: Yeah. 

Ben Yelin: And where it can have an impact is the fact that there's data sharing going on between these agencies. So this article talks about how the Department of Homeland Security has a vast information network. It contains its own mechanism to request third party facial recognition searches through a database they keep of state and local entities. So, you know, when DHS has that and they have the capability of doing information sharing with the FBI and other federal agencies, then you know that this is beyond just, you know, one agency deploying facial recognition for a limited law enforcement purpose. You know that it's something that's far more pervasive than that. 

Ben Yelin: The perhaps disturbing part of this story is we know that facial recognition software, while it can be useful in law enforcement investigations, has significant problems. And we've talked about it a million times - algorithms that haven't been well tailored to root out racial bias. And, you know, this is something that's been well documented. It's been researched. The Department of Justice itself, as they mentioned in this article, did its own study. They studied the relationship between skin tone and false match rates in facial recognition algorithms. And guess what? There's a pretty close correlation. 

Dave Bittner: Right. Wait for it (laughter). 

Ben Yelin: Yeah. So, you know, I think that what's particularly illustrative about this report is that it's being used very broadly across agencies. It's being used as a tool to assist in agency coordination. And it's, you know, to put it in pandemic terms, we now have uncontrolled spread of facial recognition technology. It's out there. 

Dave Bittner: And I think it's important to also realize that it's really not that exotic anymore. You know, if you get yourself a home security system, you can check a box and include all sorts of recognition options. You can have license plate readers. You can have cameras that automatically recognize makes and models of cars. It's all out there. 

Ben Yelin: Tell me when the good-looking Amazon delivery driver comes by, right? Yeah. 

Dave Bittner: Right (laughter) exactly. Exactly. I need my box full of batteries. So this report quotes someone from the Electronic Frontier Foundation. And, of course, they're... 

Ben Yelin: Not thrilled. 

Dave Bittner: Their dander is up about this. They say, "This important GAO report exposes the federal government's growing reliance on facial surveillance technology. Most disturbing is its use by law enforcement agencies. Yet face surveillance is so invasive of privacy, so discriminatory against people of color and so likely to trigger false arrests that the government should not be using face surveillance at all," end quote. And the article also points out that there's no federal regulation on law enforcement's use of facial recognition technology, although there are some folks who are working on that. And they say that many states and cities do ban law enforcement and government use of the software, but local bans don't prevent use by the feds. 

Ben Yelin: Right. Obviously, federal power would preempt those local powers. We don't have federal legislation. As with all areas where we're relying on federal legislation, you know, including things like general data privacy laws, we might have to wait a while. I mean, the federal government moves very slowly on these things. Meanwhile, the agencies themselves feel unencumbered in using this technology. So there isn't really right now any sort of statutory check on these agencies deploying these tools. There's really not much that the courts can do because Congress hasn't passed a law. 

Ben Yelin: You know, obviously, how facial recognition is used in individual cases - that can have, you know, interesting implications in court. If, you know, it's used in a way that really violates somebody's reasonable expectation of privacy, then you might have a Fourth Amendment issue. In most cases, it probably doesn't violate somebody's reasonable expectation of privacy because either we voluntarily submitted our facial recognition data to some entity or it's picked us up while we're in public settings where we have this diminished expectation of privacy. So we don't have much opportunity for judicial relief, either. So in some ways, we're sort of out of luck at this point. 

Dave Bittner: Yeah. 

Ben Yelin: And, you know, the GAO report is a way for us to get out of our state of denial about how frequently and how widely this technology is being deployed. And that's our opportunity - you know, if this is something that you care about - to lobby your members of Congress to pass some sort of federal regulation. 

Dave Bittner: Yeah. All right. Well, we'll have a link to that story from MIT Technology Review. That'll be in the show notes. We would love to hear from you. If you have a question for us, you can send us an email to caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Peter W. Singer and Lisa Guernsey. They are from the organization New America, and they have an initiative that they have started up here. It's called "Teaching Cyber Citizenship: Bridging Education and National Security to Build Resilience to New Online Threats." Here's my conversation with Peter Singer and Lisa Guernsey. 

Lisa Guernsey: Yes. Well, this really was a meeting-of-the-minds report in many ways. The subtitle is "Bridging National Security and Education." And that's because my colleague and co-author on this report, Peter Singer - he's in the national security world. I'm in the education world. And the two of us were both recognizing at around the same time that there were some really big issues to grapple with when it comes to the way students - today's youth, but also adults - are taught how to see and verify what's coming across their screens online and the social media platforms that they're experiencing, and that this has real repercussions for national security, but it also has a lot of repercussions for what we're teaching students in school and how teachers are equipped to do that kind of teaching. 

Lisa Guernsey: So that prompted us to pull this together. We are working with Cyber Florida, which is a partner on this project with us. And our co-authors, Nate Fisk and Jimmeka Anderson, had a big role in this as well. But we wanted to start pulling these threads together on the national security side and the education side. 

Dave Bittner: Peter, can you give us a little sense of the lay of the land here? I mean, as you all are putting this report together, where do we find ourselves? 

Peter Singer: So we obviously face major, major challenges when it comes to information threats. There's the traditional cyber threat, so to speak - hacking the networks. But we also have what I've called in the past LikeWar - hacking the people on the networks. It's the threat of misinformation, deliberate disinformation, conspiracy theory, hate speech - how that all comes together to damage our democracy, how it threatens public health, how it threatens individuals, how it expresses itself in extremism and terrorism, how it's also, though, just challenging to youth. 

Peter Singer: If, you know, they're trying to figure out, I've got an assignment to do a school report on who built the pyramids, and where do they go? They don't go to the World Book on a shelf. They now go online. They go on YouTube. And, you know, within a couple of hops, they're being told that the aliens built the pyramids. And they didn't, for our listeners. 

(LAUGHTER) 

Peter Singer: And so what do we do about this? And, obviously, there's been a lot of discussion and work on that. In fact, there have been over 450 different university, think tank and task force projects on this issue of information disorder. But they tend to focus on two types of remedies - either legal code change or software code change. The legal code change is the, you know, break up the companies - they're monopolies. Or change the section - you know, I don't like what they're doing about free speech. It's basically that approach. 

Peter Singer: The second approach is software code change. It's the companies themselves. Change your algorithms. Change how you're steering people to information, which often means we want you to change your business model, which they're less likely to change. Or it's change what you're regulating or not, all the way up to deplatforming people. Both of those - which, again, get the focus of these more than 400 approaches - are great. They're important. I've been part of that. But they're not enough. And what we need to remember is the role of the individual, the target of all of this. And that's what this program has really been about. It's been about, how do we equip the individuals who are targeted by information threats to navigate this world? A lot of other nations do this pretty well - the Estonias, the Finlands of the world. The U.S. - we don't. And there's a variety of reasons for that. 

Peter Singer: But that's what this project, the Cyber Citizenship Project, is all about - is how do we bring together the different parties and agencies and organizations that are interested in this? And as Lisa put it, you know, there's just a massive amount of coalition-building to be done. You know, we linked everything from members of the NSA to high school social studies teachers. You wouldn't think they'd have a lot in common, but they're all dealing with the same threat right now. The other part is this research project - trying to get a scoping, a lay of the land. What are the threats? Where do we stand? And then finally, can we bring together the different tools to deal with this, the different curricula, the different teaching tools? And how do we provide them to the teachers themselves? 

Dave Bittner: Lisa, can we do a little defining of terms here? I mean, what does the term cyber citizenship embrace? How broad a spectrum of things are we covering here? 

Lisa Guernsey: Yeah, we see cyber citizenship at the intersection of three fields that are really starting to come together. One is media literacy, which involves everything from algorithmic literacy to just understanding authorship and who created what and why. But then the second field is civics and citizenship and increasingly digital citizenship. What does it mean to be a responsible participant in today's society? How do we do that online? And then the third field is cybersecurity and cybersecurity awareness. And these - the threats that Peter's just noted and that I know your audience knows so well - they involve everything from, of course, privacy and security and encryption, but increasingly are also about various individual actors online trying to funnel people into places where they might be seeing more and more disinformation, conspiracy theories. And so how do we understand that threat? 

Lisa Guernsey: So at the intersection of those three fields, that's where we see cyber citizenship. And it's that ability to have the resilience to understand and to fend off disinformation, misinformation and also increasingly malinformation, where it may be information that is, in fact, true, but was put out there to harm, to harm others. So it's starting to really understand that full landscape. And that's what we define as cyber citizenship. 

Dave Bittner: One of the things that I have to say was very pleasing to me is how much this report is based on research. And Lisa, can you go through and sort of explain to us what the research says about what works and what doesn't work when you're trying to help these kids acquire these skills? 

Lisa Guernsey: Yeah, so we certainly have a good base of research. There's, of course, also a lot still to do. So I'm going to put that out there from the outset. But over the years, there have been studies to look at what it is that really works with different students of different ages, helping them build awareness of their media environments, for example, helping them gather information about where the sourcing of the material is. One thing that we've noted in this work is that this can't be just about fact-checking. There have been some - OK, let's just tell students, here's where you get your information, and we'll put them on this very narrow path. And they're going to only look at this information and that alone. Well, that's not the way the world works for students today. They have their own devices. They're finding their own information in their own ways. So some of the latest research is looking more at what it looks like for students who are participating in the creation of information - sharing things on Instagram, liking things, recognizing where something's coming from, recognizing when a meme is something that's actually distorting information for them. And that's some of the cutting-edge research on this. 

Lisa Guernsey: But just quickly I'll note that in the report, we talk about some very early but really promising research on some games that have been developed. So from the University of Cambridge, there's a game that's called Breaking Harmony Square. And it's a way of synthesizing the experience for a player of what it would look like to be the creator of disinformation and to kind of put them in that place of being the bad actor. But what that helps build is awareness of, oh, this is how someone else might want to lead me astray. And playing that game has led to more awareness and understanding of what's happening online and how to build your own resilience and prevent being led astray. 

Lisa Guernsey: So we're seeing really interesting studies like that coming out. And again, there's a need for so much more research. What we really want to do with this project is open up the lines of inquiry. What do we want to ask about the types of tools, materials and teaching strategies that work? How do we want to measure that? What do we need to understand about the context and the various student populations that are experiencing these kinds of teaching environments? And then, really get their input in it as well. 

Dave Bittner: You know, Peter, you mentioned that there are 50 states. And I think it's fair to say that right now, we are in a particularly divided era for our nation. And indeed, you know, one person's education is another person's indoctrination, it would seem, these days. Is it a challenge to find a common denominator, to find a starting point for this sort of thing that everyone can agree on? 

Peter Singer: You know, it's a great point, and that's why I'm usually a pessimistic guy, but I'm very optimistic about this approach. And so, you know, let's look at the challenges of mis- and disinformation. You know, they're - they play out in lots of different ways. The calls to deal with them get sucked into those divisive debates. So if you are expecting legal code change to solve this problem, good luck. We have an incredibly divided Congress that can't even agree on the problem, let alone the approach of it. 

Peter Singer: In turn, if you are looking for the platform companies to, you know, solve this on their own, they're not going to. That's just the hard reality of it. Certain elements of it are part of their business model. It's the way algorithms work. Or it's the fact that, just consistently and somewhat naturally, they tend to react after the problem has occurred rather than getting ahead of it when there are pernicious effects of things in the space - be it all the way back to early issues of child porn to, more recently, extremism or anti-vaccine conspiracy theory. Whatever it is, they often tend to react too late. 

Peter Singer: So where does that leave us? It leaves us with this third space. What's great about it is that it's nonpartisan, and it respects people's First Amendment rights. So the First Amendment rights element of it - it doesn't tell people what to say or what not to say. It's not about that. It fully respects your First Amendment rights. It's rather about equipping people with the skills to navigate this increasingly digital world safely and effectively. And those skills - and this gets to the nonpartisan side, and this is why I think, you know, whether you're a D or an R, you can get after this - they matter whether it's someone who's searching for information on the news, to public health, to - I think we can all agree we care about just our kids navigating a world of likes and lies, whether it's in their daily school life, their social life, what movie they ought to see - you know, to go back to that example of who built the pyramids. It doesn't fall into that partisan mode. 

Peter Singer: Now, there's another part of this that I think is really crucial to hit - is that in no way, shape or form are we saying this is the only solution to the problem. No, it's not. But it is something that puts us in a much better place, a much better place as individuals. But also, if you get enough people with these skill sets, you start to build a little bit more, you know, herd immunity, so to speak, in the target space - in the society that's being targeted. It makes it harder for people who are seeking to manipulate others online to do their job. And again, the people seeking to manipulate others online - it might be a Russian information warrior. It might be an anti-vaxxer conspiracy theorist, an extremist. Or it might just be a company that's trying to manipulate someone online. But having a community with more and more people who have these skills puts us in a better place. 

Peter Singer: It's - if we think about traditional cybersecurity parallels, you know, cyber hygiene. No one would ever say having good cyber hygiene means that, you know, we don't need an NSA or we don't have to worry about ransomware. Certainly not. But I'd much rather be in a world where people - I'd much rather be, you know, working in an organization where people are not clicking every single link that's out there. If I can reduce that, I'm in a much better space. And it's the same thing when it comes to these cyber citizenship skills. If we can build them in the U.S., we're going to be in a much better place. And you should really care about giving kids these skills. Whether the kid is, you know, the son or daughter of a Democrat or Republican, I don't care. I want the kids to have those skills. 

Dave Bittner: Lisa, can we talk about the educators themselves? I mean, you know, most of the teachers I know are overworked and... 

Lisa Guernsey: Yeah. 

Dave Bittner: ...Under-resourced. How do we give them the things that they need without, you know, piling on one more thing for them to do? 

Lisa Guernsey: I'm so glad you raised that, Dave, because that is absolutely the case right now with our educator workforce, and there needs to be that recognition from the start. So this absolutely is going to have to be a joint and group effort. School librarians have a really big role to play and want to play a big role in this so that they're supporting our teachers who are in the classroom and helping to find these materials for them. School librarians, in fact, we think, will be a really key audience for this portal that Peter describes because they're often the ones that are finding materials that then can be used as supplements in classrooms. 

Lisa Guernsey: For the educators today, there is so much on their shoulders, and yet, if we can support them as much as possible in this, we can really win, because they recognize their role more and more. And they see that, given how much today's students are using, again, like, you know, their own phones, their own devices to be out there exploring the world, they need some - a little bit of scaffolding, a little bit of structure to understand what it is they're seeing on that daily basis, on that hourly basis through their phones, through their social media feeds so that they can build the habits of inquiry, the habits of mind to just ask themselves some questions about what they're seeing that's coming across their news feed and what they want to share and what they might be able to kind of participate in in more responsible ways. And so teachers recognize how crucial they are in being able to support students in that way. And when they can work together to build those environments for students to talk about what they're experiencing online, talk about maybe a problematic meme - then they're really building new lines of communication and trust also with those students. 

Lisa Guernsey: So building support around teachers - this means funding them. This means new training. This means, again, support for libraries. And sadly, school libraries are often cut from budgets. We need our school librarians desperately, especially in this work. That's what's going to really help to kind of fortify or educate our workforce for this. 

Peter Singer: Dave, can I add two things to that? The first is the idea of how you need an integrated approach. So, you know, it's everything from, OK, we need to get the tools in teachers' hands, but there's also a policy side of this - a policy side at the state and local but all the way up to the national level. How do we, as Lisa put it - you know, it's everything from getting funding for these programs to getting senior leader attention to them to how we think about your prior question, updating our standards, because teachers - they don't get exclusively to decide what is taught in their own classroom. They teach to what the standards are, and these often need to evolve. But this links to a second part of it. As part of getting that support, as part of getting that policymaker attention, is to recognize that education policy, national security policy, health policy - they're all touching upon each other right now. They're all connected, right? So this is a topic where, you know, how we teach in our schools is also part of, how do we protect our nation? How do we protect our nation from foreign government disinformation threats? - to, how do we protect our nation from infectious disease, where it's not just a pandemic; it's an info-demic that surrounds it? 

Peter Singer: And so that's where we're hopeful that by framing this topic as - you know, you can't just see this exclusively as a national security topic, or you can't just see it as a cybersecurity topic, or you can't just see it as an education topic. In fact, they connect. It also allows you to bring in far broader coalitions and maybe also think about the funding in a more effective manner. And that cuts across - that's both the government side, but let's be blunt, it's also the nonprofit world side. If we all care about this problem, it's really interesting that, you know, over 450 of the task forces focused on one element of it and haven't focused on another element of it. I'll give you another data point that's interesting. In surveys of reports and experts on what to do about information disorder - and this is from multiple different organizations, actually, from a study that Carnegie did of - gosh, it was over 80 reports from 50 different organizations - they found the most frequently recommended action item was to raise the digital literacy of the target audience, what we're talking about here. So it's the most frequently recommended. And yet it's the one that gets the least amount of policy attention, the least amount of funding, whether it's from government or from the nonprofit sector. So there's huge opportunity here if we just get our act together. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: It's a really cool and exciting project. I mean, I think it has a lot of long-term potential. It was good to hear about it. You know, I think they're uniquely situated in that they're a Washington think tank, but I think they have buy-in from both sides. And there's nothing partisan about being a good cyber citizen. 

Dave Bittner: Right. 

Ben Yelin: Protecting your own networks, focusing on data privacy and, you know, protecting our systems from national security threats. 

Dave Bittner: Yeah. 

Ben Yelin: So it was really interesting to hear about. 

Dave Bittner: Yeah, I enjoyed the conversation. And I was impressed at the depth with which they're coming at this. You know, my impression is - not surprisingly, but I'm pleased at the degree to which they've done their homework here and are considering things like the history and all that sort of thing. And they have actionable things here. It's not just sort of pie in the sky, wouldn't it be great if we could do - no, no, they have (laughter) plans and... 

Ben Yelin: It's like a concrete curriculum, yeah. 

Dave Bittner: Yeah, yeah, exactly. So yeah, it really interesting to learn about. So our thanks to Peter Singer and Lisa Guernsey from New America for joining us. We do appreciate them taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.