Diversity has to be part of the mission in cybersecurity.
David Forscey: Diversity is a moral imperative, and it's an ethical imperative. But in cybersecurity, it has to be part of the mission.
Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, I've got the story of California upholding restrictions on Stingrays. Ben covers the ongoing issues with facial recognition software. And later in the show, my conversation with David Forscey from the Aspen Institute on their new Cybersecurity Collaborative Network. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben. We are back. And, of course, we took a scheduled break over the holidays, but we took an additional week off, and that was unscheduled for an interesting reason. Why don't you share with us what happened there, Ben?
Ben Yelin: Yeah, so I became one of the 20 million or so Americans who tested positive for COVID-19. So it was an interesting experience. You know, I obviously didn't have it as bad as people who lost their lives or, you know, people who've had really serious illnesses. But I was hospitalized for a few days for some low oxygen levels. Let's just say some stuff got into my lungs, which was not a pleasant experience. So I was in the hospital for about 72 hours. And very thankful to be back, thankful to be out, getting a little bit better every day and, you know, just very thankful to all the support I got, both from, you know, the medical professionals - a shout-out to the University of Maryland Medical System. They were amazing - from family and friends and from the whole CyberWire family, as well, for giving me some moral support. So, yes, it was an unscheduled break.
Dave Bittner: (Laughter).
Ben Yelin: I'm glad to be back. Certainly not an experience I'd recommend for anybody, but, you know, I think maybe it will give me some perspective and some insight on, you know, the probably the most important story happening in the world right now.
Dave Bittner: Yeah. Well, of course, we are relieved and happy to have you back. And as our Maryland governor says, wear your damn mask.
Ben Yelin: Yes. Wear your damn mask. When you have a chance to get vaccinated, definitely get vaccinated. Yeah.
Dave Bittner: Yeah.
Ben Yelin: I think that's the best we can do.
Dave Bittner: Yep. Yep. Like I said, we're so happy that you are well on your way to being recovered. And yeah, (laughter) it was - well, it's just - it's never good news when someone you care about comes down with something. But, you know, we were all - it was a bit of a kick in the gut for all of us here and me personally just because, you know, to have someone so close to you come down with something like this - I believe you, you know, sent me a message from the ER saying, well, Dave, we're not recording tomorrow, and here's why. I was like, oh, man (laughter).
Ben Yelin: It shows how much I care about the podcast that even from the ER, I was thinking about our loyal listeners.
Dave Bittner: (Laughter) Right. Exactly. Hadn't even sent a message to your wife or children yet.
Ben Yelin: Exactly.
Dave Bittner: No, it was the podcast that's first thing.
Ben Yelin: Yeah, I'll tell my parents later. But it's all about the podcast.
Dave Bittner: (Laughter).
Ben Yelin: No, but I - in all seriousness, the moral support was a big lift, you know, just to hear from people that they were thinking about me and hoping for me to get better and people from all different walks of my life. I mean, that's what really did lift my spirits.
Dave Bittner: Yeah.
Ben Yelin: And the whole CyberWire family - you, of course, first and foremost, was very supportive.
Dave Bittner: Yeah.
Ben Yelin: So I'm very, very thankful.
Dave Bittner: All right. Well, speaking of the podcast, let's jump into this week's podcast. Why don't you kick things off for us this week?
Ben Yelin: So a new year and a new instance of facial recognition software potentially being racially biased. My story is actually an op-ed from The Washington Post titled "Unregulated Facial Recognition Must Stop Before More Black Men Are Wrongfully Arrested." And the op-ed is based on a story that was first published in The New York Times from the state of New Jersey. An individual named Nijeer Parks was accused of shoplifting candy and trying to hit a police officer with a car in New Jersey. This individual, Mr. Parks, was, of course, nowhere near the crime scene. He was 30 miles away. He had a perfect alibi because he happened to be making a financial transaction at a Western Union. So it was not him. What happened is the real perpetrator had given law enforcement a fake ID. Law enforcement ran that fake ID through their facial recognition system, and Mr. Parks' name came up. A warrant was issued for his arrest. He was arrested and held in prison for 10 days. To make matters even worse, it was technology that actually kept him in prison for those 10 days because New Jersey uses an algorithm instead of cash bail to figure out how much of a risk somebody poses to the community. And because this individual had been arrested previously on drug charges, he had to sit in prison for a relatively long period of time having not committed a crime.
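Pretrial risk-assessment tools of the kind Ben describes typically assign points for factors like prior arrests. A minimal hypothetical sketch - not New Jersey's actual algorithm, whose factors and weights differ - of how a prior record alone can drive a detention recommendation:

```python
# Hypothetical point-based pretrial risk score -- a simplified sketch,
# NOT New Jersey's actual risk-assessment tool.

def risk_score(prior_arrests: int, prior_failures_to_appear: int,
               pending_charge: bool) -> str:
    points = 0
    points += min(prior_arrests, 3)            # prior record adds points
    points += min(prior_failures_to_appear, 2)
    points += 2 if pending_charge else 0
    # Higher score -> recommendation to detain rather than release.
    if points >= 4:
        return "detain"
    elif points >= 2:
        return "release with conditions"
    return "release"

# A defendant with prior drug arrests scores as high risk even if the
# current charge rests entirely on a faulty identification.
print(risk_score(prior_arrests=2, prior_failures_to_appear=0,
                 pending_charge=True))  # -> detain
```

The sketch illustrates the structural issue Ben raises: once the system treats the charge as valid input, the score is driven by history rather than by the reliability of the identification itself.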
Dave Bittner: Wow.
Ben Yelin: So this is the third such instance in the past year, the third high-profile story of facial recognition software falsely identifying Black men and those men being arrested and potentially prosecuted. And we now have a major societal problem here. And what The Washington Post is calling for is a policy change - that there should be a moratorium on facial recognition software as a federal policy or as a state and local policy until we figure out the racial biases of these algorithms. And I think the more of these types of stories we get, the more of a movement there's going to be to put a pause on this type of technology. Obviously, it's very useful for law enforcement purposes when it's used correctly and used lawfully. It certainly aids in apprehending actual criminals. But when you start to hear more of these stories of innocent people being caught up in the system because these algorithms are racially biased, I think it certainly behooves all of us to step back and take a look at what's going on.
Dave Bittner: Yeah. And, you know, I guess it's worth noting, I mean, I certainly haven't heard any cases of misidentification when it comes to white people. Have you?
Ben Yelin: I have not. And this is the third high-profile case of specifically a Black man.
Dave Bittner: Yeah.
Ben Yelin: And we had talked about one from Michigan previously on this podcast, somebody who also went through a harrowing experience of being falsely arrested. So this is a problem that is very specific. It's that something about these algorithms is misidentifying Black men. And, you know, when that leads to the consequence of an innocent person spending 10 days in jail and paying, you know, $5,000 in legal fees to defend himself, then that becomes an unacceptable problem. You know, luckily, Mr. Parks does have some legal recourse himself. His family hired an attorney, and he is suing the county in New Jersey for a number of torts, including false imprisonment. He hasn't really specified exactly what relief he wants. I would guess it might be monetary damages. But hopefully, you know, with the support of groups like the ACLU, he can get justice for what happened to him.
Dave Bittner: Right.
Ben Yelin: But in a broader sense, now that we've identified this problem - it's happened in a number of circumstances - I don't think it's acceptable for us to continue the use of this technology while this problem persists, you know, unless we really look under the hood and start to understand why this keeps happening.
Dave Bittner: You know, when I saw that you were going to be talking about this story, it reminded me of another story I'd seen come by last year, which was - it was a study from Georgia Tech and - oh, I'll link to a write-up on it. It's from the Streetsblog USA, which is a blog that covers transit issues, pedestrian issues, things like that - but talking about this study from Georgia Tech that found that automated vehicles may not detect darker-skinned pedestrians as often as lighter ones. They're between 4 and 10% less accurate when they encounter human figures with darker skin types, which - at the risk of sounding flippant here, it's - not only are you more likely to be falsely accused of a crime. You're more likely to get run over by a car.
Ben Yelin: It's just awful. I mean, it's awful that, you know, we've had a history of institutional discrimination in this country. Technology is an amazing tool. And, you know, when it's used for good, it could potentially be a way for us to get past some of this historical institutional discrimination. But now we're seeing that technology, largely because at its core, it's created by human beings, is perhaps just as racially biased as we are as people.
Dave Bittner: Yeah.
Ben Yelin: And, you know, it's just - it's something that I think as technology develops, we can't, you know, as a matter of policy, allow technology to develop and become prevalent, particularly as it's used by government institutions, unless we take account of this very serious problem. I think racial equity has to be a part of the development of these technologies and policymaking around these technologies.
Dave Bittner: Yeah.
Ben Yelin: You know, it can't just be sort of an afterthought after we have these high-profile instances of people being falsely arrested.
Dave Bittner: One of the things that the researchers at Georgia Tech pointed out was that the training data that they use on these artificial intelligence systems - they use 3.5 times more examples of white people than Black and brown people. So it would follow that they'd have more accuracy on white people than Black and brown people. But there's no shortage of photos of Black and brown people, right?
Ben Yelin: There certainly are not.
Dave Bittner: As you say, somebody's got to have their finger on the button to be equalizing these training sets. I want to be careful not to be too sympathetic to the unconscious biases of the people who are doing this because I suspect they didn't set out to do bad, right?
Ben Yelin: Absolutely not. Absolutely not. Yeah.
Dave Bittner: But the results are what they are.
Ben Yelin: Right. Right. And, you know, we were talking about this offline. You could say, sure, you know, the research subjects are going to be more white because white people are still the majority in this country.
Dave Bittner: Right.
Ben Yelin: But, you know, part of taking racial equity into consideration with this technology is actively making decisions to augment that research. So whether it's larger subsamples of African Americans or people of other races, you have to make decisions with that context in mind. It has to be part of all levels of policymaking related to artificial intelligence.
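The 3.5-to-1 training-data imbalance Dave cites helps explain why aggregate accuracy figures can look fine while one group fares much worse. A minimal sketch of auditing accuracy per demographic group, using made-up numbers rather than the Georgia Tech study's data:

```python
# Sketch: an aggregate accuracy number can hide large per-group
# disparities. The figures below are hypothetical, not from the
# Georgia Tech study.

results = [
    # (group, correctly_identified)
    ("lighter-skin", True), ("lighter-skin", True), ("lighter-skin", True),
    ("lighter-skin", True), ("lighter-skin", True), ("lighter-skin", True),
    ("lighter-skin", True), ("darker-skin", True), ("darker-skin", False),
    ("darker-skin", False),
]

def accuracy(rows):
    return sum(ok for _, ok in rows) / len(rows)

overall = accuracy(results)
by_group = {
    g: accuracy([r for r in results if r[0] == g])
    for g in {g for g, _ in results}
}
print(f"overall: {overall:.0%}")          # 80% -- looks acceptable
for g, acc in sorted(by_group.items()):
    print(f"{g}: {acc:.0%}")              # but 100% vs 33%
```

This is the kind of disaggregated evaluation Ben is arguing should be institutionalized before a system is deployed, rather than discovered after a false arrest.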
Dave Bittner: Yeah. And you'd think that we've been at this long enough, that this has come up enough over the past - even just the past couple of years that anybody working on this would be laser focused on making sure that they had taken this bias out of their work. And yet here we are.
Ben Yelin: Yeah. I mean, you'd think so. You know, part of it is that, you know, the technologists themselves aren't - not to use - this is certainly an overused term, but they might not be woke enough to realize how prevalent these racial biases are.
Dave Bittner: Yeah.
Ben Yelin: And policymakers, in some instances, might look the other way because, you know, for something like facial recognition, it's such an effective law enforcement tool in their minds that perhaps they're willing to tolerate a little bit of racial bias. So that's why, you know, I think we kind of have to attack this from all angles, both while the technology is being developed, you know, kind of institutionalize that. And then as a policy matter, make sure that if a department adopts a technology, that it's been thoroughly vetted to make sure that these types of incidents don't happen in the future.
Dave Bittner: Yeah, yeah. All right. Well, it's certainly an interesting story. My story this week comes from the Lawfare blog written by Mailyn Fidler, and it's titled "Court Upholds Legal Challenge Under California Statewide Stingray Law." Ben, you know, there's nothing we love more on this show than a Stingray story, right? (Laughter).
Ben Yelin: And then when you combine California and Stingrays, this is very on brand for us.
Dave Bittner: Yes, your dear hometown, home state of California, and Stingrays. This is an interesting read for a couple of reasons, and I'm really looking forward to digging in here with you because I think this is - I mean, talk about a conversation about policy and how policy gets done and policy gets made. So if we rewind here - I'm going to try to give an overview here, and please step in, correct me if I'm wrong or if I misspeak. But California, back in 2015, put in place a state law which governed the use of cell site simulators - so Stingrays. These are the devices we've spoken of that are able to simulate a cellphone tower, and law enforcement can use them to basically track the use of mobile devices within a certain area. And, of course, these are controversial because not only do they capture the people who they may be targeting, but they capture everybody in the area.
Ben Yelin: Oh, all of us, yes.
Dave Bittner: All of us, right. So there's a privacy issue there. They were originally developed and sold to federal-level law enforcement for the reasons of - stated reasons of fighting things like terrorism. But more and more, they've trickled down and are being used for local petty crime kinds of things (laughter).
Ben Yelin: Right.
Dave Bittner: So somebody shoplifts from a local 7-Eleven, and next thing you know, somebody's using a Stingray to track them down.
Ben Yelin: Yeah.
Dave Bittner: And so on and so forth. So the California law required that localities have a real transparency regimen in place here. They had to have - before they implemented this sort of technology, before they purchased it, they had to have public meetings about it. It had to be part of their local process. So it had to be upfront. The public had to be able to comment about it ahead of time. Do I have it right so far?
Ben Yelin: You do. Yeah, you do. So this was a statute that passed in 2015 in California. The author of this article - and I believe she's correct - says that this is the only law of its kind in the country, where, you know, it's not about mandating a warrant for the usage of a Stingray in an individual circumstance; it's a broader requirement for these sorts of procedural steps that have to be taken - a public hearing, a chance for the general public to weigh in on this policy. So that's exactly right.
Dave Bittner: Yeah. So what happened over time is that some of the localities found ways to massage how they (laughter) followed this regulation. And, of course, that...
Ben Yelin: Yes, that's putting it mildly. Yes.
Dave Bittner: (Laughter) And that led to some lawsuits, which is, you know, how these things get tested. And the upshot of it is that the courts have upheld it. Am I still on track here, Ben?
Ben Yelin: Yeah.
Dave Bittner: Now, the thing that really interested me here that I had not considered, that this story points out, is how a law like this changes the equation when it comes to how a device like a Stingray is considered by the general public - how it moves the response to a Stingray from being reactive, from happening after the device has been used and someone has been charged with a crime, which they point out in this article is a slow process. It attempts to take that process and move it to the head of the line. So before something happens to someone, as a community, people have a chance to talk about this and decide whether or not this is something we want. I think that's a really interesting point from a policy point of view.
Ben Yelin: Yeah. So as this article - and I think this gets at what you're saying. What the article says is one of the problems with Fourth Amendment jurisprudence is, oftentimes, the only chance you get to challenge this type of surveillance is after you've already been arrested and prosecuted. And we want to have a chance to challenge these types of policies before you're behind bars and fighting for your freedom.
Dave Bittner: (Laughter).
Ben Yelin: So, you know, I think that was - that's really the purpose of this law - to allow the public to weigh in in a situation where they're not facing criminal prosecution and aren't desperate and, you know, only fighting for themselves. So I think it's sort of a prophylactic that helps alleviate one of the biggest problems with the Fourth Amendment - that you can really only challenge it after you've been caught and have to go through a criminal proceeding.
Dave Bittner: Right.
Ben Yelin: So that's what I think is so interesting about this law. And, you know, I think it's good to see that the court is taking it seriously. Oftentimes, you have public meetings laws where it just becomes a box-checking exercise. We see this a lot on the federal level with administrative law, where they kind of go through the motions of allowing notice and comment before a final rule goes into place. But the enforcement is not as stringent as it potentially could be. So it's good to see that the court in California is requiring localities to take this seriously.
Dave Bittner: Yeah, yeah. No, it's an interesting article. And they say here that other cities in California may follow suit - I guess, literally suit (laughter) the suit...
Ben Yelin: Yeah.
Dave Bittner: ...That, you know, people are going to challenge this. And it may make its way around the nation as other states and localities follow what's going on here. Interesting little side note - they mentioned that Harris Corporation, who's the the main manufacturer of Stingrays - they've said that they're going to stop supplying local agencies. I don't know what that really means. I mean, my sense is that there are a lot of these things out there and that if I'm a local organization and I want to use one, chances are I can - I don't know - call the state and borrow one or call, you know, like...
Ben Yelin: They'd probably figure out a way to obtain one. Yeah.
Dave Bittner: Right. Right. Right. But that seems to me probably mostly a PR kind of thing more than anything. But...
Ben Yelin: Yeah, I think it's that the Harris Corporation does not want to be associated with the negative aspects of Stingray technology.
Dave Bittner: Yeah. So interesting article. Again, that's over in the Lawfare blog. We'll have a link to that in the show notes - definitely one that is worth your time. And those are our stories this week.
Dave Bittner: Of course, if you have any questions for us, we would love to hear from you. We have a call-in number. It's 410-618-3720. That's 410-618-3720. You can also email us. It's email@example.com.
Dave Bittner: Ben, I recently had the pleasure of speaking with David Forscey. He is from the Aspen Institute. And they have an initiative called the Cybersecurity Collaborative Network. Interesting stuff. Here's my conversation with David Forscey.
David Forscey: So the Aspen Institute is what many people would term a think tank, but we're not supposed to use that term because, really, while we do a lot of research and a lot of typical think tank work, where we really specialize is convening. It was founded many years ago in Aspen, but our headquarters is in Washington, D.C., and we convene across a range of areas - CEOs, young entrepreneurs in various fields. And what the Aspen Cybersecurity Group does is not just convene chief information security officers, which, you know, a lot of groups do, but we also have, you know, Vint Cerf from Google. We have the general counsel at AIG and Apple. We've got the CEOs of Northrop Grumman and Johnson & Johnson, two congressmen - Congressman Will Hurd and Congressman Jim Langevin - and leading researchers. And then we have former directors of the NSA Michael Hayden and Keith Alexander. So when this group speaks, we really try to speak with many different perspectives. And that's essential because what we're trying to do here is operationalize consensus solutions to big cybersecurity problems that affect multiple sectors, private sector and government - you know, big national security-level cybersecurity problems. And when we speak with this many perspectives, the ideas we advocate for are less likely to run into the kind of Washington buzz saw that often greets really great ideas in cybersecurity, if that makes sense.
Dave Bittner: Yeah, it absolutely does. So, I mean, this is - the publication is titled "A National Cybersecurity Agenda for Resilient Digital Infrastructure." What was the impetus for creating this report?
David Forscey: So, obviously, there's a lot going on right now. We had an interesting year, and we have a major transition in Washington coming up. We're going to have a new White House, a new Congress, new chairmanships, and staffers coming in and going out. There's about to be a lot of change in D.C. And we wanted federal policymakers specifically to be focused on areas where we thought we could really move the needle in the next term. For the next two to four years, where are some areas where we can really make tangible progress? So that tends to be areas where there's already a foundation for a lot of success but where, you know, we just need some oomph from the federal government to really scale that success. So that's what we try and lay out here. Now, there are some things that we don't touch on for specific reasons. For instance, we don't speak really at all about offensive cyber operations or a lot of aspects of the black budget because a lot of others are focused on that. And we actually feel like, sometimes, that gets a little too much attention. And there needs to be a little more focus in Congress on the defensive aspect of national resiliency.
Dave Bittner: Well, one of the things that caught my eye here in the report is that in the foreword, you begin by speaking about a public health crisis that took hold of London. Listeners to this show know that I often find that to be a useful metaphor to compare cybersecurity to public health. Can you take me through your perspective on that aspect of it?
David Forscey: The pandemic is - it's an interesting time to be discussing this because before the pandemic, many public health experts were warning for many years that we were going to hit a serious problem, and we finally did. And it was clear that we just didn't make the investment to prepare for it. And in cybersecurity, it's really the same way. It's like a boiling frog, right? We have hospitals being hit every day. We have new breaches occurring every day. We are now in the midst of the SolarWinds investigation, which could end up being one of the more significant cyberattacks we've had. And yet we still just pay lip service to it. Companies do not invest in the ways they should. Government does not invest in the ways it should. Now, I don't know who it was that said, show me your budget, and I'll show you your priorities. And so we just want people to start treating it less like a problem, which often gets people to shut down, and to start thinking of it more as just the foundational infrastructure on which society depends. And it's just like water infrastructure, right? No one would claim that we do not need water infrastructure to keep our water clean. It's the same thing. Digital data is water. Cybersecurity is the infrastructure that actually makes it work for us. So that's why we opened with that metaphor, if that makes sense.
Dave Bittner: Do you suppose, I mean, we need something along the lines of the EPA for cyber, an organization who's responsible for looking over those things?
David Forscey: That's something that some people have recommended. Personally, we don't take a position on that in this report. Creating a whole new agency for nationwide, whole-of-society cybersecurity would probably not be doable in the next term. And what we really want to focus on here is actionable things that can move the dial. Most, if not all, agencies do have offices that focus on their particular corner of the cybersecurity problem. But it's something that certainly should always be open for discussion.
Dave Bittner: Well, there are five areas that you all focus on here in the report. Can we go through those together?
David Forscey: We start with education and workforce development. And this is something a lot of people in the space are very familiar with. We have about 520,000 open cybersecurity roles in the United States today. And we just haven't made a lot of progress. You know, in the past - I think it's two years - our open positions have increased by 62%. And it's just outpacing our ability to actually fill these jobs. We start with the assumption that one of the reasons that's the case, one of the reasons we're not making more progress, is we're fundamentally restricting our view of cybersecurity talent, right? There's about 200 million - I think it's 212 million - people of working age in the United States. If just a quarter of 1% of those people are capable of and interested in performing cybersecurity roles, we have more than enough to fill 520,000 jobs. But the way a lot of organizations hire and search for cybersecurity talent is kind of outdated. And it really has a narrow view of who has potential, right? So a lot of open positions in cybersecurity, a lot of entry-level positions - they require a four-year degree and a CISSP certification. Well, CISSP is really a mid-level management certification. And a lot of people don't have access to the funding required to complete a four-year degree, despite the fact that many of those people would perform excellently in those roles. So one of the things we can do to expand that talent aperture is make sure that, number one, companies are, you know, reviewing their job qualifications to make sure they're not onerous. And I can tell you, when I speak with a lot of CISOs, a lot of companies about this, they will just nod their heads vigorously. They go, yes, why are we requiring that?
Dave Bittner: Right (laughter).
David Forscey: You know, so we just need to have them - they just need to sit down with their HR folks and talk about that, right? And they need to have evidence that shows other companies have done it, and it works. It's also things as simple as changing the language in job descriptions. How you actually describe roles has an impact on who applies for them. And companies like Cisco and IBM that have changed their job descriptions have really seen an increase in the number of women and the number of people of color who are applying for these jobs because, ultimately, diversity is a moral imperative, and it's an ethical imperative, but in cybersecurity, it has to be part of the mission because we just can't fill these roles if all we go for are MIT grads with four-year degrees. That's the education piece. And what we get into here in this report is how the federal government can assist. Even things just like improving grant funding for a lot of the organizations that are already working on this - there are some really great organizations out there - cyber.org - I'll shout out NPower. Really, they're doing great stuff. And they just need more funding to scale what they're already doing. So we go into things like that.
Dave Bittner: Well, let's move on to public core resilience. What do you suggest there?
David Forscey: So this is one of my favorites and one that, depending on your listeners, they may be more or less familiar with. Basically, the internet runs on a foundation of hardware and software that no one person runs. So here we're talking about things like the domain name system; border gateway protocol, the protocol that actually sends data zipping around the world and lets computers know where to send the data; and things like public key infrastructure, the protocols that allow you to speak secretly with strangers online. These things have fundamental vulnerabilities. And no one person is in charge of closing those vulnerabilities. So this is an area where governments can show more leadership. Now, this is ultimately a private sector problem. The networks in cyberspace are run by the private sector. But in some cases, government can shape incentives to make sure that we're doing the things that we already know need to be done. Solving the vulnerabilities that affect the public core is frequently not a technical problem. I'll tell you, I was just speaking earlier today with some folks about GPS receivers. So we know how to make GPS receivers that are much more resilient to spoofing and jamming. And this is important in the public core because position, navigation and timing is a very important component of a lot of digital infrastructure. So we know how to make these receivers better. But a lot of customers aren't aware of the threats, and so they are not demanding that the GPS receiver manufacturers actually make more resilient receivers. So what is the government's role in increasing awareness among customers so that they create market demand for more resilient GPS infrastructure? So that's a lot of what the public core is about. It's generally a lot of areas where no one person is in charge, and there's a big coordination problem, and that's where we think the government can really play a role.
Dave Bittner: Well, let's move on to supply-chain security. Certainly top of mind for a lot of folks, as you and I record this today.
David Forscey: Yes. So bottom line is - and we don't say this in the report, but if you read between the lines - we can't target supply-chain interventions just based on country of origin. If you spend all your time focusing on Huawei and ZTE, you might overlook the fact that supply-chain risks - and CISOs know this - are everywhere, including from U.S.-based companies such as SolarWinds. We talk about two levels of supply-chain security, right? One is national-level policy to ensure that we always have robust market competition so that companies are never dependent on insecure products. So that is the kind of Huawei scenario - not in the United States, but, you know, in Europe, you have some companies that can simply only afford to use Huawei. You never want a situation where a company only has one option for a product, and therefore, there's less incentive to make that product safer. So you can incentivize better security by incentivizing market competition. That's kind of the national level. But then you just want to make sure that organizations are aware and are following the proper best practices to manage their own supply chains. And you want to make sure that there's transparency in the products so that organizations can do a better job managing their supply chain. And here we get into things like the software bill of materials project, which is ongoing at the National Telecommunications and Information Administration, to make sure that more and more people can have an ingredients list of what actually goes into a piece of software so that when a new vulnerability is detected and published as a CVE, a CISO can go and say, now, wait a second. Am I running software that's affected by this? Oh, I know I am because I have those ingredients. So transparency is really important to make sure that organizations can manage their risk.
And at a more national level, you want to make sure we have the government incentivizing market competition in the most critical sectors to ensure that there are market incentives for better security.
Dave Bittner: And then the last two areas you focus on out of the five are measuring cybersecurity and promoting operational collaboration. Can we go through those?
David Forscey: Yeah. And I should point out that these are not in any particular order. These are all equally important in our minds. So measuring cybersecurity - so right now, we know where COVID-positive tests are taking place. We generally know which hospitals are at capacity. And we can make national, state and local level policy decisions based on that data. We have nothing like that in the cybersecurity space. We don't really know who's getting hit. We don't know why they're getting hit. We don't know what resources people have. We need a concerted government effort to start combining data collection so that we can start making evidence-based policy because if you're not making evidence-based policy, you're just throwing spaghetti at the wall and seeing what sticks. So that's essential going forward for the new administration - to really focus on how we can start actually measuring whether we're making an impact. And the place where we start is just the high-level data. Then we move on to the metrics. And there's a lot of good stuff going on in the industry about developing metrics. And we feel the first step is for the government to start collecting higher-level data and sharing that so researchers can determine whether policies are actually being effective.
David Forscey: And promoting operational collaboration is really about moving beyond the information-sharing discussion to start coordinating joint action and planning between high-capability actors in the private sector and high-capability actors in the federal government. So this is to make sure that - let's say a major company is about to launch a takedown operation against a botnet controlled by a criminal organization. You want to make sure that government agencies are aware, can coordinate their own takedown activities and don't get in the way because if they move too early, they might tip the hand of the private-sector organization and screw up the whole operation. And you want to make sure that all these people really know each other - and this might sound wishy-washy, but the fact is you want to start building friendships between, say, analysts at this bank and analysts at this government agency because that's when you start building the trust that's required to actually overcome a lot of the cultural barriers that exist.
Dave Bittner: It seems to me - and let me know if you agree with this - that cybersecurity is one of the few areas that has managed, in this topsy-turvy political environment that we find ourselves, to still have bipartisan support. It's hard to find someone who says that better cybersecurity is a bad thing. First of all, (laughter) is that an accurate assessment?
David Forscey: Well, you know, that's why it's really - when you compare it to, say, the disinformation and misinformation space - which we do touch on in this, but we felt it deserved an entirely different report - there's less agreement there on what is bad and what is good. In cybersecurity, there's really full agreement on what we're trying to do, so I think that's why it makes it easier. And I think you need look no further than things like the U.S. Cyberspace Solarium Commission, which was fully bipartisan. You look at the recent National Defense Authorization Act, which had some really major cybersecurity provisions that stuck in there to the end. And it hasn't been enacted yet, but it almost certainly will be. And so I think, I mean, you hit the nail on the head. Everyone generally agrees on the goal. And that gets you quite a bit of the way. You know, where the disagreements tend to crop up is how we actually get there. What we're trying to do is point out some areas where there's generally a lot of agreement on even how we get there. Yeah, that's what we're trying to do here.
Dave Bittner: And what do you say to that person who's listening to this, who has an interest in this sort of thing, would like to see these sorts of things move forward, you know, supports your efforts? How can they participate? Is it as simple as reaching out to their representatives?
David Forscey: First of all, they should always feel free to reach out to us. One of our roles is to act as a nexus for the cybersecurity policy community. So anyone who's interested should reach out to me, David Forscey at the Aspen Institute - happy to talk to you, happy to figure out how to fit your capabilities into what we're trying to do. So that's No. 1. No. 2 is that more members of Congress need to see this as a priority. And it's easy to say it's a priority. They need to hear from their constituents. They need to hear from the businesses in their communities. They need to see that ignoring this has a tangible impact on their constituents. And so I would certainly encourage everyone to - people might be surprised. Writing and calling your congressperson actually has an impact, especially when people do it a lot. They pay attention. Their staffers take the calls. They read the letters. And they send that information up. So that is always an effective way.
David Forscey: I think another way is to just make sure that if you're interested in this issue, you need to read up on the technology, but you also really need to read up on the law and the policy behind this. And you need to read up on the economics. I think it was - maybe it wasn't Dan Geer, but I think it was he who said that amateurs study security and professionals study economics. And a lot of the problems in this space are so tough because of the economics behind them. It's very easy to just say, oh, this is a great idea - why doesn't the industry do it? But if you don't understand the industry incentives behind it, you won't understand why it's been such a problem for 20 years and why we still haven't made progress on X, Y or Z issue. And so I think it's really important for anyone who's interested in getting involved to study the technology, the policy and the law, and the economics behind all of it.
Dave Bittner: All right. Ben, what do you think?
Ben Yelin: One of my favorite things about that interview is the metaphor that he made between public health and cybersecurity. And it is so well-timed that in the report that he wrote, he used an example based on public health. But it's this idea that it's a problem that experts bring up. You know, you have infectious disease experts during nonpandemic times saying, guys, this is going to be a big problem. You know, if there's a highly transmissible respiratory disease, we're all going to be really screwed. You have cybersecurity professionals saying, guys, if there's a major cyberattack on our critical infrastructure, you know, we're going to be screwed.
Dave Bittner: Right.
Ben Yelin: And no one ever really listens to them. I mean, as he said in the interview, people might say like, all right, we'll throw a little bit of money here to ameliorate the risk, but nobody ever takes it seriously enough until it actually happens. You know, and now with the cyberattack that we've seen over the past few weeks in this country, combined with what's happened with the pandemic, you know, these events that are unlikely to happen do end up happening.
Dave Bittner: Yeah.
Ben Yelin: So I just thought that was a really interesting metaphor for him to use.
Dave Bittner: Yeah, yeah. I'll tell you, I have come to believe over the years that we are a reactive species. We are (laughter) not good at getting ahead of problems, generally.
Ben Yelin: No, I mean, there are a few of us who are good at analyzing risk and anticipating risk, you know, especially in the emergency management field, which I'll give a shout-out to.
Dave Bittner: Yeah.
Ben Yelin: But, you know, for most people, it's just getting to the next day. We have a limited number of resources. Why should we be spending billions of dollars on something that might happen in the future?
Dave Bittner: Right, right.
Ben Yelin: Well, it turns out there are good reasons to do that.
Dave Bittner: Yeah. Well, again, our thanks to David Forscey for joining us. He's from the Aspen Institute. That effort is called the Cybersecurity Collaborative Network. If you're interested, do a search for that. I'm sure you can find all sorts of information about it. We thank him for taking the time for us.
Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.