Caveat 3.18.20
Ep 20 | 3.18.20
Dressing for privacy.
Transcript

James Stavridis: We've come from punch cards and BASIC as a language and COBOL as a language to a need to create a separate branch of the armed forces because of the inherent complexities of cybersecurity. 

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, Ben shares a story about dressing for privacy. I've got the tale of location data putting an innocent man at the scene of a crime. And later in the show, my interview with Admiral James Stavridis. He is the former supreme allied commander of NATO. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent a recurrence of a single nonrepeatable event; others say it's a way that the suits play CYA. Still others say it's whatever happens to reside in those binders the consultants left behind them right before they presented their bill. How about a more positive approach? As KnowBe4 can tell you, better policy means better security, and getting the policies right is a big part of security; so is setting them up so that your people can actually follow them. We'll hear later in the show about how you might approach policy. 

Dave Bittner: And we are back. We've got some good stories to share this week. Ben, why don't you start things off for us? 

Ben Yelin: Sure. So as one could probably tell by looking at me, I'm no fashionista. And this is not a fashion... 

Dave Bittner: (Laughter) Let me just say, even though this is a podcast, I can vouch for that. 

Ben Yelin: Yeah, he can confirm. My shirts are frequently ill-fitting. 

Dave Bittner: (Laughter). 

Ben Yelin: And this is not a fashion podcast, but this story actually has fashion elements to it. And it comes from The New Yorker. It's called "Dressing for the Surveillance Age." And this author went through the process, after consulting a number of academic experts in the cybersecurity field, of designing what he called an invisibility cloak - basically, an outfit or costume that would protect him against the pervasive surveillance state. So that consisted of a number of things. He had something called a Silent Pocket, a pouch he carried his phone in that would prevent his cellphone location from being tracked. He had a cloak that would shield his face from facial recognition technology. He had another item of clothing that would protect him from license plate readers and so on and so forth - so, based on consultation with experts, he had protected himself via his clothing from a number of surveillance techniques. 

Ben Yelin: One thing that was particularly interesting about this is the clothing itself or the items that he used to protect himself were designed by artificial intelligence. So it's the artificial intelligence that's figured out how to recognize people. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: They've figured out the surveillance tools - you know, which pixels to look at to recognize people. And because they've done that, artificial intelligence is incredibly useful for generating what are called adversarial examples - so basically, something that you're wearing or using on your face or on your body that will confuse the artificial intelligence into thinking that either you do not exist or that you are somebody else entirely. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: This fascinates me because, as the article points out, the researchers aren't exactly sure why certain patterns throw off these automated readers because they view things differently than humans do. And so they find these patterns that seem to work, and they're not always sure why (laughter). That's a fascinating and perhaps slightly unsettling reality, I suppose. 

Ben Yelin: Yeah. And I think it takes a lot of trial and error to figure out exactly what these machine learning technologies are recognizing. You know, to our eyes, things are relatively simple. We process images based on what we've already seen. I think this article goes through the example of, I see a cat, my brain is triggered, I recognize that as a cat because I've seen a cat a million times. 

Dave Bittner: Right. 

Ben Yelin: Probably too many times. 

Dave Bittner: (Laughter). 

Ben Yelin: But, you know, machine learning and artificial intelligence doesn't quite work that way. It's frequently checking against a database, and whatever image is being captured or detection is being run has to be matched against a larger base of data. And so it kind of works, as you said, in mysterious ways. So I think that's one of the things that was so interesting about this article. 
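[Editor's note: The match-against-a-model idea, and why tiny targeted changes can break it, can be sketched with a toy linear "matcher." Everything here - the weights, the 64-dimensional stand-in for an image, the step size - is invented for illustration. Real recognizers are deep networks, but the same gradient-sign trick (the FGSM-style adversarial perturbation researchers use) applies to them too.]

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=64)          # stand-in for a learned model's weights

def score(v):
    """Matcher confidence: positive means 'this is the target face'."""
    return float(w @ v)

# Build an input the matcher confidently recognizes (score exactly 1.0).
x = rng.normal(size=64)
x += (1.0 - score(x)) / (w @ w) * w

# Adversarial perturbation: step every component slightly against the
# gradient of the score. For this linear model the gradient is just w.
eps = 0.05
x_adv = x - eps * np.sign(w)

# Each component moved by at most eps - a tiny change to the image -
# yet the matcher's score flips from positive (match) to negative.
print(score(x) > 0, score(x_adv) < 0)   # → True True
```

[The unsettling part the hosts describe - "they're not always sure why" - shows up here too: the perturbation pattern looks like noise to a human, but it is precisely aligned with what the model is measuring.]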

Dave Bittner: Yeah, it strikes me, too, like, how with our human visual system, sometimes at first glance, you think you see something. For example, I remember, you know, driving in my car one time, and at first glance, I thought there was a turtle in the road. And I thought, oh, gosh, you know, I better swerve to not hit this turtle. But when I looked at it closer, it was just a paper bag or something. 

Ben Yelin: Right. 

Dave Bittner: And once my brain realized it was a paper bag, I could no longer see the turtle. 

Ben Yelin: Right. 

Dave Bittner: Right? And initially, I believed it was a turtle. And so it's fascinating to me how these artificial systems do and do not work in similar ways, how we can fool them. One of the things that struck me in this article was not only just hiding but this notion of poisoning the data by wearing clothing covered with license plate images. What was going on with that? 

Ben Yelin: Because we have this pervasive system of automatic license plate readers, ALPRs, it's very easy for law enforcement to track vehicle movement. I think you had shared the story of somebody who wore a dress laden with a bunch of license plates belonging to abortion providers. 

Dave Bittner: Right. 

Ben Yelin: And this was not somebody who was anti-abortion; this was somebody who was pro-choice, and they were doing that to throw off these license plate reading systems. So if there were readings at locations where those individuals weren't actually located, that would throw off law enforcement. 

Dave Bittner: Right. 

Ben Yelin: You would have false positives in terms of those license plates showing up on our surveillance systems. 

Dave Bittner: Well, and not just law enforcement because the ALPR data is widely available to all sorts of people. You can buy that. 

Ben Yelin: Absolutely. As we've talked about on this podcast, it's very commercially available. A lot of companies and corporations use it for non-law enforcement purposes. And they also collect that information, put that in large databases, which eventually law enforcement can access. So this is, you know, another thing that's very interesting about this article. Sometimes it's not just cloaking yourself or preventing certain types of detection or using adversarial examples, which try to prevent the artificial intelligence system from recognizing you; sometimes one thing you can do is actively confuse the surveillance system into thinking you are somewhere where you are not or into thinking that you are someone you are not. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, this is what I assume - when we talk about things like Tor, it's basically the same rationale. You are trying to trick whomever is watching into thinking that you are somewhere where you aren't actually located. 

Dave Bittner: Right. 

Ben Yelin: And I think that this article gets at how, as we've entered an age of more pervasive surveillance, the measures are going to get increasingly drastic. I don't think we're all going to be wearing invisibility cloaks, but there is going to be a market for products that can confuse these artificial intelligence systems, these surveillance systems. And one thing I think I mentioned to you is we need that market to exist because our legal system is so far behind in protecting people against these types of surveillance techniques, protecting people's personal privacy. If the legal system is not going to fulfill that role, then the private sector has to step in and create a marketplace for things that can protect people's personal security. Normally, we've thought of those things as, like, you know, robust encryption systems, but this article indicates that we can get a lot more creative in how we fool our surveillance overlords. 

Dave Bittner: Yeah. You know, recently, when I was out in San Francisco for the RSA Conference, we were doing a little bit of sightseeing. And we were out at Fisherman's Wharf, and there was a gift shop there, you know, kind of place where they sell sweatshirts and little knickknacks and so on and so forth. And they had a collection of masks that were like a stocking, that you'd pull over your face, pull over your head, but printed on that mask was a complete head of someone else. 

Ben Yelin: Yep. 

Dave Bittner: So you could look like an old man or a woman or just - there was a variety of them. But it struck me as interesting that, first of all, this thing exists, right? (Laughter) You know, for $10 at a gift shop, you could put one of these on... 

Ben Yelin: And avoid facial - you know, facial recognition. 

Dave Bittner: Right. Right. But the other thing I wonder is where does this put us in terms of social norms? In other words, if I'm walking down the street and I see somebody coming down the sidewalk who's wearing some sort of disguise or mask, that's probably going to put me ill at ease. 

Ben Yelin: Yeah. I think the author mentioned it in the context of this piece as well. One of the experts he spoke to was basically like, look; you can go on the street with a bunch of tinfoil, you know, attached to your body and your head. 

Dave Bittner: Right (laughter). 

Ben Yelin: And that may actually prevent certain types of surveillance. You could be wearing a full tinfoil body suit. Ultimately, that's not going to help you evade detection because every single person on the street is going to look at you and think that you're, you know, the strangest person that they've ever seen. 

Dave Bittner: Well, let me ask you this, though. From a legal point of view, let's say I buy myself one of these pullover masks and I'm walking down the street for everyone in the world to see and I happen to walk by the local police officer who's just, you know, doing his duty patrolling the neighborhood, what are the range of responses that that police officer is allowed to have to me in my mask? Is it legal for me to be walking down the street wearing a mask? 

Ben Yelin: Absolutely. Yeah. Nothing I know in any legal regime would prevent that. You know, I think there are certain circumstances where the police could force you to actually identify yourself. But they would have had to have some reason to pull you over in the first place, or, you know, they'd have to have some level of suspicion. 

Dave Bittner: But is the mask itself enough for me to be suspicious, for the police officer to say, hey, buddy, take off the mask? 

Ben Yelin: I don't think so. Otherwise, Halloween would be filled with... 

(LAUGHTER) 

Ben Yelin: ...A lot of unnecessary detention of children who are wearing, you know, masks from the movie "Scream" or something. 

Dave Bittner: Interesting. 

Ben Yelin: And, you know, all of us, to a certain extent, wear the types of things that might fool surveillance systems. You were talking about how somebody you know, their face ID on their iPhone will not open if that person's contact lenses are not in. 

Dave Bittner: Right. 

Ben Yelin: Because it only recognizes the face that was established with the contact lenses in. 

Dave Bittner: Right. Right. Right. 

Ben Yelin: So if we start to ban all different types of disguises or if that serves as a rationale for law enforcement, then where do we draw the line? Are we going to force people to - because, you know, a facial recognition system can't pick up a certain type of glasses or contact lens, are we going to force people to stop wearing those products? I just don't... 

Dave Bittner: Well... 

Ben Yelin: I don't think that's practical. 

Dave Bittner: Right. And there's the practical everyday situation that there are plenty of people who wear clothing every day that covers their whole body, for religious reasons, for - you know, someone's wearing a burqa, that sort of thing. 

Ben Yelin: Absolutely. 

Dave Bittner: And they're well within their right to do that. So... 

Ben Yelin: In the United States, they absolutely are, yes. 

Dave Bittner: Yeah. Interesting. Interesting. 

Ben Yelin: So yeah, I mean, that's why the story is so interesting. If you really did want to go (laughter), you know, the full nine yards and protect yourself against surveillance, there are very creative ways to do so. And I would recommend reading the full article because - it's very long, but it explains, you know, basically every element of this invisibility cloak and specifically how it works. 

Dave Bittner: Yeah. It's a good article. It's from The New Yorker. We'll have a link in the show notes. My story this week comes from the Naked Security blog. This is from the folks over at Sophos, written by Lisa Vaas. It's titled "Google Data Puts Innocent Man at the Scene of a Crime." And this is about a gentleman named Zachary McCoy. He's from Florida. Always seems like interesting stories come out of Florida. 

Ben Yelin: Yeah. 

Dave Bittner: Area - Florida man does X. 

Ben Yelin: (Laughter) Right, exactly. 

Dave Bittner: So this gentleman, Zachary, he was notified by Google that Google had received a legal request for his location data. Now, this is something that Google does as a matter of course. If they get a request like this, they will notify the user that that request has come in. Now, what happens is this puts into place, basically, a seven-day time period where he can appear in court and block the release of that data. 

Dave Bittner: So this gentleman looks into what's going on, and turns out there was a burglary that had taken place near a route that he takes to ride his bicycle to his job. So like most of us, I think we have our devices, and they're beaconing out, sending our location data - in this case, to Google. And he happened by this home that had been robbed, and that put him in the situation of, in this case, initially being their No. 1 suspect. 

Ben Yelin: Yeah, it's really something that we're seeing more frequently, this technique of what's called geofencing, where based on the location of a particular crime - in this case, it was a burglary - law enforcement requests or subpoenas from a telecommunications company records of every device that was in that location. So usually, they'll get, you know, a full list of devices which have unique identifiers that aren't that person's name that were within X radius of the crime scene at that particular time. And you know, they can start to cull the list based on their individual suspicion. 

Ben Yelin: So you know, let's say you know that the burglary took place over a period of several minutes. Well, maybe somebody who was only within that radius for a period of seconds wouldn't trigger that geofencing. But it, of course, can lead to a lot of false positives, as it did here. This was just a person going on their morning bike ride who happened to be in the general area of a burglary. 
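[Editor's note: The culling step Ben describes - keep only devices seen inside the radius during the window, then filter by how long they lingered - can be sketched in Python. All device IDs, coordinates, and thresholds below are invented for illustration; real geofence returns come from the provider under legal process, not from code like this.]

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import asin, cos, radians, sin, sqrt

@dataclass
class Ping:
    device_id: str        # anonymized identifier, not a name
    lat: float
    lon: float
    time: datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def geofence_hits(pings, lat, lon, radius_m, start, end, min_dwell):
    """IDs of devices seen inside the fence during the window whose
    first-to-last sighting spans at least min_dwell."""
    seen = {}
    for p in pings:
        if start <= p.time <= end and haversine_m(p.lat, p.lon, lat, lon) <= radius_m:
            first, last = seen.get(p.device_id, (p.time, p.time))
            seen[p.device_id] = (min(first, p.time), max(last, p.time))
    return {d for d, (first, last) in seen.items() if last - first >= min_dwell}

# Hypothetical scenario: a burglary over several minutes; one device
# lingers at the scene, another (a passing cyclist) does not.
t0 = datetime(2019, 3, 29, 9, 0)
pings = [
    Ping("device-A", 29.6510, -82.3250, t0 + timedelta(minutes=1)),
    Ping("device-A", 29.6511, -82.3251, t0 + timedelta(minutes=8)),  # lingered 7 min
    Ping("device-B", 29.6512, -82.3249, t0 + timedelta(minutes=2)),
    Ping("device-B", 29.6513, -82.3248, t0 + timedelta(minutes=3)),  # passed through
]
hits = geofence_hits(pings, 29.6510, -82.3250, radius_m=150,
                     start=t0, end=t0 + timedelta(minutes=10),
                     min_dwell=timedelta(minutes=5))
print(sorted(hits))  # → ['device-A']
```

[Requiring a minimum dwell time is exactly the kind of filter that could exclude a passing cyclist; without it, both devices in this sketch come back as hits.]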

Dave Bittner: Right. 

Ben Yelin: So we know from the Carpenter v. United States case that we've spoken about that the government does need a warrant to collect historical cell site location information from an individual. But that doesn't necessarily apply to geofencing. What law enforcement would say is, well, we're not actually searching any data individually and using that as the sole basis for a warrant that would lead to an arrest. We're combining this location data with other indicators that would establish probable cause that a person has committed a crime. 

Dave Bittner: Right. It's important to be clear here that that first batch of information that Google turns over does not have people's names in it. It has identifiers. But... 

Ben Yelin: Right. Like the phone number, yep. 

Dave Bittner: ...It doesn't say who they are. 

Ben Yelin: Yeah. So that might be one sort of mitigating factor of geofencing. On the other hand, this person had to go through a pretty significant ordeal because Google sent this person a form letter saying, basically, you have one week to issue a legal challenge against this geofencing. 

Dave Bittner: Right. 

Ben Yelin: You know, otherwise, we, as is our obligation, are going to turn this information over to law enforcement. 

Dave Bittner: Mmm hmm. 

Ben Yelin: And I will say, you know, I think a lot of companies do this, but it's not universal that they'll notify the consumer if they've received a subpoena from the government. So I think... 

Dave Bittner: Yeah. 

Ben Yelin: ...You know, that's commendable on Google's part. 

Dave Bittner: Right. 

Ben Yelin: But this person had to go through a lot. They hired an attorney, went to court, went to establish a case. I guarantee you most people do not have the resources to engage in that activity in the period of a single week. Most people will probably just have to accept the consequences. They'd get this notification and just hope that there isn't other false evidence that would lead to this person's arrest. 

Ben Yelin: If we take law enforcement at their word that they're not going to arrest somebody just because they showed up in a geofence search, then it's unlikely that this person would have been subject to arrest. But you know, without those assurances, without knowing that 100% of the time they're actually gathering other evidence to prove in front of some sort of magistrate judge that a crime had been committed - without knowing that that happens all the time, I think it'll leave people nervous when they get that type of letter from Google. 

Dave Bittner: Are there any cases that you're aware of that are making their way through to push back on this - or anybody trying to say that geofencing is, you know, out of bounds under the Fourth Amendment and so on? 

Ben Yelin: Yeah. So there are a lot of privacy and civil liberties advocates who have been fighting against geofencing for years. I know the Electronic Frontier Foundation and the ACLU have been at the forefront of challenging geofencing. There have been some district court cases, but there hasn't been any sort of agreement among higher federal courts as to whether geofencing violates the Fourth Amendment. And until we have that sort of definitive ruling, we're going to see geofencing continue because I think it is an effective law enforcement tool. 

Ben Yelin: From law enforcement's perspective, you're limiting the universe of people who could have committed a crime, so it's a good way to rule out suspects. 

Dave Bittner: Right. 

Ben Yelin: You know, if there were only X number of devices within a certain proximity of a burglary, that's extremely helpful information if you're trying to narrow it down. Then you say, you know, you have a description of a white male, you know, in their 30s. And you can do good detective work, police work to figure out, you know, which of the devices belong to people fitting that description. And you can sort of go from there. 

Dave Bittner: Yeah, yeah - makes sense. The article that we'll have a link to in the show notes also has information about how you can turn off Google's location history so that if you don't want yourself tracked using your - you know, if you're logged into your Google account, you have the ability to turn that off. Of course, like so many things in this area, it is on by default. (Laughter). 

Ben Yelin: It is. And it's not just, you know, Google that's tracking you if you have an Android device. It's the applications themselves that track your location for a variety of reasons. 

Dave Bittner: Right. 

Ben Yelin: I know we've talked about that in the past. It's applications you wouldn't necessarily expect - the one that gives you sports scores and... 

Dave Bittner: Right - weather, things like that. 

Ben Yelin: Yeah, exactly. 

Dave Bittner: Yeah. Again, the article is "Google Data Puts Innocent Man at the Scene of a Crime." This is from the Naked Security blog by Sophos. We will have a link in the show notes. And those are our stories for this week. We would love to hear from you. If you have a question for us, our call-in number is 410-618-3720. That's 410-618-3720. You could also send us an audio file of your question for Ben or for me. And you can email us at caveat@thecyberwire.com. 

Dave Bittner: Coming up next, my interview with Admiral James Stavridis. He is the former supreme allied commander of NATO. But first, a word from our sponsors. 

Dave Bittner: And now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there, you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound but that they're also distributed, posted and implemented. That's where the policy management module of their KCM platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that, KCM does the job at half the cost in half the time. It's your policy, after all, implemented in a user-friendly, frictionless way. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: Ben, when I was at the RSA Conference, I had the pleasure of speaking with Admiral James Stavridis. He is the former supreme allied commander of NATO. He's a retired four-star admiral in the U.S. Navy - quite an interesting career. He's retired now, of course, but has some really interesting thoughts about where we need to go in terms of cybersecurity. Here's my interview with the admiral. 

James Stavridis: I think from a business perspective, the vendors, the various actors who are here representing an entire ecosystem recognize that because of the threat that's out there in cybersecurity, they're in a growth industry. And so particularly - at the moment, Dave, as we head toward the 2020 elections, there is a vivid example of why we need cybersecurity. And so it's a pretty upbeat crowd. It's also a hard-partying crowd. It's a great conference to come to. I go occasionally to Davos, the big global conference in Switzerland which is very stuffy by comparison. This is a fun crowd. 

Dave Bittner: Interesting. Can you give us a little bit of perspective from your history in the military and now in the private sector, the things you've seen, the growth and evolution when it comes to cybersecurity in the government space, in the military and so forth? 

James Stavridis: Yeah. Just to give you my personal history, which is a bit of a trajectory of this, I suppose - in the mid-'70s, I'm in Annapolis. And into my classroom walks Rear Admiral Grace Hopper - Amazing Grace, the mother of COBOL. And she's there to tell us about COBOL, this magical way of communicating with a computer. And of course, we do it with paper punch cards to make very simple commands. So that's the mid-1970s. 

James Stavridis: Now flash-forward to today. At every step of my career, I've seen the deeper and deeper engagement of the Navy and the other services to where we are today, which is, in my view, it is so complex and so central to everything we do that it's time for us to have a Cyber Force. Just like we have an Army, a Navy, an Air Force and a Marine Corps, I think it's time for a Cyber Force. So we've come from punch cards and BASIC as a language and COBOL as a language to a need to create a separate branch of the armed forces because of the inherent complexities of cybersecurity. 

Dave Bittner: Hm. So where do you suppose we find ourselves today? Taking the temperature of how things are in the DOD and the government sector, in your estimation, where do we stand? 

James Stavridis: I'll give you good news and bad news, and I'm going to start with the bad news. The bad news is in cyber and cybersecurity nationally, we find the greatest mismatch between level of threat and level of preparation. In other words, we worry a lot about Russia, China, Afghanistan, Islamic State, piracy. Those are serious threats - high level of threat - but our level of preparation to deal with them is quite high. 

James Stavridis: In cyber, the level of threat is expanding unbelievably rapidly because the threat surface is expanding. Today there are 25 billion devices connected to the internet of things. By mid-decade, it'll be 50 billion. That's great. I can get out my iPhone and open my garage door from San Francisco. The bad news is the threat surface is huge. And we are not moving as rapidly as we should, and offense is outpacing defense in my view. So I'm concerned. That's the bad news. 

James Stavridis: Here's the good news in the Department of Defense. There's growing awareness. There's growing expertise. We are moving toward the idea of a Cyber Force. And most recently - and this will sound a little wonky, but it's really important - the Department of Defense is releasing something called the Cybersecurity Maturity Model Certification - kind of a mouthful - CMMC. What it is - think of it like karate. It's a series of belts that you have to attain if you're going to do business with the government. So Level 1 is very basic. Think of it as a white belt - you've got to know what a phishing attack is; you've got to have a basic resiliency plan; you have to be able to coherently reconstitute data. The levels go up to Level 5. If you want to do serious business with the government, you've got to be a Level 5. That means we're going to force standards on - glad you're sitting down - 300,000 companies who do business with the Department of Defense. That's called the defense industrial base. It's an unregulated zone in terms of cyber. The department is about to regulate it. It's a profoundly good initiative. 

Dave Bittner: How do you suppose that transition is going to play out? And how long's it going to take? 

James Stavridis: It's starting almost immediately. By early summer, if you want to participate in a request for information, so-called RFI, you have to have the basics put together. By October, if you want to be in an RFP, which is pretty serious, a request for proposal - that's where you're actually presenting a bid, if you will, to the government - you have to have obtained the appropriate level for your organization and its size. 

James Stavridis: And so let me give you an example involving a company that I'm working with called PreVeil, which does end-to-end encryption. If you want to do business with the government, you're going to have to demonstrate to the government that you can move emails and file attachments that can't be attacked in the server system, which is, of course, what happens now with Gmail or any other broad-area messaging or email service. 

James Stavridis: So as we get into this, it's going to happen fast. Companies are going to need solutions quickly. And by the way, Dave, I'll close on this. It has to be not self-certification. It has to be certified by an outside observer, and that outside observer has to be certified by the Department of Defense. So this is a big change, a big system. There are going to be fits and starts in this. There'll be discontinuities. But it's a move in the right direction. 

Dave Bittner: As we look towards the horizon, looking down the road, where do you think things are headed with all of this? As the various branches of the military continue to evolve, cyber becomes more ingrained in everything that we're doing, what sort of changes do you see are going to be necessary? 

James Stavridis: I mentioned one already. That's the creation of a cyber force. Let's sketch that. I think it would be 15,000 people, roughly. It would be, obviously, very high-tech, very highly trained. It'll have to have a series of incentives that'll encourage people not to take an $800,000-a-year job in Silicon Valley and come work in the government for $150,000. Service will be part of that. But we're going to need that cyber service. It'll be between 10,000 and 20,000 people and a budget of $10 billion, maybe $15 billion. Sounds like a lot, except we have today in the Department of Defense almost 2 million people and a budget of $780 billion. So this is, in my view, a very prudent investment. So, No. 1, a cyber force. 

James Stavridis: I think No. 2 - this is slightly technical, but it's a command and control issue that I think is important. I think it's time to separate the National Security Agency, the NSA, and U.S. Cyber Command. Currently, those two are not exactly merged, but they are both controlled and commanded by a single individual, currently General Paul Nakasone - before him, Admiral Mike Rogers; before him, General Keith Alexander. One military four-star officer who runs both the NSA and the U.S. Cyber Command - that's a mistake. The span of control is too big. NSA's mission is surveillance, essentially. It's an intelligence mission. It's a Title 50 organization, quoting the law. Whereas U.S. Cyber Command, obviously, is a war-fighting Title 10 military organization. Should be a four-star general or admiral commanding U.S. Cyber Command. The NSA should be commanded by a civilian - why? - because it touches the most fundamental issues of privacy and surveillance in our national life. My view - not appropriate to have a military officer in charge of that. Ought to be a lawyer or a tech expert or some combination, someone who's trained in both those disciplines. And it ought to be recognizing that that mission is critical to our national security but also has extreme sensitivities when applied, for example, here within the United States. So that's a second big change that's coming along. 

James Stavridis: And then third and finally, I haven't really talked about this at RSA, and I haven't heard a lot of people mentioning it, but I think the big change that's coming is the addition of quantum computing, which will be an entirely new level of cryptography. It will require fundamental changes both in how we process information and, more critically, in how we protect information. And we don't have time on this wonderful brief podcast to get into it. But, you know, chess, checkers, Go - quantum computing is an entirely new game. I can't even give you an analogy. It's an utterly different level. It moves us away from bits and ones and zeros to a world of infinite positions atomically and, therefore, induces entirely new ways to conduct everything from cryptography to all the mechanisms that we talk about here. 

James Stavridis: So there are three things that I think are kind of coming along, and I gave them to you in chronological order. I think we'll have a cyber force first. Then I think we'll end up splitting NSA and Cyber Command. Quantum computing is more end-of-the-decade than the first two things I mentioned. 

Dave Bittner: I have one more question for you. The world that you come from, which is a world of aircraft carriers, of fighter jets, of soldiers... 

James Stavridis: Tanks. 

Dave Bittner: ...Tanks - all of that hardware requires large investments - you know, the best people designing them, operating them. The soldiers we have trained are second to none. But that is all visible. That is all - you can look into the harbor and see an aircraft carrier, and there it is. And so in terms of expressing our nation's strength globally, those things are very easy to see. Cyber is different. And we're in this era where nations who perhaps wouldn't have gotten our attention before, for them to stand up a force in the cyber realm doesn't require - they don't have to build an aircraft carrier. They don't need the capabilities to build a jet fighter. 

James Stavridis: Correct. 

Dave Bittner: Do you have any insights on that disproportionality? 

James Stavridis: I think another way to phrase the question is, if you'll permit me, is, do we still need all that massive old-line, hyperexpensive equipment, or can we do all this with cyber? And unfortunately, I think we're going to continue to need some level of those legacy systems. 

James Stavridis: But here's the mistake people make. They tend to think of it as an on and off switch that only has two positions. Either, yeah, we just need all those big, beautiful aircraft carriers, or we're just going to do it all with cyber. 

James Stavridis: Think of it more like a rheostat - you know, like a dimmer in your dining room. You've got to move the needle. And I think the needle is moving away from those big, expensive legacy platforms and more toward the cyber, and there are a couple of reasons. One is it is less expensive. We need it to defend our systems. And, critically, our opponents are doing it. And so we may find ourselves in very contentious situations in the cyber world. We've got to be prepared for that. Aircraft carriers are not going to get you there. But there are going to be times when that aircraft carrier comes in pretty handy as well. You're going to need a bit of both. 

Dave Bittner: All right. Admiral, thank you so much for joining us. 

James Stavridis: What a pleasure. Thanks for doing it. 

Dave Bittner: Thank you. 

James Stavridis: All the best. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: First of all, I wanted to thank the admiral for joining us. It's always good for us and for the podcast to hear from somebody who's had such a great sphere of influence. And thank him for his service as well. 

Dave Bittner: Yeah, sure. 

Ben Yelin: I think it was particularly interesting for him to talk about how it would be preferable for a civilian to lead U.S. Cyber Command and the National Security Agency just because it's less of a, you know, tactical military position and more of a policy position. You have to take into consideration the balance of security and privacy. And you also have to be somebody who can work with different nation-states and, you know, other malicious actors in order to mitigate the effects of cyberattacks on our country. 

Dave Bittner: Yeah. 

Ben Yelin: So I thought that was a comment that stood out to me. I also loved his karate metaphor for how the government does business with or contracts with particular organizations based on their cybersecurity capabilities. You get a white belt if you comply with, you know, basic best practices. And if you use the full NIST framework, then you've reached a black belt, which, in real life, I have never been lucky enough to attain. 

Dave Bittner: (Laughter). 

Ben Yelin: So I thought that was particularly interesting as well. 

Dave Bittner: Yeah. Interesting to me that he's proposing this notion of a cyber force. That's not something that I've heard many people talk about. I mean, you know, famously, now we have a Space Force. 

Ben Yelin: Yup. 

Dave Bittner: Perhaps cyber is a better place for us to put our attention and our resources. I couldn't help thinking also that maybe if we do have a cyber force that their camouflage uniforms should be some of the stuff we talked about earlier in the show that hides them from facial recognition, right? 

Ben Yelin: Quite a theme for today's show. So the state of Maryland is actually considering a bill, currently in the state Senate, that would establish a cyber reserve within the state militia - basically a group of, for now, volunteers. But people with expertise would work under the command of the state militia to react to any cyber incident across the state. You know, this is something that has been tested out in other states on a pretty limited basis. But, you know, that's something I think we'll see more frequently, even if there's not a cyber force at the federal level. Bringing it under, you know, state militias, which have a unified command system, is something that I think we're going to see a lot more of going forward. 

Dave Bittner: It reminds me of how, historically, when there've been natural disasters, ham radio operators would step up and provide help with communications when communications were down. 

Ben Yelin: Oh, I love the ham radio people. 

Dave Bittner: Yeah. 

Ben Yelin: They're so - you know, we do in our organization a lot of emergency management exercises. And so we have to worry about redundancy in communications. And, you know, when all else fails, there are ham radio people out there you can bring in to local emergency departments who can provide that redundancy. 

Dave Bittner: Yep. 

Ben Yelin: They're always some of the nicest and best people you'll meet. 

Dave Bittner: Yeah, yeah. Agreed. And interesting idea that we could have a similar corps of volunteers who could help us in our times of need when it comes to cyberattacks. 

Ben Yelin: Right. And, you know, there are obviously a lot of logistics involved - how you would do credentialing for these people. But, you know, I think we are going to have to have some sort of unified response, whether it's at the state or federal level, to cyberthreats, you know, especially in the aftermath, where you need a strike team to come in and fix the problem in the case of, say, like, a ransomware attack or something like that. 

Dave Bittner: Yeah. Well, again, our heartfelt thanks to Admiral Stavridis for joining us. It was quite a treat to get to speak with him. That is our show. We want to thank all of you for listening. 

Dave Bittner: And, of course, we want to thank this week's sponsor, KnowBe4. You can go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost and half the time. Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.