Caveat 2.5.20
Ep 14 | 2.5.20

And the heat goes on.

Transcript

Mike Overly: You have an extremely complex set of systems that are all talking together - reservation systems, cargo systems, air traffic control, the extremely complex systems on the aircraft itself. And so all of these things can have vulnerabilities. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, Ben follows developments on the Clearview facial recognition story that The New York Times recently broke. I have a story about Amazon trying to have its cake and eat it, too, when it comes to product liability and, later in the show, my conversation with Mike Overly from Foley & Lardner. We're going to be talking about cybersecurity in aviation. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. And we'll be right back after a word from our sponsors. 

Dave Bittner: And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent the recurrence of a single, non-repeatable event. Others say it's a way the suits play CYA. Still others say it's whatever happens to reside in those binders the consultants left behind right before they presented their bill. How about a more positive approach? As KnowBe4 can tell you, better policy means better security, and getting the policies right is a big part of security; so is setting them up in ways that your people can actually follow. We'll hear later in the show how you might approach policy. 

Dave Bittner: And we are back. Ben, why don't you start things off for us? You got some follow up in a way. 

Ben Yelin: Sure. So we talked last week about this New York Times expose on Clearview AI, a company that scrapes facial data from social media sites - so facial images - and then allows law enforcement agencies - and they've contracted with over 600 of them - to match that data with photos either individuals have taken on the street or law enforcement has taken. And this is a tool that can be used both for law enforcement purposes, of course, and, you know, for more nefarious purposes if it gets into the hands of bad actors. 

Dave Bittner: Right. 

Ben Yelin: So this story came out a little more than a week ago, and already, there has been a lawsuit filed in federal court in the Northern District of Illinois. And the fact that it was filed in Illinois is interesting for a reason I will get to in a second. 

Dave Bittner: OK. 

Ben Yelin: So there are a number of claims brought up by this lawsuit. I should first mention that it's a class-action lawsuit, and I know we've talked about this on this podcast before. That means the named plaintiffs in this case were able to find over a hundred additional plaintiffs to make up part of a class that has suffered a similar injury with similar facts in terms of both law and particular circumstances. 

Dave Bittner: And I suppose that's not surprising in a case that involves scraping millions of people's images online (laughter). 

Ben Yelin: Exactly. All you need to do to certify a class in that case is get people to agree to be part of that class. 

Dave Bittner: I see. 

Ben Yelin: Yeah. The potential universe of class members is almost limitless when we're talking about this type of scraping. So they filed the lawsuit in Illinois, and their claims are based on both the United States Constitution and a particular law in the state of Illinois that was just passed recently. So the state of Illinois has this relatively new statute called the Illinois Biometric Information Privacy Act. The intent of that act is to safeguard residents from having their biometric data used without their consent. So non-biometric personally identifiable information can be changed, can be concealed. 

Ben Yelin: What makes biometric data unique is that it can't be changed, and that presents some sort of unique risk for the user. So the state of Illinois was proactive on this issue. They set, you know, sort of a new nationwide standard and basically said that a user had to consent before their biometric data was used by a third party. And if a third party used that biometric data without their consent, there would be financial penalties to that third party - criminal penalties. This lawsuit alleges violations of that statute, which is, I think, a valid claim and a claim that, in my opinion, has a pretty substantial likelihood of succeeding. 

Ben Yelin: Then they also throw in a bunch of constitutional claims. So they claim it violates the right of contract - that's part of our federal constitution - because users, you know, when they agree to the terms of service of these social media companies, aren't agreeing for their personal images to be scraped as part of a scheme to capture their biometric data. There's a First Amendment claim and a Fourth Amendment claim against unreasonable searches and seizures, as well as a 14th Amendment claim about a violation of due process rights. They threw these constitutional claims into the complaint, in my view, because they wanted this case to be broader than just the plaintiffs in the state of Illinois. 

Ben Yelin: So only the Illinois plaintiffs have a cause of action as it relates to this Illinois BIPA law. But in order to have a broader class of plaintiffs, a national class of plaintiffs, you have to make these constitutional violation allegations. And having read through the complaint myself, those allegations seem to be rather haphazard and relatively weak, which leads me to believe that we might get some sort of resolution on this case where the constitutional claims are dismissed, there isn't some sort of broader nationwide injunction against Clearview AI, but perhaps there's an injunction within the state of Illinois for violating this BIPA statute. 

Dave Bittner: Now, could Clearview be on the hook for penalties? Could this hit them financially? 

Ben Yelin: Absolutely. So the plaintiffs here have a couple of prayers for relief, which is legalese for what we want out of this lawsuit. The first is an injunction to stop Clearview from collecting biometric data in the first place, to stop them from engaging in these scraping practices. The second prayer for relief is both monetary and punitive damages. So Clearview AI would have to compensate the class of plaintiffs for any actual injuries that they've caused. And on top of that, the plaintiffs want there to be punitive damages as a disincentive for Clearview AI to be involved in this type of scraping in the future. If they were to lose this lawsuit, they would certainly be subject to significant financial penalties. 

Ben Yelin: But, of course, you know, the amount of financial penalties would be determined, you know, at a much later date in this legal proceeding, once a court has a better idea of the actual injuries that have been suffered by the plaintiffs, and the damages would depend on which of the claims are successful. Clearview AI would be on the hook for a much larger amount of damages if these constitutional claims were successful, which, as I said, I don't think they're going to be. The claim under the Illinois statute might actually lead to lesser damages, but it does have a higher likelihood of succeeding in court. 

Dave Bittner: Now, how does something like this roll through the system? And I'm wondering specifically is this the sort of thing that a judge would say to Clearview, hey, while this is working its way through the system, knock it off. You have to stop scraping this. Or would they still be able to go business as usual while it was making its way through? 

Ben Yelin: So that's a great question. In order for the court to stop Clearview AI from engaging in these practices in the short term, there would have to be what's called a preliminary injunction where Clearview AI is demanded to cease engaging in this behavior until the lawsuit reaches some sort of resolution or conclusion. I do not see that as a likelihood in this case. In fact, the plaintiffs in this case aren't arguing for any sort of preliminary injunction. Preliminary injunctions are very rarely granted. You have to not only allege that you're going to be successful on the merits of the case but that in the interim while the case is being litigated, you would suffer some sort of irreparable harm. And that's very, very difficult to allege in most circumstances. So, yes, in the meantime, while we wait for a resolution of this case as it makes its way through the court system, Clearview AI is under no obligation to stop doing what it's doing. And the legal system, as we know, can be very slow. 

Ben Yelin: So even if this case in the district court is successful, Clearview AI would certainly appeal it to the Federal Court of Appeals. You know, they might have a three-judge panel hear the case, you know, and if that doesn't please one of the parties, the parties can demand that the case be reheard by the full panel. And this is a process that could take years, you know. And that's just sort of the pitfall of our legal system. And unless you get that preliminary injunction for which there is a very high bar, this alleged bad activity on the part of Clearview AI, which the allegations are they're violating not only Illinois law but people's constitutional rights, is allowed to continue. 

Dave Bittner: Now, I know when it comes to class-action suits that - I don't know - it's almost a joke that the only people who get rich are the lawyers, right? 

Ben Yelin: Yes. 

Dave Bittner: But I suppose there's also, yes, the lawyers might get rich, not the plaintiffs, but it could be a serious effect on Clearview AI's bottom line. 

Ben Yelin: It is. And, you know, without engaging in a class-action lawsuit, no individual plaintiff would get any sort of relief in these circumstances. Your average Joe Schmoe plaintiff just does not have the necessary resources to engage in a large-scale lawsuit against a company of this magnitude. The lawyers they would have to hire, the filing fees, it would just be prohibitively difficult. And Clearview AI probably wouldn't take the case seriously. You know, if you or I sued them, they could probably bully us out of court and offer us a settlement so that we wouldn't continue with a lawsuit. That becomes much more difficult when you have a large class of plaintiffs. And that's why you see these class-action suits against large companies like Clearview AI. 

Dave Bittner: Yeah. All right. Well, this is one that continues to develop. My story this week comes from The Verge, an article by Colin Lecher, and it's titled "How Amazon Escapes Liability for the Riskiest Products on Its Site." This fascinates me, so I want to come at this from a couple different directions. First of all, let's say I go to Home Depot and I buy myself a heater to warm up my basement and that heater malfunctions and burns my house down. And I decide I want to go after people for making a faulty heater, right? I guess I could go after the manufacturer... 

Ben Yelin: Yes. 

Dave Bittner: ...But I could also go after Home Depot, yes? 

Ben Yelin: Absolutely, yes. As the seller, they would be liable in those circumstances for selling you that faulty heater. 

Dave Bittner: Now, what Amazon is doing here is saying that they have a part of their store that they call Amazon Marketplace, which Amazon says is basically just a pass-through. In other words, they're facilitating individual private sellers to connect with Amazon's audience of buyers. But Amazon really doesn't have anything to do with this other than making that connection. Sure, they're handling the money, (laughter) right? 

Ben Yelin: They're facilitating the entire transaction. 

Dave Bittner: Right, right, yeah, attracting people to the site, you know, running the algorithms to put the products in front of people. We're being a little silly here, but this gets to this thing that comes up time and time again with these big online providers, which is what are you? Are you a platform or are you, in Amazon's case, a store? Help me understand what's going on here and the case that Amazon's trying to make. 

Ben Yelin: So Amazon - and it has been very successful in making this argument - has argued in court cases across the country that they are simply a passive intermediary. 

Dave Bittner: OK. 

Ben Yelin: You have a seller and a buyer who engage in this public marketplace, and Amazon facilitates the Marketplace, but Amazon, the way they argue it, has no active role. They're not selling the product. They make no claims that they have vetted the safety of the product, that they've really done anything to prepare that product to go to market. And as I said, this defense has been largely successful in court cases across the country. That's starting to change because as legal scholars and certain courts look at particular transactions, third-party transactions on Amazon, they're realizing that Amazon does actually play a larger role than simply an intermediary who is facilitating this transaction. So in other words - and this article makes an apt analogy - they're not simply Craigslist. They're not just a place for people to connect, you know? 

Dave Bittner: Right. I was thinking in the real world like a flea market, you know? 

Ben Yelin: That's a great example. 

Dave Bittner: Yeah. 

Ben Yelin: So is the person who organizes a flea market going to be liable when the scarf that you buy actually has some mechanism that strangles you... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Which I now have a great idea for a horror movie. 

Dave Bittner: Right, exactly (laughter). 

Ben Yelin: So they're not. They are that intermediary. And this is what Amazon has tried to argue. Now, as you mention, in the real world, they're not simply an intermediary. Not only are they facilitating the sale, as you said, their algorithms are helping customers find these third-party items. These products are advertised on Amazon's Marketplace. For those of us who shop on Amazon, it's frequently unclear whether the product you're purchasing is a third-party product or an Amazon product. 

Dave Bittner: Right. Right. And I think that's an important point because if anybody goes to Amazon and says, you know, I want to buy a charging cable for my iPhone, well, it's going to list a bunch of them. Some of them probably come directly from Amazon's warehouses. Some of them may come directly from some seller in China. 

Ben Yelin: Right. And they're all on that same list. 

Dave Bittner: Right. 

Ben Yelin: So you see a list of 20 items. How many of us are going to read closely and see, you know, which ones come from third-party sellers and which ones come from Amazon directly? And should legal liability depend on whether people actually discover the origins of that product? 

Dave Bittner: Right. 

Ben Yelin: What scholars - and this article mentions a group of scholars - are trying to argue is because of the unique aspects of an Amazon search where you get this list, the third-party products are part of this integrated marketplace, the marketplace sort of coexists in the universe where Amazon is conducting sales of its own products... 

Dave Bittner: Right. 

Ben Yelin: ...We should have a standard developed where Amazon can be liable the way any other retailer can be liable. And obviously this would expose Amazon to liability. They would pass on the cost of that liability to their consumers, which would be the negative consequence for people who aren't Jeff Bezos. 

Dave Bittner: And perhaps not carry as many things in the store as well, right? I mean, they could become more restrictive because they have to. We're not going to carry these things that are going to burn your house down anymore, right (laughter)? 

Ben Yelin: Right. Exactly. Now, you know, there was an allegation in this article that a study was done that something like 4,000 banned, unsafe and mislabeled products were on the company's platform I think in the past year. That's a lot. So if you changed the legal standard to allow Amazon to be liable not just as an intermediary but as a seller in these circumstances, the idea is that would give them an incentive to make sure that the products that they bring to market are safe. That's why we have products liability in general. 

Dave Bittner: Yeah. 

Ben Yelin: We want retailers to make sure that they're not selling faulty products. And the way we do that is to incentivize them by potentially making them liable. Because of the scale of Amazon, we're talking about billions and billions of products. And if there is a process where third-party products have to be vetted and Amazon has to hire additional resources to vet those products, vet them for safety, compatibility, et cetera, that's going to be an additional cost to them. And as I said, that cost is passed down to the consumers. That's what Amazon almost certainly will argue in future court proceedings. They wouldn't be able to facilitate this market if there were a threat of liability. 

Dave Bittner: Right. But I can imagine, though, like, if I'm Home Depot - going back to our earlier example, if I'm Home Depot, I would be saying, yes, absolutely, Amazon, you know, it's unfair to me trying to compete with Amazon that I have to vet my devices, that I have to pay for insurance for the potential liability of these devices that I carry, and Amazon does not. 

Ben Yelin: Yeah. Now, a lot of people in the past several years on a variety of topics have made the argument that such and such is not fair. Amazon has an unfair advantage. 

Dave Bittner: Yeah. 

Ben Yelin: And they almost always do, you know. So local retailers are subject to sales tax. That created a big controversy. 

Dave Bittner: Yep. 

Ben Yelin: Amazon would headquarter itself in states with no sales tax. That would give them a competitive advantage. And that was unfair to brick-and-mortar retailers. That's sort of the same thing here. Home Depot is not going to be able to make a credible argument that they're simply a pass-through, that they're simply facilitating the sale between the customer and Black & Decker. 

Dave Bittner: Right, right. 

Ben Yelin: I mean, that would be laughed out of court. It's not up to us to say whether it's an unfair competitive advantage, but we can say that it is a competitive advantage. What they would argue is they're adding value to the marketplace for both the retailers, the third-party retailers, and the consumers because without them, these retailers would never reach the consumers. So Home Depot has all sorts of ways of reaching us to sell us their products. 

Dave Bittner: Right. And that weird little part for which there really is no broad market but that I need to fix my dishwasher - Home Depot would have no reason to stock that in their store. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: And because of scale, Amazon will stock these products. If one person needs it, there is an incentive for that seller and the buyer to get together on Amazon's website. 

Dave Bittner: Do you have any sense that we're starting to chip away at this notion of these online giants of hiding behind or - maybe that's too judgmental a term but making use of this notion of being a platform. Are we starting to see some erosion of that? 

Ben Yelin: I think so. I mean, this is not only a question as it relates to Amazon. I mean, we've talked about Section 230, the Communications Decency Act, where these social media platforms argue that they shouldn't be liable because they're just the platform. They're just the venue. And there's been pushback for it, particularly because the consequences of treating these entities as simple platforms means that they can get away with metaphorical murder. They can post false information on their website and not be held liable. They can potentially cause a lot of physical, tangible harm to somebody and wouldn't be liable simply because they're just the platform. 

Dave Bittner: Right. 

Ben Yelin: And we've seen pushback to that among the general public, among legal scholars and policymakers because I think we've started to realize the power that these big tech companies have. You know, they may call themselves intermediaries, but when Amazon is controlling the entire process of the sale, they're using their own algorithms. You know, they're using their own Buy Box to facilitate the sale. They really do play a large role. And so it's not fair to the consumer or, you know, necessarily to the third-party seller to have Amazon just sort of sit back and say, hey, we're just the flea market. 

Dave Bittner: Right. 

Ben Yelin: So, yeah, I do think there is a broad pushback against that in a bunch of different domains. 

Dave Bittner: All right. Well, we will have links to these stories in the show notes. It is time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: This week, our Listener on the Line actually wrote in to us. This is a listener named Brian (ph). And he wrote us and he said, hey, all, curious about personal device privacy - if a user logs in to their Office 365 or Gmail account on their phone, is their device then subject to company monitoring and police seizure if somehow the situation comes up? Do they lose all privacy on their own device just because they access work data on it? Thanks very much, Brian. Ben, what do we got here? 

Ben Yelin: So good question, Brian. We sort of did the reverse corollary to this question on a previous episode talking about bring your own devices where it was a device that you used for work purposes and whether you retained personal privacy in that. This question is sort of closely related to that but a little bit different. 

Dave Bittner: Yeah. 

Ben Yelin: The upshot is, generally, as it relates to work-related email accounts, like Office 365 or a Google account, your employer does exercise a certain level of control. So, for example, you know, when you enter an agreement with your employer and you're, you know, probably signing their computer use policies, you're often consenting for their admin to remove corporate data from the device, you know, assuming that that removal of data doesn't affect any personal information. They also in some circumstances can institute a factory reset of the device. In order for that to happen, the user would have to consent to their employer taking that action. But as we've talked about in a number of other circumstances, it's not always apparent to the employee, you know, that that's a threat that's coming from their employer. 

Dave Bittner: So the employer has access to the things that have to do with work - my business emails, those sorts of things and documents - but they would not have a right to my Angry Birds high scores. 

Ben Yelin: They do not... 

Dave Bittner: OK (laughter). 

Ben Yelin: ...If that is your personal device. 

Dave Bittner: Yeah. 

Ben Yelin: If it is a work-issued device, then it becomes a different story. And... 

Dave Bittner: I see. Because they own it. 

Ben Yelin: They own that device. But in the circumstances for this question, we're talking about somebody's own personal device, where they're merely accessing a work-related server. And in that case, the employer still retains control over the server, but they don't have any access to the rest of that user's applications. 

Dave Bittner: Yeah. And I have seen situations where, for example, you lose the device; work has the ability to basically wipe the device remotely, you know, so that if someone finds it or someone stole it, they can't get any of the data on the device. 

Ben Yelin: Right. So that's sort of the nuclear option here. That's the worst thing that could happen to a user. That's very unlikely. I mean, it's just a circumstance that you don't see very often. But again, that's going to be part of whatever the computer use policy is between the user and their employer. 

Dave Bittner: Yeah. 

Ben Yelin: So I would say, you know, if this is something you're potentially worried about, do what most people do not do and read your organization or your company's computer use policy very carefully. 

Dave Bittner: (Laughter) Right. Right. All right. Well, thank you, Brian, for sending in that question. We would love to hear from you. Our "Caveat" call-in number is 410-618-3720. You can call and leave us a message there. You can also send us an audio file. That's at caveat@thecyberwire.com. Coming up next, my interview with Mike Overly. He's from Foley & Lardner. We're going to be talking about cybersecurity in aviation. 

Dave Bittner: But first, a word from our sponsors. And now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound but that they're also distributed, posted and implemented. That's where the policy management module of their KCM platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that, KCM does the job at half the cost in half the time. It's your policy after all; implement it in a user-friendly, frictionless way. 

Dave Bittner: And we're back. Ben, I recently had the pleasure of speaking with Mike Overly. He is an attorney with Foley & Lardner, and he has a specialty in cybersecurity when it comes to aviation. Really interesting conversation here with Mike Overly. 

Mike Overly: There have been a couple of GAO reports which may have driven your request for this discussion today - one in April 2015 and the other in July 2018. And to say that they describe something of a dire situation is not necessarily an overstatement. Both are calling for a comprehensive approach to cybersecurity in the aviation industry. The 2018 report is literally saying urgent action is required, that there are significant security weaknesses that need to be resolved. So there is definitely knowledge of a problem in the industry. The issue is that the airline industry is extraordinarily complex. 

Mike Overly: So for example, who's responsible for information security? Well, that's a great question. Is it the airlines? Is it air traffic control? Is it the cargo systems? Is it the reservations systems? There are so many possibilities and so many places where security issues may arise. And so you have an extremely complex set of systems that are all talking together - reservation systems, cargo systems, air traffic control, the extremely complex systems on the aircraft itself. And so all of these things can have vulnerabilities. 

Mike Overly: An easy example is the air traffic management system itself, which is actually quite old. And so there are calls for this to be significantly upgraded. But you can imagine that, first of all, that's a complex task and, second of all, it's not done overnight. So I think a lot of this sort of vacillation that we see - where we have these reports coming out year upon year and we're not seeing a ton done - is because of the challenges involved. And I think that no one wants to, for example, rush in and make a change and, by making that change, end up creating additional vulnerabilities that no one had originally contemplated. 

Dave Bittner: It's interesting to me because certainly, in this post-9/11 world, there's been no shortage of security focus on the airlines. And so I'm wondering, is this a matter of folks kicking the can down the road or pointing fingers at each other? How does it keep getting pushed back year after year when people know there's an issue? 

Mike Overly: I think that everyone is making a good faith attempt to resolve the problems. But even the issue itself - the sort of, what is it that we're worried about? And so the first thing that likely comes to your mind is some kind of cyberattack on an aircraft in flight causing damage to individuals and property. But there are many other possibilities. For example, we've had Cathay Pacific and British Airways, both of which have been hit with massive privacy compromises, where Cathay in 2018 had 9.4 million customer records compromised; British Airways had close to 400,000 reservation records compromised. So there are the privacy aspects of travel. 

Mike Overly: There are trade secrets, and you would almost never come up with that as a possible risk. But it has been shown, for example, in aviation, that by the time a new aircraft, whether military - and I emphasize this - whether military or commercial, rolls down the runway for the very first time, the complete plans and specifications for that aircraft have likely already been stolen and are already being used by foreign governments to create their own aircraft. And so there's the problem with compromises of security regarding the technology that goes into these aircraft. We had a situation where a Polish airline was grounded for, of all things, a denial-of-service attack, and its flights couldn't take off. So there are many variations here. Of course, we all live in fear of someone potentially taking control or disrupting the control of an aircraft in flight. But there are privacy issues. There are the trade secret and intellectual property issues. There are the possibilities of simply grounding aircraft. 

Mike Overly: And then you have the somewhat bizarre problem that has made the headlines recently of drones, which almost anyone can purchase and fly and directly interfere with an aircraft without ever affecting the systems, if you will, of that aircraft. In other words, you don't have to be a super-hacker to potentially interfere with air travel; you just need a few hundred dollars and some moderate skills in flying a drone, and you can do so. 

Dave Bittner: Can you help me understand the situation when it comes to domestic versus international air flight? Does the U.S. FAA pretty much lead the way when it comes to global air traffic? What do we need to understand when it comes to that? 

Mike Overly: Well, it's a great question. I mean, if you think about it, all too often we're very U.S.-centric. What does the U.S. need to do to get better at cybersecurity in the aviation industry? But the truth of the matter is - look at the volume of traffic in any given day coming into the United States with foreign aircraft and with aircraft which have been serviced and are subject to control with offshore resources. And then, of course, you have our own aircraft moving abroad and through these other airports and through these other systems. 

Mike Overly: And so talk about complexity - I mentioned earlier just how complex it is if we're just looking at the United States. Now we multiply that by having many other jurisdictions involved. I think if we were to look at it, the U.S. is certainly, by no means, you know, sort of bringing up the rear as far as, you know, efforts to try and get better at information security and cybersecurity with regard to aviation, but I don't think we're at the top, either. I think that there are other countries that are far more skilled in this area. There are countries that have just traditionally been more security-centric - for example, Israel and other areas where, you know, they've had these attacks going on for many, many years, and they are much more careful about control of baggage, for example. 

Mike Overly: I mean, that's another problem, which is, you know, we all worry about the potential for an explosive device being brought onto a plane. And the reality is that there are scanners available that do an OK job of identifying these things, but do we have sort of end-to-end protections against people moving an explosive device either through baggage or directly onto the plane with a passenger? Do we have a good 100% solution there? No. Does that technology exist, and can we deploy it? Likely, but it's going to take a while to get that at every single airport, and it's going to cost billions of dollars to deploy, which means we're going to be spending a lot of time before this is really rolled out. 

Dave Bittner: You know, I think up until recently, when having conversations about aviation, folks I would talk to, you know, we'd often sort of jokingly say, well, you know, it's not like airplanes are falling out of the sky. But I think the situation that we've seen with Boeing, with the 737s - which granted is a software issue but still, you know, related to all of this, I think - we've had some airplanes falling out of the sky. First of all, do you think that's a fair assessment, that a software issue should be grouped in with these sorts of things? 

Mike Overly: Absolutely. And I think it's a great point. If you look at Carnegie Mellon - the CERT organization that they've formed for information security, cybersecurity - they estimate that, on average, for every thousand lines of software that's written, lines of code, there can be anywhere from one to 10 bugs. And each of those bugs might give rise to a vulnerability that could be exploited by a hacker, or it simply could be a situation where it causes a malfunction of the code. So that's a thousand lines of code. And now let's think about how many lines of code are actually in one of today's more complex planes, and we're talking about potentially hundreds of thousands of lines of code. 

Mike Overly: So, as was demonstrated by the recent problems, you've got software that is highly complex, which by definition is going to have bugs in it that could cause potential failures in flight, as we've seen, or create vulnerabilities where a third party could access and gain control of the plane or cause some other changes in the way the plane is acting. And so you have all of these things coming together. But yes, I think every security expert would say that one of the key elements here is the complexity of the software involved and the fact that it can have bugs; in fact, it almost certainly does have bugs and, in fact, potentially hundreds if not thousands of bugs. 
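To put rough numbers on that estimate - assuming, hypothetically, an avionics codebase of 100,000 lines and the one-to-10-bugs-per-thousand-lines range cited above - the back-of-the-envelope arithmetic works out to:

$$100{,}000 \text{ lines} \times \frac{1 \text{ to } 10 \text{ bugs}}{1{,}000 \text{ lines}} = 100 \text{ to } 1{,}000 \text{ potential bugs}$$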

Dave Bittner: So in your estimation, where is a good place to begin? With a system this complex, is there such a thing as low-hanging fruit? 

Mike Overly: Well, I think there is low-hanging fruit. And, you know, some of the simplest things are better control over boarding systems and networks. I think that many people are pointing out, you know, having better review of passengers before they're granted access and rolling out as quickly as possible technology which exists today and which gives us better control over and insight into whether we're putting harmful devices on a plane and in its cargo hold. I think those, everyone would agree, are very straightforward fixes. 

Mike Overly: I think the other thing is that we really need to look at, you know, when you're talking about combining many, many different systems on a plane, how is all of that checked for security vulnerabilities? And there are certain standards that one can check against - NIST has them; MITRE has them - that, you know, are sort of lists of common vulnerabilities. And there should be requirements that every single one of these systems be checked for those common vulnerabilities before they're ever deployed in an aircraft. And so there are some sort of low-hanging fruit, as you describe it, that I think can be picked and can materially reduce risk. 

Dave Bittner: Are we in a situation where you see us getting caught up on this? Are you hopeful? 

Mike Overly: I'm very hopeful because, as they say, it's not as though we have a problem and everyone's throwing up their hands. You have the Department of Homeland Security, which has said that - in 2017, they launched, or at least discussed, an aviation cyber initiative where they're going to look at both civil and military aviation and try and come up with solutions to common problems. And I think that's a great effort. And so we've got the GAO on top of this, putting out reports that are not just sort of generic in nature but very specific as to the types of things that need to be remedied. 

Mike Overly: You've got DHS coming up with their own suggestions in this regard, and you've got the airlines. I mean, the last thing anyone wants is a problem and, certainly, nothing occurring during midair travel. And so I think everyone's extremely cautious in this area. And so I think everyone's motivated to do the right thing. So I'm actually quite optimistic. I just want to see better coordination - I think that might be the key to getting this moving - so that we do have air traffic control working with airlines, working with cargo companies, working with reservation systems, so that all of this, you know, is addressed in a uniform fashion and we don't have 10 different approaches to information security, but there's some unification. 

Dave Bittner: All right, interesting stuff. Ben, what do you make of all this? 

Ben Yelin: Well, I just Googled cost of train ride to California. 

(LAUGHTER) 

Ben Yelin: Apparently, it's relatively expensive and would take me, you know, more than three days. So I'm going to have to bear it on my trip home. 

Dave Bittner: I see, OK (laughter). 

Ben Yelin: So yeah, there were - I appreciated him giving us some hopeful messages at the end there. It seems like the entities involved here, the FAA and the airlines, are aware of the magnitude of the problem. But, you know, I think because of the potential consequences of either faulty software or cyberattacks on our airlines, there's just so much risk involved, and it's kind of, frankly, a little bit terrifying to think about. 

Dave Bittner: Yeah. I wonder, though, how much of it is just that general fear of flying that I think is so easy to latch on to. You know, there are - despite all of this, there are thousands of flights every day. And even with what we said - despite the problems that Boeing's had with the 737s, which are grounded right now... 

Ben Yelin: Right. 

Dave Bittner: ...For those problems. 

Ben Yelin: The 737 Maxes, yeah. 

Dave Bittner: For the Maxes, yeah. I mean, travelling by air is extraordinarily safe. 

Ben Yelin: It is, yeah. One of my favorite Onion articles of all time was - it was something like, airline passengers who were 100 times more likely to die in car accidents die in fiery airline crash or something like that. 

Dave Bittner: (Laughter). 

Ben Yelin: So yeah, I mean, I think we're naturally more fearful about it... 

Dave Bittner: Right. 

Ben Yelin: ...Because having a giant orb suspended in the sky... 

Dave Bittner: Right. 

Ben Yelin: ...Just seems like something that should be more dangerous to us. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, one thing that his interview I think made clear - and it's something people always remember with air travel - is it's just a very tightly controlled process. There are greater barriers to entry for passengers than for any other type of transportation. We have no-fly lists. We have TSA security. Air traffic control itself is tightly regulated. You know, the FAA is very good at what it does, and that's reflected in the statistics. We do have far fewer airline-related fatalities than almost any other type of transportation system, especially fewer than automobile travel. 

Dave Bittner: Yeah. 

Ben Yelin: You know, one thing he mentioned is airplanes have hundreds of thousands of lines of complex code, and that just introduces a whole new level of vulnerability. And because of the capability of cybercriminals - you know, particularly when we're talking about something like denial-of-service attacks, which he referenced here - the risk is so high that even if the likelihood is low, I think it gives us a reason for valid concern. And, you know, I think it's appropriate that the GAO in both of these reports, particularly in 2018, has tried to press upon us that this is an extremely urgent matter, and if we don't take an interdisciplinary approach or an interagency approach to help fix this problem, the consequences could be dire. And after 9/11, I think we've all taken these types of warnings from government agencies more seriously, and that's reflected in air travel as we know it today. 

Dave Bittner: Yeah, I'll share a personal story that really didn't have any direct safety issues. But I spent two hours sitting on the tarmac on a plane because we could not take off because the pilot could not download his flight plan from his iPad. 

Ben Yelin: Wow. 

Dave Bittner: Yeah. And it was all because of slow internet. 

Ben Yelin: (Laughter). 

Dave Bittner: And the pilot, you know, to his credit, kept coming on the intercom and saying, ladies and gentlemen, we're doing our best (laughter). He was... 

Ben Yelin: We're at 74% loaded now, yeah. 

Dave Bittner: I mean, he was just as frustrated as everyone. But I think what it speaks to is the interdependency of these systems. For whatever reason, the internet was slow that day, and because of the rules, the good rules, that are there to help keep us all safe, we had a plane with a couple of hundred people who couldn't get to where they were going on time because we're waiting on a dial-up connection or something, right? (Laughter). 

Ben Yelin: I mean, first of all, I had no idea that that's how that actually works. So let's hope that the Wi-Fi is better at your local airport. 

Dave Bittner: Right (laughter). 

Ben Yelin: But yeah, I mean, one thing that - as you say, one thing that it does bring home for us is how tightly managed and controlled the entire process is. I mean, if I hear a weird sound in my car, I'm still going to drive to work. There could be a clanking... 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: ...And, you know, the car could be careening in all different directions. 

Dave Bittner: Right. 

Ben Yelin: And I'm like, eh, it's probably fine. 

Dave Bittner: Yeah, just put a piece of black electrical tape over that check engine light. It'll be fine. (Laughter). 

Ben Yelin: Yeah, that does not fly, you know, for that - and not to use a terrible pun, but that does not fly for aviation... 

Dave Bittner: Right. 

Ben Yelin: ...Just because of the risks involved. 

Dave Bittner: Right. 

Ben Yelin: So, you know, I've been delayed on flights because the seat isn't properly secured to the armrest or something like that. And it's frustrating, but it's because, you know, we understand the scope of the risk. 

Dave Bittner: Yeah. 

Ben Yelin: And we realize that not only are you talking about a vehicle that carries hundreds and hundreds of people, but it also can potentially be used as a missile. 

Dave Bittner: Right. 

Ben Yelin: And now we, you know, know that it can be used as a cyberweapon. So that's why I think everything is so tightly controlled. And even though it's very frustrating, I think it's almost - we almost feel better when we hear that we're not taking off unless all the I's have been dotted and the T's have been crossed. 

Dave Bittner: Yeah. Yeah, we have a good understanding. All right. Well, again, thanks to Mike Overly for joining us. He is from the law firm Foley & Lardner. Really appreciate him spending the time with us and helping illuminate some of these issues. That is our show. We want to thank all of you for listening. 

Dave Bittner: And of course, we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.