What's next in the new era of Cyber Disclosures?
Paul Innella: Activity is not achievement. What we are doing is not how well we are doing. And that's what has to change. And I think if you can drive a framework around making that change happen, all of these problems will start to go away.
Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today, Ben shares a court decision on LinkedIn data scraping. I've got the story of the Ukrainian military using facial recognition software. And later in the show, my conversation with Paul Innella. He's CEO of a company called TDI. We're discussing risks to members of boards of directors.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, we got some good stories to cover this week. Why don't you start things off for us?
Ben Yelin: So this story goes back quite a ways, to the early days of this podcast. I got the article from a news source called CyberScoop.
Dave Bittner: Yep.
Ben Yelin: And the article is by Tonya Riley. So we covered this case probably back in 2020. It is hiQ Laboratories versus LinkedIn. If you'll recall, hiQ is an organization that scrapes data from publicly available LinkedIn profiles. So they make a list for organizations of what are called keepers - people who they think are at risk of being hired away for better jobs. So it's kind of intel for an organization. They scan a LinkedIn profile - so-and-so's very impressive; maybe they're underpaid based on their academic credentials or work history. So they do that type of data analytics for organizations, based off publicly available LinkedIn profiles.
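For listeners who want a concrete picture of what "scraping publicly available profiles" means mechanically, here is a minimal, hypothetical sketch. It parses a small static HTML snippet standing in for a public profile page - the SAMPLE_PROFILE_HTML snippet and ProfileParser class are invented for illustration, and no real site, account, or API is involved:

```python
# Hypothetical illustration of "scraping" publicly visible profile data.
# Parses a static HTML snippet -- no real site, account, or API involved.
from html.parser import HTMLParser

SAMPLE_PROFILE_HTML = """
<div class="profile">
  <span class="name">Jane Doe</span>
  <span class="title">Senior Data Engineer</span>
</div>
"""

class ProfileParser(HTMLParser):
    """Collects the text of elements whose class is 'name' or 'title'."""
    def __init__(self):
        super().__init__()
        self._field = None   # which field we're currently inside, if any
        self.record = {}     # scraped key/value pairs

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in ("name", "title"):
            self._field = cls

    def handle_data(self, data):
        if self._field and data.strip():
            self.record[self._field] = data.strip()
            self._field = None

parser = ProfileParser()
parser.feed(SAMPLE_PROFILE_HTML)
print(parser.record)  # {'name': 'Jane Doe', 'title': 'Senior Data Engineer'}
```

A real scraper would fetch many pages over HTTP and aggregate the records, but the point is the same: nothing here involves logging in or bypassing any access control - the data is publicly visible, which is central to the legal question in this case.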
Dave Bittner: OK.
Ben Yelin: So LinkedIn was not happy about this. They sent a cease-and-desist letter back in 2019 to hiQ saying, you can't do this; this is stealing our proprietary data, and this puts our users' information at risk. In turn, hiQ sued LinkedIn for declaratory judgment, saying that this was OK. I'm not going to get into the complicated procedural history, except to say that eventually this made its way up to the Supreme Court. The Supreme Court kicked the case back down to the 9th Circuit Federal Court of Appeals, which is located on the West Coast, saying, guys, we just had a decision on the Computer Fraud and Abuse Act, the Van Buren case; please reconsider the facts of this case in light of what we decided in Van Buren.
Dave Bittner: Oh, I see.
Ben Yelin: And if you'll recall, in the Van Buren case, the court determined that it was not a CFAA violation to access part of a website or system you were authorized to use - maybe you're there for the wrong reason, but you had access to be there in the first place. That is not a violation of the Computer Fraud and Abuse Act. So in that...
Dave Bittner: So you've been previously authorized to look at something, and if you go and look at something, even if it's not the reason that you're originally authorized for, you're off the hook.
Ben Yelin: Exactly. That was a much better way of explaining it than my way.
Dave Bittner: (Laughter) Got your back.
Ben Yelin: The way the Supreme Court described it is sort of a gate-up, gate-down approach. So if a gate is up and it's something that's not publicly available but you hack into it - that was sort of the original target of the Computer Fraud and Abuse Act - that is a violation of the law. If you have access to it and you get into a database or a system for the wrong reason, even if you're using it to stalk your ex-girlfriend, that is not a violation of the CFAA under the Supreme Court's interpretation. So this case was kicked back down to the 9th Circuit, and the 9th Circuit panel decided that the Supreme Court's precedent in Van Buren meant that it had to decide the case in favor of hiQ Laboratories, with the thinking that no gate was up here. There were no restrictions on the information that hiQ was seeking to obtain because the data that they were scraping was publicly available. It was on profiles where a person hadn't altered the requisite settings to make their information private, for example. It was...
Dave Bittner: I see. So it's like putting up a billboard - if I put up a billboard that says, don't read this sign, I have no legal recourse against people for reading the sign (laughter).
Ben Yelin: Right. Whereas if somebody put a curtain over the billboard...
Dave Bittner: Right (laughter). On private property.
Ben Yelin: ...And you went and pulled down the curtain...
Dave Bittner: Right (laughter).
Ben Yelin: ...Yeah, that would be trespassing and vandalism. And that's really the approach - it's a silly metaphor. But that is the approach the Supreme Court is taking. They're relying on what they see as the legislative history behind the Computer Fraud and Abuse Act, which was about hacking. And it's more of a traditional understanding of being in a place where you have no right to be. It's being on somebody else's property, even when a gate was up. Now, where this gets complicated is what LinkedIn was trying to argue in their proceedings - is they wrote the cease-and-desist letter saying, you cannot do this. They eventually revised their terms of service, saying this is not available for data scraping.
Ben Yelin: So what LinkedIn is saying is that was the gate. We put up the gate. We put up a gate saying hiQ cannot obtain this data via scraping. We're not going to allow that practice on our service. What the 9th Circuit Court of Appeals is saying here is that's not a gate up because the information itself is still publicly available. If you want to protect information on your site, then you have to make it private, whether that's done by the user or whether that's done by LinkedIn through some of their own internal mechanisms to block access to people's publicly posted data. They can put up those gates, but they haven't done so. So you can't artificially construct a gate to block a certain entity from using your service just for that one individual purpose if that makes sense.
Dave Bittner: Right. Yeah.
Ben Yelin: So they're taking the gate-up-gate-down concept extremely literally. And as a result, hiQ is going to be able to continue to scrape data from LinkedIn. It's a very valuable service that they're providing. A lot of people want aggregated information on a LinkedIn profile, so there was a lot of money at stake. There's certainly a lot of money at stake for LinkedIn, too, because they have a proprietary interest in this. There's - at least conceivably, if people are getting aggregated data from LinkedIn, there's less of a chance that somebody is going to visit LinkedIn and look at people's profiles, see ads, et cetera, et cetera, pay for a premium service, that type of thing.
Ben Yelin: So it's really interesting to see how the Van Buren decision from the Supreme Court is applied in a case whose facts date back to before that Supreme Court decision. So very interesting stuff. I don't think it's going to make it back to the Supreme Court. I think this case, the way it was decided in the 9th Circuit, conforms pretty closely to what the Supreme Court said in the 6-3 Van Buren decision.
Dave Bittner: Now, help me understand here. Let's talk about some what-ifs with this LinkedIn thing here.
Ben Yelin: All right. Lay it on me.
Dave Bittner: So I want to compare two different scenarios here. One, where I am browsing around on LinkedIn. And I do not have an account on LinkedIn. I'm just Joe Public off the street looking at LinkedIn profiles, right? So theoretically in that situation, I would not have agreed to any EULA. I would not have agreed in the process of signing up for a membership on this service, even if it's a free membership...
Ben Yelin: Right.
Dave Bittner: ...I have not agreed to anything. Right? Is that - or to what degree is that different from, if I sign up for the service, agree to a EULA that may include prohibiting me from scraping - right? - is there a difference there?
Ben Yelin: So this gets into what's a very complicated procedural history. Basically, there is a dispute as to whether LinkedIn and hiQ actually entered into some type of contractual agreement. And not to get really into the weeds, but there was a presentation at a conference where hiQ informed LinkedIn as to how they were using their service, and LinkedIn executives were in attendance. So in other words, LinkedIn should have known - even within the confines of the EULA that hiQ purportedly agreed to - that there was sort of a general, if not formally signed, agreement that this type of data scraping was taking place, if that makes sense.
Dave Bittner: Yeah, it does. I guess to extend this perhaps to the breaking point - what if I'm LinkedIn, and even before anybody can look at anything publicly facing, I have a pop-up that comes up and says, hey, all this stuff's publicly available, but if you want to go any farther than this, the deal is no scraping?
Ben Yelin: That potentially is a little more complicated just because there would be that type of fair warning. Now, that might be what LinkedIn tries to do to get around this decision. But if you look at the rationale of the decision, I don't think that would necessarily solve the problem on LinkedIn's end. And I hate to sound repetitive here, but there's still no gate up. The information is still publicly available whether you agree to the EULA or not. And that's the determining factor, at least as it comes to the Computer Fraud and Abuse Act. Now, there are other claims that could potentially be made. And we could see some common law contract claims, which is what LinkedIn tried to file against hiQ in California.
Ben Yelin: But ultimately, when it comes to crimes under the Computer Fraud and Abuse Act and preemption under the Computer Fraud and Abuse Act, so whether that federal law preempts any type of contractual dispute at the state level, that's going to be determined by whether the information is publicly available. And I don't think a pop-up EULA is going to make a difference in terms of that particular question.
Dave Bittner: Yeah.
Ben Yelin: That's how the court is seeing it. They needed a place to draw a distinction, and they've drawn one. And I think the distinction comes from their view of the purpose of the Computer Fraud and Abuse Act, which was to prevent hacking.
Dave Bittner: I see. So it's really a spirit-of-the-law ruling here.
Ben Yelin: Exactly. Now, this law was drafted in 1984. Computers, systems, networks were very different then than they are now. For people who don't like to rely on legislative history, I can completely understand why it would seem sort of ridiculous that the court in this case was reading quotes from senators involved in that debate who have been dead now for 30 years.
Dave Bittner: Right.
Ben Yelin: Which is maybe, you know, more constructive than debating what our Founding Fathers said at the Constitutional Convention, but...
Dave Bittner: Only slightly so (laughter).
Ben Yelin: Exactly. But this is the distinction courts have tried to draw. And to be fair to the courts, they have to draw a distinction somewhere, because as we saw in that Van Buren case, if it's not gate up, gate down, then you get into a situation where somebody is violating their employment contract by going on Facebook at work. Facebook is something you can access if you have access to the internet. There's no gate up, unless your employer or your organization has blocked Facebook and you go around that block or whatever.
Dave Bittner: Right.
Ben Yelin: Even if it's publicly available to you, under a different interpretation of the Computer Fraud and Abuse Act, you could be criminally liable for going on Facebook at work because you're violating the terms of service set by your employer. And I think the Supreme Court in the Van Buren case said that that's just not a practical approach, and that doesn't comply with the spirit of the law in the first place. And I think the 9th Circuit is carrying on that spirit here, just in very different circumstances.
Dave Bittner: Yeah. All right. Well, I mean, I - seems to me like that's - in my mind, that's a good way to go. That's a good interpretation. I'm on board.
Ben Yelin: Yeah.
Dave Bittner: (Laughter).
Ben Yelin: I mean, I think you have to draw a line somewhere. And this is the easiest one to draw - you want to make it something that's justiciable.
Dave Bittner: Right.
Ben Yelin: And this makes it very justiciable - a word that I always have trouble pronouncing.
Dave Bittner: Yeah. I'll let you say that.
Ben Yelin: It's one of those where if you were a district court or a trial court judge and you have a case similar to this, now you know exactly what to look for. Is there a gate up that prevents somebody from having access to this? Did they hack into it using some type of illegal means? Did they get onto somebody's online property that wasn't theirs, that they did not have access to? Did they pick the lock, or was the door open, and they could just see in, and they're obtaining information from what they could see through that open door?
Dave Bittner: Right.
Ben Yelin: I think that's the distinction here.
Dave Bittner: Yeah. All right. It's interesting for sure. We'll have a link to that story from CyberScoop in the show notes. My story this week comes from The Washington Post. This is an article written by Drew Harwell. And this is about the Ukrainian military who, of course, are dealing with the Russian invasion. They are using facial recognition software to scan the faces of dead Russian soldiers, and then they're contacting their mothers.
Ben Yelin: Yeah. This is a very dark story.
Dave Bittner: Yeah.
Ben Yelin: But I think you capture it well. People are finding dead Russian soldiers in this armed conflict, taking pictures and using facial recognition from a company, Clearview AI, that we've covered extensively on this podcast. There's a very specific purpose why Ukrainian soldiers are doing this. And it's sort of the purpose of psychological warfare.
Dave Bittner: Right.
Ben Yelin: You can traumatize family members of dead soldiers by calling them and saying, using this facial recognition, we've realized that we recognize your son or your daughter on the battlefield, and they're dead. This is what's happening in this armed conflict. And the purpose of it is to weaken political support for the invasion among the Russian people. So it is a form of psychological warfare. What's interesting to me is that Clearview AI, under its founder and CEO, is being very supportive. And, in fact, they've held Zoom training sessions for members of the Ukrainian government and the Ukrainian armed forces on how to effectively use this facial recognition tool. So I think they see this as an important use of Clearview AI's capabilities. They see no moral or ethical issue in taking this action. Other people who commented for this article obviously saw it differently, saying basically, we might be sympathetic to the Ukrainian side in this conflict. But regardless of that fact, we need to be careful about using facial recognition to promote psychological warfare on the battlefield - that it could be a slippery slope, and there could be dangerous consequences.
Dave Bittner: It strikes me that - you know, soldiers have worn dog tags, you know, for decades, right?
Ben Yelin: Right.
Dave Bittner: So how is this really different from that? I could come upon a dead soldier, read their dog tags, you know, look it up and call their mother based on that information - same effect in the end. I could also see - I could use Clearview AI, you know, through a pair of binoculars or a camera with a long lens, take a picture of a living soldier, call up the mother and say...
Ben Yelin: I got your soldier dead to rights here.
Dave Bittner: Yeah.
Ben Yelin: Yeah.
Dave Bittner: Or just say he's dead, even if he's not.
Ben Yelin: Right.
Dave Bittner: You know, same sort of thing here - I mean, does that - like, what do you think of either of those things that I just laid out there?
Ben Yelin: Well, the first scenario doesn't contemplate a situation where somebody's not wearing dog tags, maybe because they don't want themselves to be identified.
Dave Bittner: Yeah.
Ben Yelin: And I'm not an expert in Russian military operations.
Dave Bittner: Yeah.
Ben Yelin: And so I don't know how closely they're hewing to traditional war fighting practice of carrying these forms of identification.
Dave Bittner: Right.
Ben Yelin: So that's sort of the issue with your first hypothetical there.
Dave Bittner: Yeah.
Ben Yelin: I think the second one is a very serious issue. I mean, you could use psychological warfare in a way where you're not actually telling somebody the truth. And certainly what Clearview AI is doing here through this technology is allowing something - allowing a Ukrainian soldier to exploit facial recognition to cause undue psychological trauma to somebody's family.
Dave Bittner: Is that usually out of bounds in warfare? In other words, we're bringing noncombatants into this - these are civilians.
Ben Yelin: Well, we're not necessarily sure that they're civilians who are either taking the photos or...
Dave Bittner: And I mean the moms.
Ben Yelin: The moms, yeah. I'm not familiar enough with conventions of this type of warfare to know whether this is customary.
Dave Bittner: Yeah.
Ben Yelin: I do know that psychological warfare is traditionally disfavored, at least under international laws of armed conflict, I think. And I don't think it's looked favorably upon to bring in noncombatants, people's families. That is sort of about minimizing civilian casualties and minimizing the war's impact on civilians. I don't know - and probably some of our listeners know about this better than we do - whether that contemplates some type of psychological impact. So I'm not sure how out of bounds this practice is in terms of the rules of international conflict. There are certainly reasons to avoid psychological warfare, and I think some of the people quoted in this article are appropriately raising questions about that. It's not a healthy practice. It could further inflame tensions. It could further increase hostilities. It could have the opposite effect from what is intended. But I'm not sure how that plays into the laws of international armed conflict.
Dave Bittner: Yeah. Oh, and I wonder; how will it inform the laws of international armed conflict going forward?
Ben Yelin: Well, I mean, I think this is a new frontier. I mean, this is the first, really, on-the-ground, kinetic war that we've seen in the era where both sides have access to modern technological tools. We've been fighting armed conflicts over the past 20 years where there was an imbalance in the sides fighting these armed conflicts. We saw that in Afghanistan and Iraq and Syria and those types of places. But we're dealing with two relatively advanced countries here. I think maybe we underestimated the capabilities of the Ukrainian military, but we're seeing now that they actually, based on a lot of Western support, have a pretty advanced set of military capabilities. So this is something that is new to the world of international armed conflict. And so I'm wondering if the more we see this, the more this will make it into the type of international agreements we've entered into controlling the conduct of armed conflict.
Dave Bittner: Yeah. Yeah. It's fascinating, for sure. All right...
Ben Yelin: Very dark, but very fascinating.
Dave Bittner: Yeah, I know. It is. It's just to ponder, you know, where we might be going with all of this. Like you said, it's a different world. All right. We will have a link to that story in the show notes as well. We would love to hear from you. If you have a story you'd like us to consider for the show, you can email us. It's email@example.com.
Dave Bittner: And I recently had the pleasure of speaking with Paul Innella. He is CEO of a company called TDI. They recently published a white paper where they were discussing some of the risks that board members have these days in the constantly evolving world of online and cyber policy. Here's my conversation with Paul Innella.
Paul Innella: Yeah, it's gone from pretty much an absence of any discussion of cybersecurity, particularly as it relates to strategic objectives in board discussions, all the way now to where we're getting closer to 50% of the Fortune 500 having cybersecurity as a strategic objective. Yet I still think we've got leagues to go before we're really there. You may be aware, but Senators Collins and King have been pushing for cyber requirements at the board level. And in the recent omnibus, it turns out there are some new requirements that will be imposed. I'm not entirely sure of the entire breadth of who's being governed by it, but we are seeing now, even at the legal level, some requirements for board-level understanding of cyber risk in an organization. The problem that still remains, regardless of governance or mandate, is the fact that the boards don't quite understand what the cyber teams are saying, and the cyber teams don't really understand how to talk to the board. And until that gets resolved, I think we have a real chasm to jump. And we've put a significant amount of work into how you put together a body of knowledge that can bridge that chasm. But we do have quite a bit of work to get there.
Dave Bittner: To what degree have boards relied on, you know, kind of having that specialist that, you know, this is the person, the guy or gal on our board, who knows cyber? And so we've checked that box. Is that a thing that we saw happen?
Paul Innella: Not to enough of an extent. And really, I'd say only in some of the largest and most advanced organizations. You know, I think governmental advisory bodies are a pretty good example of where that does occur. In the commercial space, in the private sector, you're not seeing it to any real extent other than board members who may have dealt with it in other organizations they've served. For the most part, particularly for the customers we work with, we have CISOs who are typically reporting up to the board and CEO almost at the same time, each and every time.
Paul Innella: But again, there are a number of obstacles to success there that we've identified in the past and that really haven't been rectified. So you'll still have, today, CISOs who are copying and pasting from cyber tools and reporting out that last month we had 10,000 vulnerabilities, and this month we have 5,000 vulnerabilities. And we in this space understand that, contextually, that means nothing. There's no capacity to act upon it. It's not the same as talking about financial reporting at the board level.
Paul Innella: If I go - I serve on a number of boards. And if I'm sitting in Singapore or I'm sitting in Tokyo or here in Washington, D.C., and talking about the financial health of an organization and somebody slides to me a P&L statement or pro forma sales projections, we're all speaking the same language. We don't have that in cyber. If a board member says, I feel that we have a fiduciary and other responsibility to ensure the cyber health of our organization, show me that we're healthy, it's pretty rare that an organization can do so. And that's a big part of the problem.
Dave Bittner: Whose responsibility should it be to serve as that Rosetta Stone? You know, how much of this comes on getting your board up to speed with these terms? And how much of it is the CISO being able to speak to them in a language they understand?
Paul Innella: I think it's a combination of both. I also think it's going to require a big paradigm shift. You know, part of - we actually just wrote a white paper we're publishing this month on the topic. But part of the premise is cyber from an organizational perspective, meaning all of the C-suite and the board and shareholders have to buy in that cyber is as critical and interrelated to the success of an organization as any other resource or component - meaning cyber has to be elevated to the highest level of the organization so that its impact is known, it's understood, it's measured and visually reported and managed. Until that happens, you can't hang the chalice around the neck of a CISO and say, it's your job to fix this.
Paul Innella: No. 1, the budgets are out of whack, right? Last year, we spent 12.5% more on cyber than the year before, and it just continues to rise. That's $260 billion. You know how much cybercrime cost us last year? Six trillion. We're basically spending 4% against this $6 trillion nut, which means they're getting out of us roughly 25 times what we're putting in. So now you want to ask a CISO, hey, report on our cyber health, but I'm not going to arm you to do so by either elevating it to the level it needs to be or giving you the right budget. It has to be a business-oriented shift. And until we look at how to increase the value of an organization based upon how we treat cyber and align cyber to the business objectives that the board can understand, it really is not going to change.
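As a quick sanity check on the round numbers Paul cites - a sketch using his $260 billion and $6 trillion estimates as given, not audited figures - the spend works out to roughly 4.3% of losses, and his "25 times" is a round-up of about 23:

```python
# Back-of-the-envelope check of the figures cited in the interview.
cyber_spend = 260e9        # ~$260 billion in annual cyber spending (as cited)
cybercrime_cost = 6e12     # ~$6 trillion in estimated cybercrime losses (as cited)

spend_ratio = cyber_spend / cybercrime_cost    # spending as a share of losses
loss_multiple = cybercrime_cost / cyber_spend  # losses per dollar spent

print(f"spend is {spend_ratio:.1%} of losses")   # roughly 4.3%
print(f"losses are ~{loss_multiple:.0f}x spend") # roughly 23x
```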
Dave Bittner: You know, we've seen discussion that there have been some shifts when it comes to the insurance companies and what they're willing to provide and the scrutiny that they're applying to organizations. When I've been on boards, you know, one of the key elements there has been that the board members have errors-and-omissions insurance to protect them. Is that an area where we could see some - you know, a positive force here, where the insurance companies are going to say, hey, we're not going to give you your E&O policy until you all are up to speed on this?
Paul Innella: Yeah, I do think it'll help. It's definitely a topic of conversation, in fact, based upon the omnibus structure around better reporting and requirements for board inclusivity of cyber expertise. The SEC just put out recommendations, and one of them was around ensuring board expertise exists. And part of that is - and it's not finished. But the recommendations are to provide some protection against liability for board members, with respect to bringing that expertise to the organization. So I do believe that providing some level of - some veil of protection is going to be critical to ensuring it. But I also believe that the onus doesn't have to be on the board to have that expertise. I think we can do a better job as organizations in articulating cyber health through a common lexicon and a common methodology where, when seen multiple times, as with anything else that a board addresses, it will become more second nature. It'll become trusted.
Dave Bittner: How much has to happen in terms of shifting the capabilities of the CISO? I'm thinking of communications. You know, generally, in my mind, the primary thing most CISOs are hired for is not their ability to get up in front of a group of people and present, right? I mean, you know, there are those who are quite good at it, to their benefit. But is it possible that we see the CISO have a communications specialist whose job is to be able to translate this stuff?
Paul Innella: Yeah, that very much could be part of the solution. I would say if we were to arm organizations, and the CISOs as the heads of the cyber organization, with the right framework for delivering an understanding of cyber health in an organization, they would be better armed. CISOs are required to brief the boards pretty much uniformly. So whether they're doing it in a manner that's completely different from organization to organization or in a uniform way doesn't really make a difference in terms of their ability to present. But what they're presenting can change.
Paul Innella: And I think that's where the big change in how organizations are handling cyber health and representing it needs to come. And I think there is a way there. We've been doing this for 20-plus years as an organization in the intel community, defense, and civil and commercial companies around the globe. We've seen a lot, from Fortune 10 boards to systems under the water and in space. And there is a complete and uniform way in which we could be running cyber programs better that would make all of this comms flow better, and it wouldn't require a communications specialist. It would simply mean that we're all speaking the same language and working towards the same metrics.
Paul Innella: And that's the big - you know, we've kind of grouped, from an organizational perspective, all of these inherent problems into - we call them the organizational obstacles to cyber success. And it really boils down to a misalignment and understanding of risk tolerance between the board and the security teams. It's disparity in technical understanding and industry terminology, back to the lexicon. And it's a lack of meaningful and timely visibility into day-to-day cyber performance to kind of bring the whole thing home and make it abundantly clear. We're so heavily focused on activity. Am I compliant with X? How much should I spend on cyber? How many vulnerabilities do I - did I address? Activity is not achievement. What we are doing is not how well we are doing. And that's what has to change. And I think if you can drive a framework around making that change happen, all of these problems will start to go away.
Dave Bittner: In your estimation, that's what it's going to take - a framework that everyone can agree on?
Paul Innella: I do. From a personal standpoint, I started TDI over 20 years ago, and we have been predominantly a cybersecurity services company the whole time. Five years ago, I'm looking at a market of 3 1/2 to 4 million openings soon - a big workforce gap that we have to close. And we continued to notice systemic, inherent problems in organizations and decided that we need a tool that makes us more efficient, more automated and more economical if we're going to survive without the necessary workforce. And so instead of building the 10,001st cyber tool that would provide endpoint protection or the next silver bullet, we thought, you know what we need? We need an extensible platform for running a cyber program.
Paul Innella: And so all of that, anecdotally, to say, yeah, I sure do believe in it 'cause I've spent millions as an organization building out a solution to address it. I think this is where the industry has to head. Every other engineering discipline has gone through its nascent stages and gotten to a point where they realize there has to be a discipline and rigor in how we conduct our craft. Software engineering went through this as an example. We don't really have that in cyber engineering. The next CISO you call will be telling you what fire they are putting out on any given day but not how they're aligning their organization to the corporate strategy. And that's the big hole.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: Really interesting. I mean, I think this is something that members of boards of directors hadn't paid enough attention to in the past. And as I think became crystal clear in your interview, they were focusing on the wrong things. They were focusing on box-checking and taking a certain finite number of actions in the name of cybersecurity rather than actually improving cybersecurity.
Dave Bittner: Right.
Ben Yelin: There's sort of the difference between going through requisite trainings, using best practices, using X number of tools, et cetera, versus actually solving the problem, which I think is an important distinction to make. And I think that it means that boards of directors are going to have to think more dynamically as we have a more robust threat landscape.
Dave Bittner: You know, we've talked a lot about how cyber insurance policies have changed a lot in the past year or so. You know, they've gotten more expensive. They cover fewer things. Deductibles are higher. I wonder to what degree that is seeping into errors-and-omissions policies for board members, right? Because, you know, you join a board, and one of the first things you ask is, do you have E&O insurance, right? (Laughter).
Ben Yelin: Right. I learned in this interview, by the way, that you have been on boards of directors.
Dave Bittner: Yeah.
Ben Yelin: So you're intimately aware of these policies.
Dave Bittner: Yeah. Well, and you have to protect yourself 'cause you are on the line for those sorts of things. So it's - you know, it's standard operating procedure. And any board - any organization worth their salt is going to invest in errors and omissions for their board members or they're not going to get good board members, right?
Ben Yelin: Right. I mean, I think it's definitely going to seep in there 'cause it's just another risk of liability layered on top of the other risks inherent in running a company, an organization.
Dave Bittner: Yeah.
Ben Yelin: So yeah, it's going to have to be a part of that policy.
Dave Bittner: Yeah. All right. Well, our thanks to Paul Innella for joining us. We do appreciate him taking the time.
Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.