Caveat 4.22.20
Ep 25 | 4.22.20
You don't own your photos.
Transcript

Dave Bittner: Hey, everybody. Dave here with a quick request - if you could leave us a review on whatever platform it is you listen to this show, it'll help spread the word and grow our audience. So please take a few minutes and share why you think this podcast is a valuable part of your day. Thanks. Here's the show. 

Dmitri Alperovitch: The No. 1 responsibility of the U.S. government is to protect itself. Before the U.S. government starts telling the private sector what it should do, it really should stop living in the glass house where its own cybersecurity is literally much worse than most private sector networks. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hi, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I've got the story of a photographer who came up short in an online copyright claim. Ben wonders if the Supreme Court is going to take a look at the Computer Fraud and Abuse Act. And later in the show, my conversation with Dmitri Alperovitch - he is the co-founder and former CTO at CrowdStrike. And we'll be discussing the recently published Cybersecurity Solarium report. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent a recurrence of a single nonrepeatable event. Others say it's a way that the suits play CYA. Still, others say it's whatever happens to reside in those binders the consultants left behind them right before they presented their bill. How 'bout a more positive approach? As KnowBe4 can tell you, better policy means better security, and getting the policies right is a big part of security. So is setting them up in ways that your people can actually follow them. We'll hear later in the show about how you might approach policy. 

Dave Bittner: And we are back. Ben, before we kick things off, we have a little bit of follow-up. 

Ben Yelin: Uh-oh. 

Dave Bittner: We got letters, Ben. We got lots and lots of letters. 

Ben Yelin: That can't be good. Letters are not good. 

Dave Bittner: Well, they were all - they were friendly, and they were kind letters. But many people wanted to point out that - a couple of weeks ago, we talked about the concept of force majeure. And you pointed out that, like many things in the legal system, force majeure is Latin. And our kind listeners let us know that - no, no, no, no, no, no, no, no. Or I should say - non, non, non, non, non. 

Ben Yelin: Non. 

Dave Bittner: Force majeure is French. 

Ben Yelin: Yeah. Let's just say I was testing all of you guys to make sure you were listening. 

Dave Bittner: (Laughter). 

Ben Yelin: How could a word like majeure be French? I just have no idea. 

Dave Bittner: How could it be anything but French? (Laughter). 

Ben Yelin: I know, yeah. I don't know where my head was. Just - let's blame it on the pandemic and a... 

Dave Bittner: I say... 

Ben Yelin: ...Lack of sleep. 

Dave Bittner: ...We'll move on. 

Ben Yelin: Yeah. 

Dave Bittner: You know, take your lumps and move on. 

Ben Yelin: Exactly. 

Dave Bittner: All right. Fair enough. 

Ben Yelin: But if you want to send hate mail, you know, to our lovely podcast... 

Dave Bittner: Yeah. I don't know. 

Ben Yelin: ...I'm happy to accept it. I will eat crow on this one. 

Dave Bittner: Yeah. Maybe all your students should get, you know, 10 bonus points for the professor making a mistake like this. 

Ben Yelin: They probably deserve it after this, yeah. 

Dave Bittner: (Laughter) All right. Well, thanks to everybody who wrote in. We do appreciate it. Let's move on to our stories this week. My story is an interesting one. This is a copyright story. A court ruled recently that Mashable can embed a professional photographer's photo without breaking copyright law. And this is all because of Instagram's terms of service. So this court in New York - this is a district court in New York - determined that a woman named Stephanie Sinclair - when she posted her photos on Instagram, that meant that other people could embed those photos because Instagram has an embedding function, like many things online. And basically, this court is saying that she gave up her rights to those photos for other people to embed them. She lost control over whether or not people could embed those photos. What's your take here, Ben? 

Ben Yelin: Yeah. So as soon as she decided to post the photograph on Instagram, she forfeited her copyright claim because the Instagram terms of service say that you are granting Instagram a sublicense to use the public content that you post for the users who share it. This decision from the district court came from Judge Kimba Wood. Those of you who are 1990s historians might remember that she was Bill Clinton's first attorney general nominee, I believe. So she's a pretty prominent judge. And the decision seems relatively simple to me. This is a terms of service issue. It applies specifically to Instagram. And it means that there can't be any copyright claims when another website embeds an Instagram photo. 

Ben Yelin: One interesting argument that this photographer made who initiated the lawsuit is that Instagram sort of owns the market for the sharing of photography. And you know, she's a photographer. If she wants to share her work and she wants to retain that intellectual property but she also wants to get a wide audience for her pictures, she's going to post them on Instagram, and she really doesn't have any other choice. That is the photo-sharing application. 

Ben Yelin: What the judge is saying here is, yes, I recognize that; yes, they have the largest market share, and I recognize as a judge that that's a very difficult decision. But this is ultimately the decision you made, madam plaintiff. You decided to post this photograph. And as soon as you did, you forfeited your copyright claim. So in the exact words of Judge Wood, the plaintiff made her choice. 

Dave Bittner: Yeah, this is interesting to me for a couple of reasons. It reminds me of back when YouTube was brand new and was just sort of getting up to speed. I think this was - what? - back in the mid-2000s, I suppose. There was a lot of concern, a lot of gnashing of teeth and wringing of hands over whether or not, if you uploaded a video to YouTube, you gave YouTube all the rights to your video. And they could use it for anything - they could sell it. And of course, these days it's impossible to imagine a company, for example, not uploading their corporate image video to YouTube. It's become a standard thing. And it seems as though many of those concerns didn't really play out. But this reminded me of that. 

Ben Yelin: Right. And I think Instagram now plays the same role in the photography market that YouTube played in the video-sharing market. In other words, for marketing purposes, whether you are a photographer like this plaintiff was or you are a company, you are going to need to post on Instagram to reach the largest number of eyes. And it's sort of a public policy problem that you will not have any intellectual property rights once you put that image on Instagram. 

Ben Yelin: If there were an enterprising company that came along and said, well, we're going to be the photo-sharing company that will protect your intellectual property rights; we're going to prohibit embeds; we're going to say that there is not a transferable sublicense to that photo, you know, maybe that company could increase its own market share and give Instagram a competitor. Do I see that happening? Not really. I mean, Instagram is so prevalent in our lives. You know, most of us wake up and check our Instagram Stories. And we're sort of stuck with the sublicensing agreement that they have foisted upon us. And given their large market share, there's unfortunately not much we can do about it. 

Ben Yelin: And I think what makes this case interesting is we're talking about a professional photographer. So she doesn't retain the rights in her own professional work. I mean, imagine that in any other context. If I wrote some sort of policy paper and, you know, I had all these original ideas and it was so brilliant and as soon as I saved it as a PDF, then anybody in the entire world could embed it on their website without attributing it to me or, you know, giving me a copyright claim, I would not be very pleased with that. 

Dave Bittner: (Laughter) Well - and I wonder if Instagram could provide some more control here - if you could have a pro-level account. You know, maybe charge a fee or something where someone can share their photos on Instagram, but that's it. You can limit - you can choose whether or not embedding is active on some sort of basis per photo or per account or something like that and maybe pay for that privilege. 

Ben Yelin: Right, exactly. And we see those types of privileges on a bunch of different types of online services, you know? I think of LinkedIn Pro where I, of course, want to see who has viewed my LinkedIn profile, but I don't want other people to see that I have viewed their profiles. So I can pay for the privilege of knowing who's looked at my profile without sharing which profiles I've looked at. Perhaps Instagram could do something like that, where you get Instagram Pro and you can opt in or opt out to the embed function. I mean, I think that might be the most equitable solution here. I think what the court is saying is that's a solution that Instagram will have to come up with if it becomes a big enough problem for its user base or its customers, but that's not a problem that this court is going to solve. 

Dave Bittner: I see. 

Ben Yelin: So you know, I think it was sort of passing the buck down to the private organization in this case. 

Dave Bittner: Right. You made your choice when you signed up and agreed to the EULA, which we all read word by word, right? 

Ben Yelin: Absolutely, yeah. We all spend 10 hours reading those 600 pages of Instagram terms of service. And then we were so bored with it, we didn't even want to post any photos. But... 

Dave Bittner: Right, right. 

Ben Yelin: Exactly. 

Dave Bittner: Well, that's my story this week. What do you have for us, Ben? 

Ben Yelin: My story comes from the publication Reason, and it is from professor Orin Kerr. Frequent listeners of this podcast probably know I'm a bit of a fanboy for this professor, but he is one of the foremost digital privacy professors out there, so I take what he says very seriously. 

Ben Yelin: And this is about a potential Supreme Court case on the Computer Fraud and Abuse Act. This act was passed in the 1980s. It was a pre-internet piece of legislation. And the primary purpose for the act was to prevent the sort of hacking that existed at the time, which is obviously very different than the hacking that exists in the internet age. And there's been a long-running disagreement among courts as to what exactly the Computer Fraud and Abuse Act does. What does it mean by unauthorized access? 

Ben Yelin: So there's one strain that basically says, unauthorized access means somebody has stolen a password or gained unauthorized access through nefarious means. They've literally hacked into it or, you know, they've broken cryptography. That's sort of what one set of courts has interpreted, and that's more of an originalist interpretation of this law because if you look at the legislative history, that seems to be what Congress was trying to prevent. 

Ben Yelin: The other strain of thought from different judicial circuits is that it can be a violation of the Computer Fraud and Abuse Act if you violate a website's terms of service. So even if you didn't hack into the website, even if you didn't break their cryptography - you just used that site for a purpose that is not authorized according to the terms of service, then you could be prosecuted under the Computer Fraud and Abuse Act. 

Ben Yelin: So there's this case that has made its way through the courts. It's actually, right now, in front of the Supreme Court. They're considering whether to grant certiorari - whether to hear the case. It is Van Buren v. United States. And the justices finally have the opportunity to resolve this circuit split. According to professor Kerr, the court is likely to hear this case. And we should find out in the next few days whether they've decided to hear it. 

Ben Yelin: Just a little bit of background on the case itself - so Mr. Van Buren - it's not the president, nor is it the Van Buren Boys of "Seinfeld" fame. 

Dave Bittner: (Laughter). 

Ben Yelin: He was a police sergeant. He ran a search through a police license plate database, but he didn't do so for, quote, "law enforcement purposes." He did so in exchange for a cash payment from an individual working as part of a police sting. So it was an unauthorized use of a database. He was charged under the Computer Fraud and Abuse Act, and he was convicted. This was appealed to the 11th Circuit. The 11th Circuit affirmed the conviction under the Computer Fraud and Abuse Act because the 11th Circuit is one of the judicial circuits across the country that has held the interpretation that any violation of the terms of service of a database - any unauthorized use of a database that somebody already has permission to be in - violates the Computer Fraud and Abuse Act. And professor Kerr seems to think that this is the perfect opportunity for the Supreme Court to resolve this question. So I hope they grant certiorari on this case. I think we do need some clarity because, as professor Kerr says, if the interpretation of the 11th Circuit holds and if there is this broad interpretation of the Computer Fraud and Abuse Act, then pretty much all of us have violated the Computer Fraud and Abuse Act. And I don't think... 

Dave Bittner: How so? 

Ben Yelin: So he talks about how it is a violation of Facebook's terms of service to falsify your location, which he did. And he admitted that in an affidavit to a court, mostly to make an academic point: Look, if you take this very rigid interpretation of the Computer Fraud and Abuse Act, where you are criminalizing mere violations of terms of service - which, as we just talked about in our previous segment, no one reads - then you're going to be criminalizing a lot of what we would probably deem to be normal online behavior. 

Ben Yelin: And what Kerr is saying and I think what others have said, including the Electronic Frontier Foundation in a brief they sent to the court, is we don't want the Computer Fraud and Abuse Act to just be a very general crime where you can arrest and charge people for doing bad things on the internet. It was created for a very specific purpose, and that was to prevent hacking into systems in an unauthorized manner. And so I think there's sort of broad agreement in the privacy and civil liberties community that this interpretation that the 11th Circuit has and that they had in this Van Buren case could lead to some very, very problematic results. And I think that's why it's useful for the Supreme Court to weigh in. 

Dave Bittner: Walk us through how this would play out if the Supreme Court decides to take it on. 

Ben Yelin: If they grant certiorari, there would be... 

Dave Bittner: Wait. Is that - is that a Latin term, Ben? 

Ben Yelin: I'm going to Google it just to make sure. 

Dave Bittner: (Laughter). 

Ben Yelin: No. I'm not actually going to Google it. That is a Latin term. 

Dave Bittner: (Laughter) Just checking. 

Ben Yelin: Although I am sure we're going to have users write in and be like, no, actually, it's Italian and I'll have to... 

Dave Bittner: It's ancient Greek (laughter). 

Ben Yelin: Yeah. I'll have to eat crow two weeks in a row. I don't know my classics, people. I'm sorry. 

Dave Bittner: OK. 

Ben Yelin: So if they grant cert, then they would most likely hold oral arguments in the next Supreme Court session, which begins in October. Depending on the state of the world, that would probably be a live hearing. But we just found out that the Supreme Court is conducting online oral arguments for the first time in the next couple of months due to the COVID epidemic - which we have now, for the first time, mentioned on this podcast. 

Dave Bittner: (Laughter) We were so close to having an entire episode... 

Ben Yelin: We were so close to making it through. 

Dave Bittner: ...Without mentioning it (laughter). 

Ben Yelin: I know. We lose our prize. 

Dave Bittner: Oh, well. 

Ben Yelin: So, yeah. They would have the oral argument in the fall, and they would come to a decision that would have nationwide applicability sometime, you know, I would guess probably early in the next year. But we're probably at least, you know, seven or eight months from having any sort of resolution on this issue. And until we do, there are different rules depending on which, you know, judicial circuit you happen to reside in. Here in Maryland, we are in the 4th Circuit Court of Appeals. If you happen to be in the 11th Circuit, that's where you would be subject to this interpretation. And that applies to Mr. Van Buren. 

Dave Bittner: Now, from a legal point of view, help me understand here when the Supreme Court makes a decision, how does that affect the decisions that the lower courts have made along the way? Is it retroactive if they decide that they don't go along with decisions other courts had made? Do those cases get revisited? 

Ben Yelin: Generally, there's what's called a good faith exception. If law enforcement was following the prevailing law at the time when it charged somebody or made an arrest, then the Supreme Court would generally not retroactively vacate those charges. Now, as it applies to Mr. Van Buren himself, he would be the party in the case. So it would have the effect of vacating his conviction, and he would no longer be subject to whatever penalties he would suffer as a result of that violation. It would also mean that in the future, no matter which judicial circuit you were in, you could only be charged under the Computer Fraud and Abuse Act depending on what the Supreme Court said. If they decided that it had to be a traditional hack in order for there to be an offense under the CFAA, then that would be the rule for all future cases going forward, no matter which judicial circuit you happen to reside in. 

Dave Bittner: All right. Well, we will stay tuned on that one - certainly interesting developments. 

Ben Yelin: Yes. Yep. 

Dave Bittner: All right. We don't have a Listener on the Line this week, but we would love to hear your questions. If you have one, you can call in at 410-618-3720. That's 410-618-3720. You can also email us your question to caveat@thecyberwire.com. Coming up next - my conversation with Dmitri Alperovitch. He is the co-founder and former CTO at CrowdStrike, and we will be discussing the recently published Cybersecurity Solarium report. 

Dave Bittner: But first a word from our sponsors. And now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound but that they're also distributed, posted and implemented. That's where the policy management module of their KCM Platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that, KCM does the job at half the cost in half the time. It's your policy after all. Implement it in a user-friendly, frictionless way. Go to kb4.com/kcm and check out their innovative GRC Platform. That's kb4.com/kcm. And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of a very interesting conversation with Dmitri Alperovitch. He is the co-founder and former CTO at CrowdStrike. He recently left CrowdStrike, and he started a nonpartisan, nonprofit policy accelerator. And the focus of our conversation is the recently published Cybersecurity Solarium report. Here's my conversation with Dmitri Alperovitch. 

Dmitri Alperovitch: Well, the most important thing about the Solarium Commission and why it's different from so many other commissions that we have seen in the space - probably dozens upon dozens over the years - is that this was a commission that was organized by Congress, involving congressional members and, most importantly, doing its work for Congress itself. And many of the very thorny issues that we face in the policy space in cyber actually do require legislative fixes. And that's why it's so important to have had Congress' buy-in and indeed their interest in this from the very beginning. And that is why, in many ways, the other commissions have failed - because they were not really including congressional members from the get-go as the Solarium Commission has done and not focused on specific legislative language and proposals as the staff is now working on with members of Congress. So that's what makes it very different. Some of the recommendations, of course, have been around for years. The commission members themselves have publicly stated that their goal was not necessarily to come up with radically new ideas but to really survey the landscape and figure out what was out there that was useful to leverage in order to move the ball forward. There are a number of key concepts that I think are actually pretty important that the commission has highlighted. 

Dmitri Alperovitch: And the way I think about the problem of cybersecurity, at least on the defensive side, is really in three different areas. The first area is how do we protect the government itself? Certainly the civilian .gov networks. And in fact, this is an area that often gets the least amount of attention in these reports and congressional oversight hearings, which is a huge mistake. You know, from my perspective, the No. 1 responsibility of the U.S. government is to protect itself. Before the U.S. government starts telling the private sector what it should do, it really should stop living in the glass house where its own cybersecurity is literally much worse than most private sector networks. And we need to take care of that first and foremost. I think the vision for solving that is - at least the one that I have - a pretty ambitious one. 

Dmitri Alperovitch: And I understand why the commission couldn't adopt that wholesale. But the thing that they came up with, which I really like, is the idea that you strengthen the power of CISA - this new agency that was formed in legislation about a year ago - and give it the authority to hunt continuously across .gov networks, which will be a significant step forward in clarifying their responsibilities because right now, CISA is a cybersecurity agency that is actually not, for the most part, doing the cybersecurity of the U.S. government. It is doing it from a sort of advisory perspective but not having actual operational responsibility. And I think moving it towards having some operational skin in the game is a good thing for both CISA and for the rest of the federal government. So I think that recommendation is a really, really good one because it will actually give us the ability to understand which adversaries are now on those networks, give us a sense of how they're getting in so we can learn from that experience and focus on, first and foremost, kicking them out and, secondly, on how we shore up those networks to make it more difficult for them to come back. So really loving that recommendation. 

Dmitri Alperovitch: There's a second one that's sort of a corollary to that, which is empowering the Defense Department and more specifically Cyber Command to continuously hunt across the .mil networks, the U.S. military networks. It's a similar principle, where we need to understand which threat actors are already inside our sensitive networks and give the authority to Cyber Command to bypass sort of the entrenched bureaucracy of different combatant commands, services and others to really empower them to continuously hunt in those networks. I think both of those agencies - CISA on the civilian side and U.S. Cyber Command on the military side - will need to evolve further over the years, where they would actually need to be given responsibility to provide operational security for their respective networks beyond just hunting. So being actually responsible for protecting them going forward - that's a huge change and a huge responsibility that will take many, many years to implement. That's effectively the best model that we're starting to see in other countries, where there is a centralization of cybersecurity expertise within the government, and responsibility and authority as well. So those were the two that were focused on the government. 

Dmitri Alperovitch: On the private sector side, I'm actually a contrarian in that I believe we actually now know how to do cybersecurity well. And some of the best companies out there - certainly the big platform companies, the Googles of the world, the Microsofts and many others - are doing really, really well in defending their networks every single day against the most sophisticated adversary nation-states and organized criminal groups that are trying to infiltrate those networks. So the issue is not necessarily about having Manhattan projects, if you will, of what do we do and how do we come up with new capabilities. The capabilities exist. The really good companies are well ahead of everyone else in understanding how to leverage them and operationalize them. What we need to do is to make sure that everyone else, at least in terms of our critical infrastructure and other important businesses, is going to be on the same level. 

Dmitri Alperovitch: And for that, I do think we need some regulation. I think it needs to be lightweight and nonprescriptive. I'm not a fan, particularly in this area, of regulations that are overly prescriptive - you shall patch; you shall adopt two-factor - or any of those recommendations, which on the face of it may seem sensible but actually may be the wrong thing to do for a particular business. You know, for example, if you're in the industrial control systems space, patching may be literally the worst thing for you to do because patches have literally taken down more infrastructure than any piece of malware in that space. So you want to be really, really careful about how you patch. It doesn't mean you shouldn't, but it means that you should not be rushing to do it either. 

Dmitri Alperovitch: And whenever you're doing something, whether it's patching or whether it's two-factor, it - by definition, because resources are constrained, means you're not doing something else. And I would much rather have companies figure out themselves - what is the order of priority that they need to have to apply some of these recommendations versus having the government dictate for them without the knowledge of their specific operational requirements, resourcing requirements and so forth? 

Dmitri Alperovitch: But what I think the regulations should do is have accountability be placed on the boards of these companies and their CEOs to take cybersecurity seriously, to hold their own internal security teams accountable and really focus on outcomes versus prescriptive plans that companies should implement. So one of the recommendations that I really, really loved in the commission was this 4.4.4 section which says that the commission recommends amending the Sarbanes-Oxley Act, which is the act that was put in place after the Enron scandals in the early 2000s to regulate public companies better. And they're recommending to amend that act to include some cybersecurity reporting requirements. 

Dmitri Alperovitch: And what I'd love to see - the section is a little vague on what it is that they actually want to have in the Sarbanes-Oxley Act. But I think the idea is right - to at least start with public companies, which are regulated, of course, by the SEC, and tell them, start tracking certain metrics. The way I'd love to see it is that they track them internally without even having to report them to the government because, as we've seen, government security is not the best, and you don't want attackers breaking into government networks, stealing that data and then figuring out who's doing well or not and using that essentially as a target list. 

Dmitri Alperovitch: So I would love for companies to track that internally, focused on outcomes. One of the metrics that I've been talking about for years with boards of directors and companies - and getting really, really good reception - is what I call the 1-10-60 rule, which measures speed of detection, speed of investigation and speed of response, where the best companies strive for detecting an intrusion, on average, in one minute, investigating it in 10 minutes and then responding and remediating it in one hour. 

Dmitri Alperovitch: Not everyone necessarily needs to be at that level. So you know, people shouldn't be focusing on the numbers themselves as much as focusing on the model of tracking their speed of detection, speed of investigation, speed of response and trying to optimize that. And if you're tracking that on a quarterly basis - if you're setting goals around those types of metrics and reporting them to the board, then the board has visibility on how well you're actually doing and how fast you are at detecting threats and responding to them. 
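[Editor's note: For readers who want to see what quarterly tracking of the 1-10-60 metrics might look like in practice, here is a minimal sketch. The phase names and one/10/60-minute thresholds come from the interview; the Incident fields and quarterly_report helper are hypothetical illustrations, not any real tool or CrowdStrike API.]

```python
from dataclasses import dataclass
from statistics import mean

# 1-10-60 rule thresholds, in minutes: detect in 1 minute,
# investigate in 10 minutes, respond/remediate in 60 minutes.
THRESHOLDS = {"detect": 1, "investigate": 10, "respond": 60}

@dataclass
class Incident:
    detect_minutes: float       # time from intrusion to detection
    investigate_minutes: float  # time from detection to completed triage
    respond_minutes: float      # time from triage to remediation

def quarterly_report(incidents):
    """Average each phase across a quarter's incidents and flag
    whether the average met the 1-10-60 target for that phase."""
    averages = {
        "detect": mean(i.detect_minutes for i in incidents),
        "investigate": mean(i.investigate_minutes for i in incidents),
        "respond": mean(i.respond_minutes for i in incidents),
    }
    return {phase: (avg, avg <= THRESHOLDS[phase])
            for phase, avg in averages.items()}

# Two hypothetical incidents from one quarter.
incidents = [Incident(0.5, 8, 45), Incident(2.0, 14, 70)]
report = quarterly_report(incidents)
# report["detect"] -> (1.25, False): averaging 1.25 minutes to detect,
# so this quarter missed the one-minute detection target.
```

As Alperovitch notes, the board-level value is the trend in these numbers quarter over quarter, not the absolute thresholds themselves.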

Dmitri Alperovitch: And the nice thing about requiring companies - at least public companies - to track those internal outcome-driven metrics is that if there is a breach - if there is a consequential event, inevitably nowadays, you have lawsuits that are launched around those types of events. And in the event of a lawsuit, those metrics will be discoverable by the other side, and they'll be able to take a look at them and say, well, wait a second - they continuously miss on their own internally set goals around those metrics. Was the board aware on a quarterly basis and still did nothing in response? - so there's a clear case of negligence here. Or perhaps they set goals that were too lenient compared to what the rest of the industry was setting. And again, the board was negligent. 

Dmitri Alperovitch: So it's a way to get the board to focus, one, on outcomes - to hold them accountable through potential litigation that may result coming out of a breach and negligence claims - and get them to not necessarily focus on technical details of how they should be securing them but getting them to focus, again, on holding security teams accountable. Just like most members of the board are not experts in sales but they - every single one of them that I've met, certainly, understands whether their company made their quarterly earnings numbers or not - very, very simple and black-and-white sort of calculation of here's a number; did we make it, or did we not make it? - and then holding the company's leadership accountable for those results. We need the same sort of accountability in the cybersecurity space. 

Dmitri Alperovitch: And too often, when I've been in board meetings with huge public companies - Fortune 500 companies - and I've witnessed presentations by CSOs, oftentimes it's a laundry list of projects they're working on. And I'm looking at board members, and they're on their phones, not paying attention because their eyes are just glazing over at all the technical mumbo jumbo. And it's just not productive. Instead, they should be focused on talking about strategy and talking about the outcomes that they're achieving and what goals they're setting for themselves. 
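The tracking model Alperovitch describes - measure speed of detection, investigation and response each quarter, compare against board-approved goals, and report the result - can be sketched in a few lines. This is a minimal illustration, not anything from the commission report; the incident fields, the use of medians, and the specific goal numbers (loosely modeled on the "1-10-60" benchmark CrowdStrike has popularized) are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class Incident:
    detect_min: float       # minutes from intrusion to detection
    investigate_min: float  # minutes from detection to completed triage
    respond_min: float      # minutes from triage to containment

# Hypothetical quarterly goals, in minutes. The 1/10/60 figures echo a
# well-known industry benchmark but are placeholders here, not a standard.
GOALS = {"detect": 1, "investigate": 10, "respond": 60}

def quarterly_report(incidents):
    """Summarize median speed metrics vs. goals, board-report style."""
    fields = {"detect": "detect_min",
              "investigate": "investigate_min",
              "respond": "respond_min"}
    report = {}
    for name, attr in fields.items():
        observed = median(getattr(i, attr) for i in incidents)
        report[name] = {"median_min": observed,
                        "goal_min": GOALS[name],
                        "met": observed <= GOALS[name]}
    return report

# A quarter with three hypothetical incidents.
incidents = [Incident(0.5, 12, 45), Incident(2, 8, 70), Incident(1, 9, 50)]
print(quarterly_report(incidents))
```

The point of the structure is exactly what the interview describes: the output is a simple did-we-make-the-number-or-not summary a non-technical board can act on, and a history of these quarterly snapshots is what would become discoverable in litigation.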

Dave Bittner: Now, in terms of urgency here, what do you suppose is a realistic timeline for rolling out some of these policies? 

Dmitri Alperovitch: As I mentioned, the great thing here is that the commission included members of Congress but, more importantly, the two co-chairs of the commission - Mike Gallagher, who's a phenomenal congressman on the House side from Wisconsin, and Senator Angus King, also very well-versed in those issues on the Senate side - as the two co-chairs, they're now sponsoring legislation this year. Now we'll see what happens with the COVID response. That may slow things down. But they are very anxious to take some of those recommendations - not the whole laundry list of almost 100 that the commission put together, but the most important ones from their perspective - and push them forward. 

Dmitri Alperovitch: One of the things that they are well aware of is that some of those recommendations can easily slide into what's called the NDAA, the National Defense Authorization Act, which funds the military and has to pass every single year. So there is no way that Congress won't pass that legislation. So if they can slide some of those recommendations into the NDAA, we can actually get a bill this year that takes effect in the new fiscal year. 

Dmitri Alperovitch: You know, we'll have to see how the pandemic crisis actually impacts this. The nice thing about cyber is that it is still a relatively bipartisan issue. And almost every year, if you look at the last five or six years, we've had some sort of cyber bill passed in Congress, which may surprise people given that nothing else really has much of a chance to pass. But on these issues, there is a great deal of urgency; there is a great deal of willingness on both Republicans and Democrats to do something. And the great thing about the commission is that you had staff literally not just write these high-level recommendations but they're now working on specific legislative language, working with lawyers, that they can give to staff in Congress and say, here you go, slide this into the bill and let's get things moving. 

Dave Bittner: Were there any areas where you feel as though they missed the mark, where they came up short? 

Dmitri Alperovitch: Well, I think some of the recommendations - in terms of the bureaucratic changes that they recommend - are probably not the right ones at this point in time. You know, creating new agencies and new bureaucratic positions I don't think will necessarily solve the problem; it will simply slow us down just because of how long it takes to stand up a new bureaucratic organization and for it to find its legs and really get things moving. 

Dmitri Alperovitch: So I could see some of those recommendations sort of, if you were starting with a blank slate, would make sense. But we're not. We've got a very urgent problem that we need to solve right now. You sort of - you go to war with the army you have, as Donald Rumsfeld once said. And we've got to leverage the resources we have to try to address this as quickly as possible. Some of those recommendations may make sense five or 10 years from now as we improve that situation and can start looking at how should we be reorganized for the future, but I don't think they make a lot of sense right now. 

Dmitri Alperovitch: The one that I thought was really, really good and the one that they sort of highlighted at the very beginning was on election security - obviously, very timely this year. And they rightly called out that the most urgent thing we need to do in elections is making sure that there are verifiable, auditable, paper-based voting systems in every precinct, county and state in this country. And unfortunately, we still have states, including counties in battleground states, that don't have paper-based records in their voting systems. And if you want to ensure that voters have trust in the election systems, you just absolutely have to have that. And it was good to see that called out and specifically calling on the Election Assistance Commission to receive additional funding to support states and localities to purchase that equipment. 

Dave Bittner: All right. Interesting conversation. What do you think, Ben? 

Ben Yelin: So first of all, another huge get for "Caveat" podcast. And I want to thank Dmitri for participating in the interview. It was very interesting, and it was a good summary of the Solarium Commission Report. I would suggest that people read the executive summary of the report. It's 22 pages, which is long for an executive summary, but it does do a good job of delving into the issues that the commission looked at. 

Ben Yelin: He made a couple really interesting points. On the public sector side, I think having some sort of centralized agencies, including ones that already exist, like the Cybersecurity and Infrastructure Security Agency, be coordinating agencies for cybersecurity purposes is extremely important. We can do that in a way where there's proper congressional oversight. But having a centralized entity, you know, is something that I think will improve coordination among federal government agencies and with the states. 

Ben Yelin: On the private sector, he seemed to say that - and maybe this is sort of contrary to public belief - but the private sector generally knows what it's doing in terms of protecting its own networks. They're good at what they do. And we don't want any sort of federal entity to pass down to these private entities on tablets, you know, this is exactly what you should do - you should patch this; you should do that. Keeping the advice general and allowing those private entities to act with speed and agility based on their current circumstances, I think, is also crucially important. 

Ben Yelin: And then election security, which I'm glad you guys talked about - it's obviously a big problem this year - we're in a presidential election - particularly when we might have a lot of people who are voting by mail or voting absentee, you know, it can be difficult to keep a paper trail. That's why I'm glad he sort of mentioned that as a major area of concern. But it was a fascinating interview, and I definitely encourage people - I'm being realistic. I know you're not going to read the whole Cyberspace Solarium Commission Report unless you're nerds like us. But I think the executive summary is well worth your time. 

Dave Bittner: Yeah, yeah. Well, again, thanks to Dmitri for joining us. Interesting conversation and we really do appreciate him taking the time. And we appreciate all of you for listening to our show. 

Dave Bittner: Of course, we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. 

Dave Bittner: Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.