Caveat 11.4.21
Ep 101 | 11.4.21

Regulators are looking for easy solutions, but there are none.

Transcript

Boris Segalis: Regulators are looking for easy solutions, but there's no easy solution.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today Ben has the story of the Supreme Court declining to take up a case on FISA transparency. I share some BBC coverage of a location data vendor data breach. And later in the show, Boris Segalis from Goodwin discusses possible SEC proposals that would require companies to have standardized cybersecurity systems and to monitor their digital risks. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right. Ben, let's jump into our stories this week. Why don't you start off for us? 

Ben Yelin: So my story comes from The Washington Post. It's entitled "The Supreme Court Will Not Hear a Case Seeking More Transparency from the Secretive Surveillance Court." Of course, that is referring to our friends at the Foreign Intelligence Surveillance Court, or the FISA Court. 

Ben Yelin: So the ACLU has filed suit against the United States, alleging that FISA lacks transparency. A lot of their decisions, including ones that have an outsized impact on our surveillance state, are never released to the public. Obviously, it makes sense that, you know, most cases, when they are decided, have to be secretive. Surveillance is... 

Dave Bittner: Right. You see, that's kind of their thing, right? 

Ben Yelin: That's kind of the point. Yeah. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: Surveillance wouldn't be very effective if the people who were being surveilled were made aware that they were being surveilled. 

Dave Bittner: Right. 

Ben Yelin: So this is more about post-hoc transparency, whether opinions that have significant impact on policy are released, you know, several years down the line after they have been issued. So there are a couple of issues here. 

Ben Yelin: The first is that in 2015, Congress passed a law saying that the executive branch should at least consider releasing FISA court decisions that have a significant policy impact, where the public should have a right to know on which side the FISA court has come down. And then there are a bunch of decisions prior to 2015 that have still not been released but are still in effect. I mean, the FISA court, as far as we know, hasn't reversed most of its pre-2015 decisions, including on some of the major bulk electronic surveillance questions, like Section 702 of the FISA Amendments Act and other programmatic surveillance. 

Dave Bittner: OK. 

Ben Yelin: So the ACLU took this suit all the way to the United States Supreme Court, and the Supreme Court just yesterday announced that it was declining to take the case. Now, generally, you need four out of the nine justices to agree to take a case in order for it to be heard. And the majority of justices, who presumably didn't want to hear this case, didn't give their reasoning. And generally they don't - you don't have to give a reason for not accepting a case. It kind of leaves the public in the lurch, not really knowing, you know, why the case didn't get its day in court. But... 

Dave Bittner: Is there a historical reason or justification for that? Or, you know - folks in your line of work, is there an understanding as to why that's the standard for the Supreme Court? 

Ben Yelin: Basically, there haven't been oral arguments or briefs filed on either side of the equation. So I think, in the view of the court, it would be unfair for them to release a lengthy opinion, as a matter of custom, without going through those procedural steps. 

Ben Yelin: Now, they sometimes release their reasoning anyway, especially on contentious, hot-button issues. And very frequently, we get what are called dissents from the denial of certiorari. And that's what we got here. So if you have justices who would have taken the case, they'll release their own opinion about why they would have taken it. And we have quite the odd couple here. 

Dave Bittner: (Laughter). 

Ben Yelin: Obama appointee and liberal Justice Sonia Sotomayor and Trump appointee and very conservative Justice Neil Gorsuch jointly wrote an opinion dissenting from the denial of certiorari - the court's refusal to hear the case on the merits. And they argue that there is such an inherent First Amendment value in transparency. The public can't make informed public policy decisions - in terms of, you know, the officials it elects and supports - if it's not fully aware of the implications of those decisions. And to have these decisions, especially on broad programmatic surveillance, be so secretive - that cuts against transparency. That cuts against the public's right to know what their government is doing. 

Ben Yelin: Obviously, there are going to be times where the government is justified in keeping those opinions secretive. But I think, from what Justices Gorsuch and Sotomayor are saying, that should not be the custom. There should be a default towards transparency. 

Ben Yelin: I think what's even more disturbing to them is the United States government argues that the court should not be in any position itself to make a decision as to whether an opinion is declassified. That should be solely within the province of the executive branch. 

Ben Yelin: That interpretation, I think, in their view, would be particularly dangerous because the executive branch could decide on its own, without any judicial review, never to release any FISA opinion, including ones that have a major impact on public policy or on Fourth Amendment jurisprudence or even First Amendment jurisprudence. 

Ben Yelin: And, you know, I think that certainly rubbed Justices Gorsuch and Sotomayor the wrong way. They call it, quote, "the extraordinary claim that this court is powerless to review lower court decisions, even if they are mistaken," which just cuts against the notions of fairness and transparency. Gorsuch writes that, on the government's view, literally no court in this country has the power to decide whether citizens possess a First Amendment right of access to the work of our national security courts. And in his view, if these matters are not worthy of our time, then what is? 

Ben Yelin: I mean, we're dealing with such fundamental issues of constitutional rights, of transparency, of the government's power to listen in on our phone conversations, to collect our stored communications, our electronic communications, that to say that courts would have no role in determining whether these decisions could be declassified is particularly dangerous. So even if, ultimately, the court came down on the side of the government - that there's a valid reason not to release this opinion - the court should at least have a role in the process, if that makes sense. 

Dave Bittner: So what happens now? I mean, does that leave decisions about transparency in the lap of Congress, then? 

Ben Yelin: I mean, Congress - yes. So Congress would have to pass a law that expands the USA Freedom Act of 2015 to require the release of FISA opinions in a greater number of circumstances. Right now, the 2015 law only requires the government to review whether opinions should be released to the public. What Congress could do is step in and say that after a given statutory time period - maybe three to five years - an opinion has to be released to the public unless there are, you know, significant national security concerns. 

Dave Bittner: Right - so have some sort of sunset provision on it, with the caveat that if it needs to be extended, it can be. But the default is that it gets shared. 

Ben Yelin: Right. I mean, that's how it works with presidential records. You know, the Kennedy assassination papers are sealed for 50 years, and then finally, by statute, they're released unless the government has some sort of compelling reason to keep the information secret... 

Dave Bittner: Right. 

Ben Yelin: Hint, hint to our conspiracy theorists out there. 

Dave Bittner: (Laughter) The grassy knoll. 

Ben Yelin: Exactly. So, yeah, I mean, that's the role Congress could play here. Do I think Congress is going to step in and take this step? I'm not sure. I don't think they are in as much of a surveillance-skeptic mode as they were in 2015. That was two years after the Snowden leaks, so there was sort of a political mood against the surveillance state. 

Ben Yelin: I will say, you know, after the Carter Page FISA warrant and that scandal and the Horowitz report, I think there is certainly some conservative skepticism of the power of the FISA court. So you might see more bipartisan support for a measure to require transparency. But, yeah, in terms of the path of this case through our court system, it's over. You're going to have to get additional justices who agree with Sotomayor and Gorsuch in order to have this case heard at the Supreme Court - again, unless Congress decides to step in. 

Dave Bittner: Could you help me understand the machinations of this case making its way to the Supreme Court? I mean, this article here in the Post says the justices turned down a request from the ACLU and others to review a ruling. Explain to me how that works. How does something like this get put in front of the Supreme Court, even for their consideration? 

Ben Yelin: Great question. So generally, cases in front of the Supreme Court come from two places. The majority of them come from federal circuit courts - that is, the federal courts of appeals. The losing party in such a case can petition the Supreme Court for certiorari. And the Supreme Court can either decide to take up that case or not. 

Dave Bittner: Now, let's just pause for a second there. You used a fancy word. What is that word - that fancy word you used? 

Ben Yelin: Certiorari? 

Dave Bittner: Yes. Please stop (laughter). 

Ben Yelin: It's some Latin mumbo-jumbo. It basically means giving the case its day in court... 

Dave Bittner: Got it. 

Ben Yelin: ...Or agreeing to hear the case. 

Dave Bittner: OK. Good. 

Ben Yelin: At least I believe it's Latin. I screwed that up in the past... 

Dave Bittner: Yeah. 

Ben Yelin: ...When I said it was Latin, and it wasn't. And we got complaints. 

Dave Bittner: Boy, did we hear from our listeners. 

Ben Yelin: We sure did. 

Dave Bittner: (Laughter). 

Ben Yelin: And I definitely deserved it that time. 

Dave Bittner: OK. 

Ben Yelin: Cases can also make their way from state supreme courts. So if it's a state case that deals with a federal constitutional issue or, in very rare circumstances - Bush v. Gore comes to mind - a state constitutional issue or a state statutory issue, those cases can also make it to the Supreme Court. But the Supreme Court has discretion as to which cases it hears. There are thousands and thousands of petitions to the Supreme Court to get cases heard every year, and they generally hear somewhere in the neighborhood of 100 cases per year. So that's a small fraction of those cases. 

Ben Yelin: They have what's called a cert pool. So a bunch of law clerks review all of the applications to get cases heard in front of the Supreme Court. So those are clerks from all nine justices. Some of them are easy denials. You know, there just might not be a valid constitutional claim that's worthy of the court's time. And that's what the clerks would suss out. They'd go through and say, all right, 85% of these, put them in the shredder. No - you know, no use for our time. 

Dave Bittner: Right. 

Ben Yelin: The rest - the ones where there is a close question - will go in front of a conference of Supreme Court justices. And using this informal rule of four - which is, you know, a minority of the members of the court, not a majority - they will decide whether to hear the case. 

Ben Yelin: You know, I think there's been criticism of what's called the shadow docket, where the Supreme Court makes decisions as to which cases they're going to take up kind of in the dead of night. I mean, unlike merits decisions, which are released in a high-profile way at the end of a term, these decisions sometimes, you know, come out on a Friday night. It's a news dump. The majority of justices don't have to issue any comment as to why they decided to deny the case. So oftentimes, we have to kind of lean into these dissents to understand what the conflict was, what the real issues at stake were. 

Dave Bittner: And the Supreme Court is always reactive in these cases. In other words, cases have to come to them. They don't - they never go out looking for things that interest them. 

Ben Yelin: I mean, the only way they can go out and look for things is if there are ripe circuit court cases that have been appealed to them. 

Dave Bittner: OK. 

Ben Yelin: Now... 

Dave Bittner: Right. 

Ben Yelin: In a very limited number of circumstances - and we're talking, you know, things that generally happen once in a blue moon - the Supreme Court will have original jurisdiction, and they can hear a case, even if there hasn't been a lower court decision that's been appealed. 

Dave Bittner: Is there an example of that? 

Ben Yelin: Well, there was some argument as to whether the dispute between the state of Texas and the state of Pennsylvania during the election season last year was a case of original jurisdiction. That was one state suing another state. And some justices in that case believed that the Supreme Court has original jurisdiction when you have one state suing another state. But a majority of justices didn't come to that view. They declined to hear the case. There were, I think, two justices who wrote separately to say, we believe this court has original jurisdiction and has to hear all disputes when you're talking about one state against another state. So that would be a potential example. But again, those are few and far between. 

Dave Bittner: Yeah. All right. Well, interesting stuff. Thank you for the very informative lesson on the functioning of the Supreme Court. I appreciate that. I hope our listeners do as well. 

Ben Yelin: I hope I didn't bore them. I probably did. 

Dave Bittner: No, no, not at all. I mean, I think it's - I think so much of this happens with the assumption that people just sort of get it. And I - you know, I think most of us don't. So to have someone like you who actually understands all this stuff explain it to us, I think that's quite helpful. 

Ben Yelin: Yeah. I mean, I think you get a lot of people - it's sort of like, you know, some of us who are not on the technical side just kind of nod along when... 

Dave Bittner: Right (laughter). Right. 

Ben Yelin: ...People are talking about... 

Dave Bittner: Right. Somebody who starts explaining how Wi-Fi works, and you're like, oh, yeah, that's... 

Ben Yelin: Packet switches? 

Dave Bittner: Interesting. 

Ben Yelin: Sure. Yeah. 

Dave Bittner: OK. Sure. 

Ben Yelin: That sounds like a thing. Yeah. 

Dave Bittner: Yeah, frequency spread spectrum - great. 

Ben Yelin: Yeah. 

Dave Bittner: (Laughter) All right. Well, good stuff. Let's move on to my story this week. My story comes from the BBC. This is written by Jane Wakefield, and it's titled "Location Data Collection Firm Admits Privacy Breach." And this is from a company called Huq. It's H-U-Q, and I hope I'm pronouncing that correctly. It's sort of a double... 

Ben Yelin: That sounds right. 

Dave Bittner: It's a double whammy because it's a U.K. company, and so they don't always pronounce things the same ways that we do. Also, let me just say... 

Ben Yelin: We fought a Revolutionary War on that, I believe. 

Dave Bittner: (Laughter) Right. Exactly. Right - over the correct pronunciation of aluminum. 

Ben Yelin: Exactly. 

Dave Bittner: So Huq is a firm that deals in the brokering of location data. They claim that they do all of this anonymously. Just, you know, a side note - as you and I have spoken about quite often, particularly with location data, it is quite easy to de-anonymize. 

Ben Yelin: Sure is. Yep. 
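
To make the de-anonymization point concrete before we go on: here is a toy sketch, in Python with invented coordinates, of the standard re-identification trick researchers have demonstrated. Cluster a trace's overnight pings to infer "home" and its working-hours pings to infer "work"; a home/work pair is unique to one person in almost all cases, so the data never needs to contain a name.

    from collections import Counter

    # Hypothetical "anonymous" trace: (hour_of_day, lat, lon) pings,
    # rounded to roughly 100 meters. No name, no device ID.
    pings = [
        (2, 39.287, -76.612), (23, 39.287, -76.612), (4, 39.287, -76.612),
        (10, 39.289, -76.625), (11, 39.289, -76.625), (14, 39.289, -76.625),
    ]

    # The most common overnight spot is almost always home...
    home = Counter((la, lo) for h, la, lo in pings if h >= 22 or h <= 6).most_common(1)
    # ...and the most common working-hours spot is almost always work.
    work = Counter((la, lo) for h, la, lo in pings if 9 <= h <= 17).most_common(1)
    print("likely home:", home[0][0], "likely work:", work[0][0])

From there, a property records or voter roll lookup on the inferred home address typically yields a name, which is why "we only collect anonymous data" is such a weak guarantee for location data in particular.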

Dave Bittner: But this company recently released news that two firms they gathered data from had what they refer to as technical breaches of data privacy requirements. And these were fairly ordinary, the types of apps that gather your location data - one was a Wi-Fi strength meter, and another scans barcodes. All right - so utilitarian apps. 

Ben Yelin: Sure. 

Dave Bittner: And I think we see this a lot where these utilitarian apps - they have one function. They have one job. 

Ben Yelin: You had one job, utilitarian app. 

Dave Bittner: Right. But behind the scenes, they're almost a Trojan horse to get on to your device to be able to report back information to these data brokers. 

Ben Yelin: Yeah. 

Dave Bittner: And in this case, they were sharing location data despite users saying that they did not want their data shared. So the U.K.'s Information Commissioner's Office is looking into this. Obviously, there's the possibility of a GDPR violation here. 

Ben Yelin: That's how it would apply to countries outside of the U.K. 

Dave Bittner: Yeah. 

Ben Yelin: But because of Brexit - right? 

Dave Bittner: Oh, right, right. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. That's interesting. Yeah. This article... 

Ben Yelin: I think there would have to be a separate cause of action within the U.K. from the U.K. Information Commissioner's Office. 

Dave Bittner: Right, right. That's an excellent point. Yeah, I hadn't considered that. So they say that these apps have fixed the problem that they had, but I approach this with a heaping helping of skepticism. I don't think there's a whole lot of effort on the part of the data-gathering companies to go through and audit the apps that are providing them with this data. And call it a hunch, Ben (laughter), but I suspect that's the side they come down on. What is your take here? 

Ben Yelin: Yeah. I mean, I think it's a hunch, but it's also a well-founded hunch. You know, they say that as soon as they were made aware of what they're calling technical breaches, they rectified their code. They republished the apps. Everything was taken care of. My question is, how often are they looking under the hood? Because if you're a data broker firm, you're collecting location information from probably hundreds of different applications... 

Dave Bittner: Right. 

Ben Yelin: And each of those applications has its own EULA, you know? Are they only making these decisions when they are caught in a public forum and when they're facing possible reprimand through GDPR or through the relevant U.K. office? I think that's what I would worry about here: all of this happens after there's already been a recognized breach. 

Dave Bittner: Right. 

Ben Yelin: And I think that's, you know, something that consumers should be wary of. Just by, you know, the nature of data brokerage, companies are going to have incentive to collect as much information as possible, you know, because that's how they make their money. 

Dave Bittner: Yeah. 

Ben Yelin: You know, the bigger universe of location information, the more valuable it is. 

Dave Bittner: Yeah. 

Ben Yelin: So they don't want to get caught having cut corners or having collected location information where a user has specifically denied them permission to do so. 

Dave Bittner: Yeah. 

Ben Yelin: So while they're saying, you know, this one time, we took care of the problem - we fixed the code, download the newest version of the app, everything will be fine - I think it requires us to be more skeptical of these firms in the first place and to realize that, because of the incentive structure, you're going to run into these difficulties. 

Dave Bittner: Yeah. And I would add, you know, for our listeners' sake, that these single-use apps, you know - and I think the most notorious of all of them are flashlight apps, right? Like, press a button, turn on your flashlight. And... 

Ben Yelin: Yeah. You think that's pretty innocuous, right? 

Dave Bittner: Right. But historically, they gather all sorts of things behind the scenes and send them back. But these single-use apps - you know, I'm in the midst of doing something. I need something that's going to scan a barcode. I need something that's going to - you know, all sorts of little single functions. And it's easy to go to the App Store, find the thing that does that, download it and use it for what it's there for. It probably does a great job at the thing that it says it's going to do. But then the problem is, you forget about it. 

Ben Yelin: Yeah. 

Dave Bittner: And that app is sitting on your device now. And in the background, it's just doing its thing, gathering... 

Ben Yelin: Collecting all of that location information. Yep. 

Dave Bittner: Right. Now, to their credit, some of the mobile device providers - Apple, for example, in some of their recent releases - track this. And you'll get a pop-up that says, hey, you know, the flashlight app has been requesting your location data here for... 

Ben Yelin: Why is the sandwich app - you know, you just wanted a free sub six weeks ago - why is it still tracking my location data? 

Dave Bittner: Right. Exactly. And so they point that out and make you aware of that activity. And they say, hey, do you want me to still allow that app to do that? 

Ben Yelin: Do you want to not do this? 

Dave Bittner: Right. But I would say it's worth being vigilant and taking it even to the next step, where when you're done with that app, just delete it. 

Ben Yelin: Yep. 

Dave Bittner: Delete the app. It's always going to be there on the App Store next time you need it. But if it's not something that you're using regularly - if it's a one-time use thing, when you're done with it, get rid of it. 

Ben Yelin: Yep. 

Dave Bittner: Don't just be a pack rat and hoard these things because you never know what they're doing behind the scenes. And chances are, it ain't good (laughter). 

Ben Yelin: Yeah. I mean, and they mentioned some of the types of applications for which location data was being collected - flight tracking, weather, Muslim prayer apps. 

Dave Bittner: Right. 

Ben Yelin: Those are among the types of applications where information was being sent to this company. 

Dave Bittner: Yeah. 

Ben Yelin: So it's things that really - you know, all of us use at least some of those types of applications. 

Dave Bittner: Right. 

Ben Yelin: They're all on our devices. Something like flight tracking is a great example. I mean, unless you fly regularly... 

Dave Bittner: Right. 

Ben Yelin: ...Or, you know, you're an aviation hobbyist - flight tracking is something you only need once in a while. Download that flight tracking app. Figure out when, you know, your grandparents' flight is going to arrive so you can pick them up at the airport on time. Pick your grandparents up at the airport. Delete the app until the next time you need to pick somebody up. 

Dave Bittner: Yeah. 

Ben Yelin: I think that's the safe practice. That's the way to minimize your risk. 

Dave Bittner: Yeah. Absolutely. All right. Well, we will have a link to all of our stories in the show notes. Again, that one comes from the BBC. 

Dave Bittner: We would love to hear from you. If you have a topic you would like us to cover, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Boris Segalis. He is from the Goodwin law firm. We discussed some possible proposals from the SEC that would require companies to standardize their cybersecurity systems and to monitor their digital risks. Here's my conversation with Boris Segalis. 

Boris Segalis: I think that this is not really about the SEC in particular. There's been a lot of focus on cybersecurity and a kind of hunt for solutions - how to deal with it across industries and as a nation. And in this push - and we can talk about whether it's going to work or not - different agencies, including the SEC, have become more active in trying to find their own way to get the industries that they regulate to be better about cybersecurity. 

Boris Segalis: So the SEC has its own proposals. The New York Department of Financial Services has cybersecurity regulations, and they've become more active in enforcement. And I think we just saw news from the DOJ that they're going to ramp up enforcement of cybersecurity by holding companies responsible for not reporting breaches or for providing defective cybersecurity products and services. So you've got this effort to address cybersecurity from all angles, and every agency is taking steps to do it. 

Boris Segalis: And I think from the SEC's perspective, when they are looking at broker-dealers, for example, they just had an enforcement action - I think it was six or eight companies. And they found that, I think, six of the eight - I don't have the numbers in front of me - had policies and procedures that called for something like multifactor authentication, but it wasn't implemented at all. So they didn't follow their own procedures. And others didn't respond appropriately to vulnerabilities, or didn't know about vulnerabilities, or provided misleading notifications. That's what the SEC alleged. 
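
For readers outside the space, it may help to see what "implementing MFA" can boil down to mechanically. Below is a minimal sketch of a time-based one-time password check - TOTP, per RFC 6238, one common second factor - using only the Python standard library. It is illustrative only, not production guidance, and not drawn from the SEC action itself.

    import base64, hmac, struct, time

    def totp(secret_b32, at=None, step=30, digits=6):
        """Derive an RFC 6238 one-time code from a shared base32 secret."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = struct.pack(">Q", int((time.time() if at is None else at) // step))
        digest = hmac.new(key, counter, "sha1").digest()
        offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    def verify(secret_b32, submitted, window=1, step=30):
        """Accept codes from adjacent time steps to tolerate clock drift."""
        now = time.time()
        return any(hmac.compare_digest(totp(secret_b32, now + i * step), submitted)
                   for i in range(-window, window + 1))

The crypto is a dozen lines; the hard part - and, per the SEC's allegations, where the firms fell down - is enrolling every user, handling lost devices and flip phones, and actually switching it on.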

Boris Segalis: Now, they're then trying to take steps to deal with this problem. But you know, on many levels, I think that the efforts are probably misguided and won't lead to better cybersecurity. 

Dave Bittner: Why do you take that position? What do you suppose is misguided about them? 

Boris Segalis: So let's take the SEC in particular. In this enforcement action, they found that a number of companies had required MFA - multifactor authentication - but they weren't really using it. If you're an outside observer, you'll say, wow, that's, like, a big deal - these companies really screwed up. But if you're in kind of my space - if you focus on privacy and cybersecurity and you see a lot of companies across different industries - you would recognize that the maturity level of cybersecurity across many, many U.S. business sectors is low. 

Boris Segalis: There are many reasons why it's low - not understanding the regulations, not having the resources, not understanding how to direct resources. So there's nothing unusual about what the SEC found, and probably nothing that alarming. But to an outside observer, it's a bigger deal. Obviously, it's clear from the news that we've got a cybersecurity problem as a nation. And so how do you approach it? You know, this is where it's important to separate governance from politics. 

Boris Segalis: Compare it to the pirate problem off the coast of Africa - right? - that existed for some years. The solution to that was to work with the ship owners, give them new routes, make sure they're educated about the threat, and then send the navies to patrol the waters and act as a deterrent to the pirates. 

Boris Segalis: Another solution would have been to say, OK, well, these ships are sailing by the coast of Africa - let's just tell them that they need to buy their own weapons. They just need to deal with that problem on their own. You know, maybe we'll share some intelligence with them, but pretty much otherwise they're on their own. And guess what - if they get hijacked and the ship is taken, the liability is really on the operator for not having had enough guns, or the right kind of guns, to prevent the hijacking. 

Boris Segalis: You would have said that's a crazy idea, right? But that's what the regulators are kind of suggesting. I mean, they're saying, let's hold the companies responsible, or give them some kind of baseline that's sometimes arbitrary - like using MFA - to improve security. But using multifactor authentication is maybe only good for today, for a particular threat. And guess what - when you go on your investment account, or whatever account you visit during the day on your computer, you want the convenience of access, not constantly having to log in to your services and do multifactor authentication. So unless it's something everyone is doing, creating that inconvenience for consumers - consumers don't like it. 

Boris Segalis: So this idea that we can shift this responsibility to companies that are, you know, struggling with this - because there's no clear solution - and create kind of arbitrary, low-hanging-fruit requirements, like multifactor authentication or this or that liability, is just not going to work. It hasn't worked. It's not going to work. 

Boris Segalis: I mean, the solution is really - it's a war. It's a cyber war. And you need an army - right? - collaboration, working together, giving companies immunity, education, resources - to make it go away. And today, what we're seeing in the news constantly is more rules and more enforcement. Maybe that's not a conventional view, but from what I see in my practice, I doubt that this approach is actually going to work. 

Dave Bittner: You know, I wonder if a comparison to something like public health is useful. You talk about something like multifactor authentication - I mean, is that the cyber equivalent of expecting a doctor to wash their hands? There are basic standards of care, and we talk about cyber hygiene. How unreasonable is it to expect certain baselines of protecting people's data? 

Boris Segalis: So I think it's established scientifically - there have probably been a million research papers written on it - that a doctor washing their hands really reduces the risk of disease by, like, 90%, right? 

Boris Segalis: But multifactor authentication - I mean, we take that, but these are fleeting safeguards, right? They are good for something that maybe exists today, but hackers change their methods every day. And the type of attack that's at issue also changes every day. It's always evolving. 

Boris Segalis: So certainly - I mean, I'm not advocating against companies using multifactor authentication. It's just not always feasible. It's not always proportionate to the inconvenience to consumers, or sometimes it's detrimental to the business. 

Boris Segalis: Let's say you have a service - and this is maybe not the SEC space, right? - that allows low-income workers access to financial services, like money transfers or getting their pay on a bank card. It requires an account. And let's say research establishes that most of those workers don't have an email address, and maybe they use a flip phone, not a smartphone. So you can't really implement multifactor authentication in a convenient way, because that would deny them access to the service. 

Boris Segalis: Now, that's a real example - one example. But it is to say that, generally speaking - and I think the regulations actually have this right - the security measures that companies need to take have to be proportionate to their risk and proportionate to their situation. 

Boris Segalis: But even if they do that, there's no guarantee of withstanding an attack. I mean, the things that work are sometimes things like endpoint monitoring software - knowing early that an attack occurred, things like that. I've seen that have an impact. Or, you know, backing up your data as a safeguard against a ransomware attack, even though that's also of limited utility. It's a morass. I mean, it's like the regulators are looking for easy solutions, but there's no easy solution. 

Boris Segalis: And it's very frustrating to see it because for years, I think it's been clear that the answer is to find a way for the government and the companies to work together. And in fact, the government has always asked the companies - like, whenever I've been in presentations, you know, you've got DOJ and others asking companies to collaborate more on incident investigations because, you know, the government has the ability to go after these hackers. 

Boris Segalis: Now, never mind that companies don't really care who the hackers are. They just want to deal with the business interruption and legal liability problem. But the impediment to that has always been companies feeling that if they have an incident, they don't want to get the government involved because they don't want the incident to become public because they're afraid of litigation. 

Boris Segalis: And companies across industries - let's make it clear - they're victims of these hackers. Even if they didn't have MFA or didn't follow their own procedures, they're still victims of these hackers. You know, cybersecurity is not the primary business of any of these companies, and they should be protected from breach litigation. 

Boris Segalis: I mean, that's part of the puzzle. And maybe that's politically difficult. But you need protections for companies to encourage collaboration - working together across industries and with the government, sharing resources and knowledge freely to counter the threat. If you kind of vacillate between scaring companies with liability and then telling them we have to work together, it's just not going to work. 

Dave Bittner: So where do you suppose a compromise could be on the reporting side of things? If companies are obligated on some level to share that a breach has occurred, is there a way they can do that with the government and still not reveal some of the things they're afraid to reveal? 

Boris Segalis: I mean, I think companies do report some limited information, but there's no full cooperation. I don't know that there's a middle ground. But the threat is an opportunity - and a requirement - for a new industry to emerge in the United States. It's a huge opportunity. And that industry is cyberthreat protection, but in a different way. 

Boris Segalis: What we have today is an industry focused on incident investigation and incident reporting. That's a very low value-add to society. It's great for lawyers. It's great for vendors who work in this space. But I don't think that a consumer being notified, you know, 100 times about different breaches that implicate the same information does them any good. I mean, we've gotten to the point where your information has been stolen 10 times over. So you've got this kind of cottage industry built around these requirements. 

Boris Segalis: What discourages companies? Well, you've got legal requirements to report. You've got lawyers like myself who look at those requirements, and we advise companies whether to report, or not to report, or how to report. In that process, we make all sorts of risk determinations, because some of these requirements contain harm thresholds and other gray areas that allow those determinations. And, you know, companies try to do the right thing. 

Boris Segalis: But certainly, if we take the California law, the CCPA - the CCPA introduced a private right of action for breaches in California. And what you've got is, if you have more than 500 California residents affected, you have to report to the California AG. Once you do that, the notice becomes public. And once that happens, there's a plaintiffs' bar waiting for those notices to file litigation. 

Boris Segalis: Is that helpful to society? No. But it certainly becomes a major factor in how companies look at breach reporting. And there's going to be a gray area. If you're a lawyer - outside counsel or in-house counsel - representing a company, you're going to think about how to interpret that requirement in light of the possibility of that class action, in California and in the other states that follow it. There's no way around it. So that's counterproductive. That's something that discourages companies from doing more. 

Dave Bittner: To be fair, wouldn't that encourage the companies to put more effort into protecting the data at the outset to prevent the breach? 

Boris Segalis: Well, that assumes it's somehow possible - that there's a formula for doing that. Companies are already doing that. I mean, I've never met a company - and I've practiced exclusively in the privacy space since 2007 - that took the view, oh, we don't care about security, let's just do whatever. I've never met a company like that. They always want to do the right thing within the resources that they have. 

Boris Segalis: So this idea that you've got these companies that just don't care about the security of information - it's not real. I mean, you've got some mom-and-pop shops, certainly, and maybe doctors' offices, that simply don't have the resources to do more. But, you know, they rely on other tools - they rely on major cloud providers and major email service providers that have their own security. The idea that companies are just kind of flippant about security is not true. 

Boris Segalis: And as the organization gets more complex, it becomes a much, much more complicated task as well. Because think about this: what is security, if you think about it at a high level? How would a company go about building an information security program? There's a lot you need to know about what you do as a company. 

Boris Segalis: Because what is information security, right? You've got to conduct a risk assessment to identify the risks to the data and systems, identify the higher-risk areas, and then apply appropriate controls to those higher-risk areas. Well, to conduct that risk assessment - which is itself more art than science, I would say - you've got to know what you're conducting the risk assessment against. That means you have to have a very, very good understanding of how your company processes data, what systems you use, what vendors you use, and what data those vendors get. And as organizations get bigger, that's just not easy to know, because it's not something they need to know at that level of detail to operate the business. So much of the business is integration - integration with different services, integration with different cloud providers. 

Boris Segalis: And to do security right, you kind of have to take a step back and write it all out - effectively put your whole environment on a piece of paper - and then try to go through this process of risk assessment. Not only that, whenever you go through that process, it only gives you a snapshot of your organization. It's a moment in time. So you have to keep repeating this process. 
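
A hedged sketch of the loop Segalis is describing - inventory, score, prioritize, repeat - with invented example data. The point is not the few lines of Python; it's that assembling the inventory is the genuinely hard step, and that the output is stale the moment the business changes.

    from datetime import date

    # Invented data inventory. In practice, enumerating every system,
    # vendor, and data flow is the hard, perpetually out-of-date part.
    inventory = [
        {"system": "CRM",     "vendor": "cloud CRM",    "data": "customer PII",  "likelihood": 3, "impact": 4},
        {"system": "payroll", "vendor": "payroll SaaS", "data": "SSNs",          "likelihood": 2, "impact": 5},
        {"system": "wiki",    "vendor": "self-hosted",  "data": "internal docs", "likelihood": 3, "impact": 2},
    ]

    # Crude likelihood-times-impact scoring; real assessments are,
    # as he says, more art than science.
    print("risk snapshot as of", date.today())
    for a in sorted(inventory, key=lambda a: a["likelihood"] * a["impact"], reverse=True):
        print(f'  {a["system"]:>8}: risk {a["likelihood"] * a["impact"]:>2}  ({a["data"]} via {a["vendor"]})')

Apply controls from the top of that list down, then re-run the whole exercise - because, as he notes, any single pass is only a snapshot.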

Boris Segalis: So at the end of the day, the only organizations that can do that, and have the resources to do that, are large banks. If you look at a large bank, they have hundreds of people working in cybersecurity. They will have, you know, a 600-person department working to detect security breaches, or something like that. An incredible number of people - that's what it takes to do it well. 

Boris Segalis: So to expect smaller businesses, emerging companies and others to do it is unreasonable, even though they want to do the right thing, right? So there's a disconnect in that - that, you know, we're always punishing companies that want to do the right thing. 

Dave Bittner: Though, it's interesting. I mean, I sense a certain frustration or resignation on your part that this isn't easier, that there's no easy solution here. 

Boris Segalis: I'm an outside lawyer. Whatever the rules are, I help companies deal with those rules. So the SEC comes out with a new guidance. They're going to require MFA. They require this and that. OK, sure. I'll help companies deal with that. 

Boris Segalis: But if you ask for my opinion on whether these rules work - and, you know, nobody cares about my opinion on that, just like nobody cares about my opinion on whether the GDPR makes any sense - it's there. It's the law of the land. You have to comply. 

Boris Segalis: But since here we're talking about the substance, I think that there's a difference between what's easy to do politically - which is to put liability on these companies and kind of suggest that it's their fault - and what would actually work. And again, we talked about that example in the SEC enforcement actions. To the outside observer, it's clear these companies screwed up, because their policy said MFA and they didn't have it. But the reality is more nuanced. And it is frustrating, because the solution others have talked about - working together, building an army, building an industry, and not blaming the companies - politically, that's really hard to accomplish. 

Dave Bittner: All right, Ben. Interesting conversation, huh? 

Ben Yelin: Yeah, really interesting. You know, I think it's both encouraging and fascinating that the SEC is starting to look at companies' practices in protecting data and private information, and to require certain cybersecurity standards. 

Dave Bittner: Yeah. 

Ben Yelin: I think that's not a role we've traditionally seen for the SEC, whose role is more economic in nature. And, you know, I think them venturing into this area is really interesting. 

Dave Bittner: Yeah. Absolutely. All right. Well, again, our thanks to Boris Segalis for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.