Caveat 1.8.20
Ep 10 | 1.8.20

The quiet, behind-the-scenes work of the FBI.

Transcript

Jason G. Weiss: Health care is definitely one of the more dangerous areas that we have to be concerned about when it comes to cybersecurity and cyberattacks. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I share a Washington Post story about the data your car may be collecting about you. Ben digs into some recent revelations about government surveillance. And later in the show, my interview with Jason G. Weiss. He's a former forensic expert with the FBI, and he's currently counsel at Drinker Biddle & Reath where he focuses on cybersecurity and privacy law. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be back after this word from our sponsors. 

Dave Bittner: And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent the recurrence of a single non-repeatable event. Others say it's a way the suits play CYA. Still others say it's whatever happens to reside in those binders the consultants left behind right before they presented their bill. How about a more positive approach? As KnowBe4 can tell you, better policy means better security, and getting the policies right is a big part of security; so is setting them up in ways that your people can actually follow. We'll hear later in the show how you might approach policy. 

Dave Bittner: And we are back. I'm going to start things off for us this week, Ben. Interesting story from The Washington Post. This was titled "What Does Your Car Know About You? We Hacked a Chevy to Find Out." It's written by Geoffrey A. Fowler. And what they did was they dug into the sorts of details that a modern car collects on the driving habits of its users. 

Ben Yelin: Spoiler alert - a lot. They collect a lot. 

Dave Bittner: (Laughter) That's right. That's right. As background here, most modern cars that you buy today include some kind of internet connectivity. If it's not built into the car - its own Wi-Fi hotspot or, as most of them have, a 4G connection - it can get that connectivity via your own mobile device. So, for example, I bought a car in the last year, and it doesn't have its own connectivity, but when I plug my phone into it, it gets its connectivity from my phone. 

Ben Yelin: Right. 

Dave Bittner: The other thing is the first thing that my car does when I plug my phone in is it starts importing all of my contacts from my phone. And this is so that if I receive a call, the contact information pops up on the screen, which is convenient, but also I don't remember giving it permission to do that. 

Ben Yelin: You certainly did not. Yeah. It also knows what music you like. 

Dave Bittner: Right. Right. And this article goes into all the data that the car's collecting about our driving habits because modern cars have all of these sensors built in. They know how fast you're going. They have G-force sensors. They measure the temperature. They have GPS sensors. So much of this data is being logged, and a lot of this data is being sent back to the manufacturers. And these manufacturers are selling this data to third parties. The manufacturers say that they anonymize the data. I will tell you that experts on these things often point out that it can be routine and not that hard to deanonymize data if you know what you're doing. 
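
A minimal Python sketch of why that deanonymization can be straightforward - the record layout here is invented, since automakers don't publish their telemetry schemas: even a trace with no name or VIN attached points to a single household, because the grid cell where the car parks overnight is almost always home.

```python
from collections import Counter

def infer_home(points):
    """Guess the driver's home from an 'anonymized' GPS trace.

    points: (hour_of_day, lat, lon) tuples -- a hypothetical layout.
    No name or VIN is needed: the cell where the car sits overnight
    is, for most people, their home address.
    """
    overnight = Counter(
        (round(lat, 3), round(lon, 3))       # ~100 m grid cells
        for hour, lat, lon in points
        if hour >= 22 or hour <= 6           # parked-overnight samples
    )
    return overnight.most_common(1)[0][0] if overnight else None

# Three overnight fixes already converge on one block.
trace = [(23, 39.2904, -76.6122), (2, 39.2903, -76.6121), (5, 39.2904, -76.6123)]
print(infer_home(trace))  # (39.29, -76.612)
```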

Ben Yelin: Right. 

Dave Bittner: So that raises some issues as well. But this article really digs into the amount of data that's being collected, the fact that the automotive manufacturers aren't obligated to really share with you what they're doing. And there isn't a whole lot of transparency here. 

Ben Yelin: There's not. You know, we should start out by saying there's no federal law in place that protects you from your automaker collecting this type of information. And the information is beyond just the contacts in your phone. One thing they were able to do through hacking into this infotainment system was get access to a lot of real-time location data or historical location data - so the gas stations that you have been to in the past several weeks. You start to think about how valuable that evidence might be to, say, a police department if, you know, you've been going around committing a bunch of crimes. And we've seen cases come out of this in the past. It's certainly made me happy that I have not yet bitten the bullet and gotten rid of my early 2010s non-connected vehicles. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: Now, from a legal perspective, you know, what can you do policywise? California obviously has gotten off to the most promising start by passing their privacy statute, which, by the time you listen to this, will be going into effect in about a week. And that privacy statute says that any company collecting your personal data must give you, the user, access to what data is being collected. And GM, which is the manufacturer of the Chevy at issue in this article, says that they are complying with the California law, although we don't know exactly what that means. But in other states, it's kind of the proverbial Wild West. There are very few limits on, A, what they can collect about you and, B, whether there's any transparency as to exactly what they're collecting. And I think that presents major privacy concerns, particularly because, eventually, we're not really going to have much of a choice, right? So, you know, my Honda Fit with a million dings in it is going to die at some point. 

Dave Bittner: Right. Right. 

Ben Yelin: It's going to be, you know, a real bummer, but I'm going to have to get a new vehicle. And if you go 20 years out, all these vehicles will be internet-enabled. And in order to drive a car, which is one of the most basic things we do, we're going to be forced to submit a lot of very personal data about ourselves. 

Dave Bittner: Yeah. One of the interesting items they had in this story was they bought one of these infotainment computers used off of eBay for a few hundred dollars, and they were able to extract from that used computer all sorts of information about a total stranger - where they traveled... 

Ben Yelin: She likes to call her husband sweetie. 

Dave Bittner: Yeah. From the phone information, they knew there was someone they called, and they also had the photo of the person who was called sweetie. They could see gas stations where they bought gas, restaurants where they ate and some unique information about the phone they were using. One of the things that struck me - and I'll admit this just sounds like something out of a movie script - but, you know, we're all very mindful of our phones and not losing our phones and so on and so forth. You know, I don't know about you, but when I go to sleep at night, my phone is sitting on the nightstand next to my bed... 

Ben Yelin: Same here. 

Dave Bittner: ...Charging. I think that's probably the case for most people. But my car is sitting out in front of the house. And so I thought about it: if my car contains all of this information about me and someone wanted that information - and, again, I know this is out of a movie - rather than having to go after my phone, maybe going after my car would be a good first place to look. While I'm asleep, you know, you can hook up to my car, or take my car away and bring it back before I wake up, and gather all sorts of information that I volunteered to my car via my phone. 

Ben Yelin: Yeah. Now, one thing this article does say is that whole process of obtaining that information would require both, like, physical access to the car for a long time... 

Dave Bittner: Right. 

Ben Yelin: ...And more than just general public knowledge about how to hack into one of these systems. So you'd need to have some expertise. If there are expert car hackers driving around your neighborhood, that could be a cause for concern. You know, another thing from a legal perspective is we, of course, come back to what we've talked about a million times on this podcast, which is that third-party doctrine. 

Dave Bittner: Right. 

Ben Yelin: This is the idea that a person does not have Fourth Amendment rights - rights against unreasonable searches and seizures - if they have voluntarily conveyed information to a third party. And that's, on its face, what's happening here. I mean, you probably signed some sort of policy when you purchased the car. Certainly, if you use, like, an OnStar system, you've agreed to their terms and conditions. And you are voluntarily conveying a lot of information to them. 

Dave Bittner: Yeah. 

Ben Yelin: And what the third-party doctrine says is the government can obtain that information without getting a warrant. So, you know, if they even have an inkling, just some sort of reasonable suspicion that you've been going around on a crime spree, they can go to GM with a subpoena and say give us data on all of the locations Dave has been in the last year. 

Dave Bittner: Right. 

Ben Yelin: And you wouldn't need any sort of traditional warrant to obtain that information. This, to me, is why the third-party doctrine seems outdated and limited. For one, it's not really voluntary because, as I said, eventually we're all going to have connected cars. 

Dave Bittner: Right. In order to be a functioning member of society, you're going to have to have these sorts of connections. 

Ben Yelin: Exactly. And in terms of the specific information we share, the most recent case dealing with this was Carpenter v. United States. In that case, the Supreme Court said that historical cell site data did have Fourth Amendment protection because of the broad nature of the data collected and the fact that it wasn't really conveyed voluntarily, because a person is not actively pressing a button to share their location data. 

Dave Bittner: Right. 

Ben Yelin: That seems to be exactly analogous to what's going on here. It's not like you press the share my location button. 

Dave Bittner: Right. Right. You're tapping it every 10 minutes... 

Ben Yelin: Exactly. That's just not what happens. 

Dave Bittner: ...Or five seconds or whatever. Yeah. 

Ben Yelin: It's collecting that information from you whether you know it or not as soon as you connect to this car. So this is just another instance where I think that entire legal doctrine needs reconsidering in an age where we submit so much to third parties that could reveal every intimate detail about our lives. And this just to me is your prototypical example of that. 

Dave Bittner: Something I'm going to quote here from the article that I found kind of humorous in its circular sort of logic. And I'll ask you to unpack it. From the article, it says GM's privacy policy, which the company says it will update before the end of 2019, says it may, quote, "use anonymized information or share it with third parties for any legitimate business purpose" - end quote. Such as whom? Quote, "the details of the third-party relationships are confidential" - end quote (laughter). 

Ben Yelin: Yeah. I love that. I mean, so there are a couple of things that they could be trying to say there. 

Dave Bittner: Right. 

Ben Yelin: One is, like, theoretically, when they sell your information to private third parties... 

Dave Bittner: Yeah. 

Ben Yelin: ...There's going to be some benefit for you because it will probably improve the user experience, you know, one way or another. And that's ostensibly why they collect this data in the first place. 

Dave Bittner: Right. Right. 

Ben Yelin: So that's good, I suppose. One thing it could also be saying is we're not going to hand over stacks of data to the government, but if they come to us with a valid subpoena, we're going to comply with them. We don't want to get in trouble with a local police department or the federal government. 

Dave Bittner: And that's the law, right? I mean, they're obligated to do that, so... 

Ben Yelin: Absolutely. And you can understand why they want to keep these relationships confidential, because I think if enough users discovered how much information was being shared, those users might think twice about purchasing these vehicles. They might go to the used car lot and buy that, you know, 1992 Toyota with... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...You know, Coca-Cola stains in the back seat. 

Dave Bittner: Well, yeah, yeah, I mean, it's a funny thing to ponder, you know? Could we be coming to a time where either there's a market for old cars that aren't connected, or, you know, I could see going to some back-alley chop shop where you go see the folks who have the skills to anonymize your car and stop it from transmitting and, you know, allow you to operate within the shadows (laughter). 

Ben Yelin: The coolest job that will exist in the next several years. I mean, one thing this article does mention is that if you plug in your device, as you said, they're going to collect that data as soon as you connect it to the USB port. And there is an app for you to wipe that data clean when you're done using the car. So, for example, if I go on a work trip and I'm renting a car, I plug in that iPhone, it has access to all the conversations I've had over text message, my terrible music tastes, you know, some embarrassing songs in there probably... 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: ...And every location that I've been to on that trip. So there is an app - they mention this in the article - called Privacy4Cars. It gives model-by-model directions on how to delete that data permanently after you're done using that vehicle. So there are workarounds, and there are going to be more workarounds. You'd think that eventually the market will start to correct itself. If this becomes a big enough problem, then some enterprising person will develop security features that make sure the data is anonymized and perhaps minimize the data that's collected in each individual vehicle. 
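
As a sketch of what that data minimization might look like in code - all field names here are hypothetical, since no automaker schema is public - the idea is simply to strip identifiers and coarsen anything a given use case doesn't need at full precision.

```python
# Hypothetical telemetry record -- real automaker schemas are not public.
record = {
    "vin": "1G1ZD5ST8JF123456",
    "lat": 39.29042, "lon": -76.61219,
    "speed_mph": 61.4,
    "timestamp": "2019-12-20T14:03:55Z",
    "paired_phone_id": "ab12cd34",
}

def minimize(rec):
    """Keep only what a road-condition or diagnostics use case needs:
    coarse position, rounded speed, hour-level time -- and drop the
    VIN and phone identifiers entirely."""
    return {
        "lat": round(rec["lat"], 2),       # ~1 km resolution
        "lon": round(rec["lon"], 2),
        "speed_mph": round(rec["speed_mph"]),
        "hour": rec["timestamp"][:13],     # "2019-12-20T14"
    }

print(minimize(record))
# {'lat': 39.29, 'lon': -76.61, 'speed_mph': 61, 'hour': '2019-12-20T14'}
```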

Dave Bittner: Yeah. All right. Well, it's an interesting article, again, from The Washington Post. It's called "What Does Your Car Know About You? We Hacked a Chevy to Find Out." We'll have a link to that in the show notes. That's my story this week. Ben, what do you have for us? 

Ben Yelin: This article comes from The New York Times. It was released last week, written by Charlie Savage. There was this major report from the inspector general's office at the Department of Justice on the electronic surveillance of a former member of Donald Trump's presidential campaign, Carter Page. The way the process traditionally works is the FBI has to obtain a warrant from a secretive court - the Foreign Intelligence Surveillance Court - to conduct electronic surveillance. They put together an application, and the legal standard is that you have to have probable cause that the person you want to surveil is either a member of a foreign power or, more commonly, an agent of a foreign power. So they put together this application for Carter Page. It was accepted, and it was subsequently renewed. What came out in this inspector general's report - and this got a lot of publicity - is that a lot of the information that went into that application not only turned out to be false, but agents oftentimes omitted what would have been exculpatory information. They altered some of the data to make the application seem less favorable to Mr. Page. Obviously, this got caught up in a political controversy because Page previously worked for Donald Trump's campaign, although the surveillance, I believe, began after he had left the campaign. So this sort of became a story that got bogged down in our political blogosphere, or whatever it's called. 

Dave Bittner: Right. 

Ben Yelin: There is a broader lesson here, which I think is important to look at. We rarely have an opportunity to see what goes into these FISA applications. It is a secretive court. The proceedings are secret. The process is secretive. So we as a society are relatively trusting that the process is going to work, that there's not going to be arbitrary surveillance based on faulty intelligence. And now we have this high-profile case where there seems to have been an illegitimate process that led to surveillance that was not necessarily warranted by the facts. And this does not just apply to Carter Page. That's the lesson I think has to be taken away from this. I think now that we've looked under the hood, we have to reconsider the entire process in and of itself. The FISA Court itself is very trusting of the FBI. They have a close working relationship. They will oftentimes ask law enforcement to provide additional information or to edit their application for surveillance in order for it to be reviewed. But there is sort of this level of trust that the FBI is going to be submitting accurate information. And now that we've seen in this high-profile case that they don't always, I think that's going to be an opportunity for all of us to reconsider this entire process. 

Ben Yelin: It just came out yesterday that one of the FISA Court judges sent a letter to the FBI and basically said, in the most colloquial terms possible, this was a major screw-up on your part. We are deeply concerned about it. And by the beginning of January, you need to write us and tell us exactly what you're doing to make sure this does not happen again. And, you know, so I think that could have major implications for FISA going forward. 

Dave Bittner: In terms of how this is playing out and there being proper checks and balances and things in place, is this functioning the way that we would hope it would? Do you follow what I'm saying? 

Ben Yelin: Yeah. So it depends on how you would hope the system would function. I mean, we want proper surveillance applications to be accepted because this is a valuable counterintelligence tool. If somebody really is an agent of a foreign power and they're communicating with that foreign power, you know, even if that person is an American citizen, we would like to know about it. Theoretically, we have a process in place so this type of surveillance is not arbitrary. But there are flaws in the process. That's the bottom line here. One of the flaws is, as we've seen here, that the FISA Court has been what some would say overly trusting of law enforcement. Something like 99% of FISA applications are approved. And, again, this statistic might be a little misleading because there's often a back-and-forth, you know, before the application gets approved. But it is sort of known as being a rubber-stamp court. There's one particular reason for that. These proceedings are nonadversarial. So the government comes in and presents this application to the FISA Court: here's all the evidence we've got; give us this warrant for surveillance. There's nobody in that room representing the interests of Carter Page. There's nobody in there saying, look - the information you have here is false. We have this alternative documentation that has exculpatory information, and you've been disregarding it, ignoring it. The judge should decide based on the legitimacy of each side's case. That does not happen at the FISA Court. 

Ben Yelin: Congress passed a law a couple of years ago that said that the court can appoint what are called amici - friends of the court. So your standard privacy and civil liberties lawyers - maybe from the ACLU - will be appointed to come in and argue on behalf of the Carter Pages of the world... 

Dave Bittner: Right. 

Ben Yelin: ...Those who are being surveilled. But that provision only allows those sorts of proceedings on issues that present a novel interpretation of the law. And by definition, the vast majority of FISA applications aren't going to involve a novel interpretation of the FISA statute. So you're just not going to see that many proceedings that are adversarial. And if there is one benefit that comes out of this scandal and this inspector general report, perhaps there's going to be a reconsideration of having a system of nonadversarial proceedings. 

Dave Bittner: I can imagine the folks from the FBI who are looking for these warrants saying, you know, this is just going to slow us down. Time is of the essence. Overall, it's working the way it should. Let's just continue the way we're going here. 

Ben Yelin: Yeah. I mean, the national security apparatus - and largely for good reason - is going to defend the surveillance regime because it is efficient. You know, there is a reason we don't have the same process for foreign intelligence as we do for, you know, normal warrants for electronic surveillance, which require a hearing in front of a neutral judge, a public hearing, and a showing of probable cause that a person is actually committing a crime or, you know, is about to commit a crime. There's a reason that process isn't the same for foreign intelligence. We want to sort of get in on the ground floor before the foreign intelligence threat starts to develop. 

Ben Yelin: So in the example of Carter Page, we'd want to know that he was talking with Vladimir Putin before they actually planned something, you know, that would be against American interests. And I certainly understand that instinct, but when you have this potential for abuse, as we've seen here, I think, you know, it calls for some corrective measures. 

Dave Bittner: All right. Well, it'll be interesting to see how it plays out. Very timely and interesting story there, Ben (laughter). 

Ben Yelin: Yeah. You know, one thing I would say is there are a lot of people, for various reasons, who have been very trusting of the FISA process in the past, and this might be sort of the first time that they're even considering the flaws in the process, largely because perhaps they're sympathetic to Carter Page or to the president. That might just be the political opening the country needs to alter this process. So that might be a positive to come out of this. 

Dave Bittner: Yeah. All right. Well, it is time to move on to our Listener on the Line. 

(SOUNDBITE OF DIALING PHONE) 

Dave Bittner: Our Listener on the Line calls in with this question. Here it is. 

Unidentified Person: Hey, guys. If a police officer stops me on the street, am I obligated to provide my ID? What about if I'm in my car? Thanks. 

Dave Bittner: Interesting question. Ben, what do you make of this one? 

Ben Yelin: So it is a great question. There is a difference whether you are in a car or whether you are on the street. 

Dave Bittner: OK. 

Ben Yelin: In most states, when you're on the street, it is not a requirement that you show your ID, even if a police officer asks you. Now, that is not the case in every state - many do require you to provide an ID. But there is a major difference when it comes to vehicles, because as part of a vehicle stop, law enforcement has the right to ask for your driver's license as proof that you can be driving on the road. 

Dave Bittner: Right. 

Ben Yelin: So as a result, you are compelled to give them your driver's license in that instance. If you do not, you could face some sort of criminal sanction. So, you know, I would check your particular state law to see whether you are required to give ID on the street. In terms of vehicle stops, you have to have a driver's license to drive, and you have to present that if you're stopped by law enforcement in any capacity just because we want to make sure that people who are operating vehicles have a license to operate those vehicles. So in a vehicle, you are out of luck if you are trying to conceal your ID. You will have to show it. 

Dave Bittner: What about some of these videos I've seen online where folks are being kind of adversarial with the police? The people in the car saying, am I being detained? Am I being detained? Am I free to go? Am I free to go? You know, the officer is saying, I want you to pull over, and they barely have their window open and things like that, and they're sort of going round and round about it. What's going on there? 

Ben Yelin: So there's a difference between whether you're required to give them your ID and whether there has been some sort of valid traffic stop. The standard for traffic stops is incredibly low. You know, a broken tail light is justification for a traffic stop. 

Dave Bittner: Right. 

Ben Yelin: And once law enforcement has that justification for the stop, anything else they discover in the process of that stop is going to be, you know, admissible. You know, if they just smell drugs from your car or, you know... 

Dave Bittner: Yep. Yep. 

Ben Yelin: ...Saw cocaine, you know, strewn across the passenger seat... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...They might have just stopped you because of the broken tail light... 

Dave Bittner: Right. 

Ben Yelin: ...But anything they see in there is fair game. In other contexts, where you are not in a vehicle, it is always a legitimate question to ask whether you are being detained - do I have the right to leave? This benefits you in a couple of ways. If they say you are not free to go, then you are in police custody, which means, generally, they're required to read you your Miranda rights. And so you would be informed that you have the right to remain silent. I've watched enough "Law & Order" episodes to have this memorized. 

Dave Bittner: (Laughter) That's where you get most of your legal expertise, then, through watching TV? Yeah. 

Ben Yelin: Exactly, yeah. I slept through law school. But "Law & Order," that's where it's at. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: And they'll tell you you have the right to an attorney as well. Those rights do not attach if you're just chatting with a law enforcement officer. So if you are free to go, if you are not being detained, you probably should shut your mouth (laughter) if you think that something could come out that would incriminate you. 

Dave Bittner: That's what your legal rights are. And then there's the real world. I know for me, if a police officer stopped me on the street and said, hey, can I check your ID? - there have been times in the past when I didn't really think about it, and I was like, yeah, sure, here's my ID. You know, like... 

Ben Yelin: Yeah. I mean it's sort of - it's our natural instinct. They are people with badges and guns. 

Dave Bittner: Right. 

Ben Yelin: So we sort of instinctively want to comply with their orders and what we're being asked. And actually, what a defense attorney would tell you is if you really want to protect your rights, you don't give consent at any point because once you give consent, that gives the police justification to search you... 

Dave Bittner: Yeah. 

Ben Yelin: ...And to do whatever you've consented to. 

Dave Bittner: Yeah. Yeah. And I guess, I mean, I do realize that I have - and I suppose you also share this privilege of being a middle-aged white guy in a relatively affluent suburb where, at no point in my life, have I really had any sort of adversarial relationship with law enforcement. 

Ben Yelin: Absolutely. 

Dave Bittner: So my instinct when it comes to my interactions with them is probably different from a lot of people's. And so, you know, we just need to be sensitive to that. 

Ben Yelin: Now, I'm sensitive to the fact that you called me middle-aged. 

Dave Bittner: (Laughter) I'm sorry. 

Ben Yelin: But getting beyond that for a second... 

Dave Bittner: I speak for myself. Yes, I'm sorry. 

Ben Yelin: Yeah. You can't disentangle... 

Dave Bittner: A youngster like you and - (laughter). 

Ben Yelin: Thank you. Yeah. But you can't disentangle this from broader questions... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...Of bias among law enforcement... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which obviously exists, and it exists in our legal system, too. I mean, police are allowed to stop and frisk people under a reasonable suspicion that they have a dangerous weapon in, quote-unquote, "dangerous neighborhoods." The standard for dangerous neighborhood is often very racialized. 

Dave Bittner: Yeah. 

Ben Yelin: And this is something that has had a disparate impact. So, you know, as much as you'd like to say that rules are rules and that they apply universally, I think... 

Dave Bittner: Right. Yeah. 

Ben Yelin: ...You know, we recognize that in the real world they don't. 

Dave Bittner: Meanwhile - yeah. Meanwhile, back in the real world. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: And that's something - you know, you don't need to watch "Law & Order" for. You see it in the news every day, and people have personal experiences with it every day. 

Dave Bittner: Yeah. Yeah. All right. Well, thanks to our caller for calling in with that question. We do appreciate it. We would love to hear from you. Our "Caveat" call-in number is 410-618-3720. That's 410-618-3720. You can also send us an audio file. You can send it to caveat@thecyberwire.com. And send it in. Perhaps we'll use it on the show. 

Dave Bittner: Coming up next, my conversation with Jason G. Weiss. He's a former forensic expert with the FBI. He's currently counsel at a law firm where he focuses on cybersecurity and privacy law. We'll speak to him in just a moment. But first, a word from our sponsors. 

Dave Bittner: And now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound but that they're also distributed, posted and implemented. That's where the Policy Management module of their KCM platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that, KCM does the job at half the cost in half the time. It's your policy after all; implement it in a user-friendly, frictionless way. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Jason G. Weiss. He had a career with the FBI. He was a forensics expert. Currently, he is counsel at the law firm Drinker Biddle & Reath, and he focuses on cybersecurity and privacy law there. Lots of interesting stuff Jason has to share here. Here's my conversation with Jason G. Weiss. 

Jason G. Weiss: The biggest issue we always struggled with was getting people to report the crimes to the FBI, because a lot of companies are obviously afraid to report or make public the fact that they've been breached or hacked; they feel that will bring bad publicity to the company. And one of the things we worked very hard on with the victims was to reassure them that our job in the FBI was to help catch the bad guy, not to try and get in the newspaper and cause them any embarrassment. But obviously, people are nervous when law enforcement comes in. You know, I'd say that was probably one of our biggest challenges, because I think we brought a lot of value to the process, but sometimes there were companies out there that would rather take a loss than risk making something go public. 

Dave Bittner: Yeah. It's interesting to me. I almost feel sometimes like the FBI doesn't necessarily get the credit it deserves, because so much of the work that you all did when you were with the agency is sort of behind the scenes - quietly working on those cases and going after the bad guys. 

Jason G. Weiss: I think that's such a fantastic question, and I really appreciate you asking that particular question, because the one thing that always brought a level of indignation to the FBI - and I'll give you an example from the terrorism world or the health care world; it really doesn't matter - is that we almost never got credit for the cases that we were successful on, which is about 95% of the cases that we work. Most of the time when you hear about the FBI, it's because we've made a mistake or we didn't catch somebody. Bad news tends to make the news a lot better than good news. 

Jason G. Weiss: So I think your question is absolutely appropriate in the sense that you tend to only hear about mistakes or failures, not about, like, all the successes we've had. And terrorism is a great example. For every 20 cases that we solve, you know, if one person does get through and blow something up, that's all you're going to hear about for weeks and weeks and weeks, not the 20 people we took off the streets and the attacks that we prevented. 

Dave Bittner: We hear from the sort of cybercriminal side of things that a big part of the explosion in cybercrime has been that a lot of these criminals feel as though they can operate with impunity - that they're going to be able to do the things that they're going to do and no one's going to come after them, they're not going to get caught. But is that itself a misperception? 

Jason G. Weiss: Absolutely. I mean, I can tell you - I've worked dozens and dozens and dozens of health care fraud-type cases, you know, during my career. We take medical fraud - we take health care attacks very, very seriously. And we work a lot of them, and we catch a lot of people. Once again, the ones we do well on don't always make the news. But it is a huge priority to the FBI without any doubt at all. 

Jason G. Weiss: I mean, I could tell you - over the last few years, over 70% of all U.S. health care providers in the country have probably been breached in one fashion or another. And that's the highest number of any industry out there. So it is a huge priority and a huge problem for law enforcement. And it's not really different on the civil side. There are a lot of problems the health care industry has to deal with because they have certain weaknesses other industries don't have that make them far more susceptible, even on the private-sector side of the coin, to cyberattacks. 

Dave Bittner: What kind of weaknesses do they have? 

Jason G. Weiss: Well, for example, I mean, they run hospitals. And I mean, that's a classic example. If they fall victim to a ransomware attack where they lose access to their data, not only do they lose access to their data, which might be critical if they're trying to perform lifesaving surgery, but a lot of times, cyberattacks will shut down equipment. And this equipment could be critical. So hospitals and health care providers in particular are in a unique situation, where they really have to balance the cybersecurity aspects of what they're doing versus the life-or-death decisions that they have to make every day. 

Jason G. Weiss: I mean, if you go into a - the retail clothing industry and, you know, there's a ransomware attack and a store can't sell a shirt, while that's certainly not good for the store, people's lives aren't at stake. But when you're dealing with the health care industry in particular, these attacks are far more dangerous to people because - imagine if it was your day to have, you know, your heart surgery, and there is an attack at the hospital that prevents the hospital from conducting the surgery. I mean, that could lead to death - you know what I'm saying? - in a situation where, otherwise, the surgery would take place. So it's a huge problem. 

Jason G. Weiss: The FBI enforces probably 300 different violations, and I probably worked most of them in my job as - in the technology part of the FBI. And I can tell you health care is definitely one of the more dangerous areas we have to be concerned about when it comes to cybersecurity and cyberattacks. 

Dave Bittner: What about on the regulatory side of things? I'm wondering - how do things like data theft or ransomware, how do they intersect or clash up against regulations like HIPAA? 

Jason G. Weiss: Well, that's a great question. You know, HIPAA secures, as it were - provides a legal protection for people's medical information. But that doesn't help you when the information's stolen - because if the hacker or the bad guy is able to steal your health care information, inside your health care information is a lot of your identity information, which leads to a lot of identity theft. And you know, hospitals have to be able to share this information. So their networks have to really be pristine and secure because you go into one doctor, but that information has to be transferred off to the hospital where the surgery is taking place. 

Jason G. Weiss: It's all done digitally and electronically now. It's - very rarely do you see people carrying folders from one place to another. And if that information is stolen, it really, really opens people up to identity theft. And let's be honest - if you're just recovering from open-heart surgery, is that the time that you want to call and try and figure out why your identity's been stolen? You have much greater problems to deal with in terms of trying to get yourself back to health. So I think your question is great. I mean, ransomware is a little bit different because ransomware isn't stealing data; it's encrypting the data and preventing the health care provider from accessing the data. 

Dave Bittner: Would a health care provider be faced with sort of a double whammy if a bad guy stole some data and then threatened to, say, release it publicly? Obviously, you have the problem of the stolen data. But then would they also be liable for some sort of HIPAA violation because they didn't properly protect the data in the first place? 

Jason G. Weiss: Without a doubt. And it's actually going to become much more severe starting January 1, with the introduction of the California Consumer Privacy Act, which creates a private right of action against companies that are breached. So before, while you had to deal with general breach issues like notification and stuff like that, now you're opening yourself up to private liability. Now, HIPAA data is exempt from the CCPA. But there's other information that is not. 

Jason G. Weiss: So this is really going to provide a lot of exposure - and by that, I mean negative exposure - to every business in California, and the health care world is certainly no different. I mean, granted, the HIPAA information itself is exempt. But a lot of the other data, like employee data and customer data that isn't HIPAA related, is certainly not exempt. So we're opening up a Pandora's box of liability, without a doubt. I mean, it's opening up a whole new front in the cyberwars. There's no question about it. 

Dave Bittner: You are no longer with the FBI. You're at a law firm now. What sort of advice are you giving your clients to help them prepare for things like CCPA? 

Jason G. Weiss: That's a fantastic question. We are working diligently with clients to help them prepare. The CCPA goes into effect on January 1 of 2020. And I think I read an article recently where 60% of the businesses out there haven't even heard of the CCPA yet, let alone prepared for it. And it's going to open up a floodgate of litigation, in my opinion - and is that going to do the state a whole lot of good? It's already expensive enough here in California, and I think we're going to be raising prices just dealing with the cost. I think the guesstimate is that the costs of coming into compliance with the CCPA for most businesses are going to range anywhere from $60 billion to $100 billion alone. And with the private-right-of-action threat hanging over their heads, these companies, especially in the health care industry, are really going to have to make sure their networks are secure, because breaches are going to become a lot more expensive than they were on December 31. 

Dave Bittner: Do you think that these sorts of gestures towards improved privacy are necessary? 

Jason G. Weiss: You know, that's a difficult question. I mean, like everything else, you try to find a happy medium between - you know, I'll give you a perfect example. People ask what we can do in the FBI to keep the country safe. The FBI can absolutely keep the country safe if you're willing to trade convenience for security. And I don't believe in the privacy realm the dichotomy is any different. If you want to keep data safe and private, you can certainly do that, but it's going to be difficult, and it's going to be expensive. Versus the alternative of not securing the data and then dealing with the problems of identity theft, data theft. The key is to find that happy medium. 

Jason G. Weiss: You know, as a business, it's being able to secure your network in a way that your company can protect the data without making the costs so exorbitant that they have to be passed on to the clients, which obviously hurts business in other ways, because the cost of doing business goes up greatly. And I think that's going to be one of the flaws in the CCPA. These costs - this $100 billion in costs - are going to have to be passed down to somebody, and it's customers that are going to be paying for it. So obviously, that's very disappointing. 

Dave Bittner: Yeah. And do you suppose this is going to serve as sort of a test bed for the rest of the nation - you know, where California goes, so goes the rest of the U.S.? 

Jason G. Weiss: You know, I couldn't have said it better myself. The problem is, currently, there is no federal privacy law. And you have 50 states passing 50 laws, which is going to be very difficult for businesses that operate in multiple states. They're going to have to come into compliance with all of these privacy laws. And then when you throw in the GDPR and the European requirements for the businesses that work internationally, privacy is going to become the new norm of the 21st century, and companies are really going to have to put a concerted effort into dealing with privacy as a primary concern in how they conduct their business. 

Dave Bittner: You know, I'm curious. From your perspective both with the time you spent at the FBI and now working in the legal arena, what sort of advice do you have for organizations to better protect themselves? Are there things that you run across regularly where you think to yourself, boy, if only people did this better, we'd all be a little better off? 

Jason G. Weiss: When I talk to clients and when I give presentations, I always tell people there are two types of cyberdefense that you need to be aware of. The first one is your traditional IT defense - making sure your IT department is up to speed and has the proper network configurations. And, you know, we try and tell people, look at the NIST standards, look at the ISO standards. Try and fit within those frameworks to keep your network as safe and secure as possible. That's critical. It's critical that the IT folks have set up a network that doesn't have gaping holes in it, so that the attack surface of the network is as slim and as small as possible from a pure cybersecurity standpoint. What we want to do is push out what we call script kiddies. 

Jason G. Weiss: But there are a lot of businesses that buy these security devices and don't realize they're turned off when you buy them right out of the box. You've got to turn them on. You've got to configure them. You've got to learn how to make them work. You do all that, it makes your network a lot safer. But it costs money, and it costs time, and it costs effort. But it's effective. 
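
One way to make that "slim attack surface" idea concrete is a quick audit of which well-known services a machine is exposing - a sketch only, assuming a host you administer and an illustrative port list; every service that turns up open either has a justification or should be shut off.

```python
import socket

# Illustrative sample of well-known ports; a real audit would sweep far more.
COMMON_PORTS = {21: "ftp", 22: "ssh", 23: "telnet", 80: "http", 445: "smb", 3389: "rdp"}

def audit_host(host="127.0.0.1", timeout=0.5):
    """Return the (port, service) pairs accepting TCP connections on a host
    you administer. Each open service is attack surface to justify or close."""
    open_ports = []
    for port, name in sorted(COMMON_PORTS.items()):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append((port, name))
    return open_ports

if __name__ == "__main__":
    print(audit_host())  # e.g., [(22, 'ssh'), (80, 'http')]
```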

Jason G. Weiss: The other - and I think this is just as important as your IT security - is your social awareness and your employee training. And that's something I can't stress enough - training employees - because probably 80% of all cyberattacks take place behind the firewall, which means they're either, quote-unquote, "inside jobs," the result of careless employees, or the work of nefarious insiders who are helping somebody on the outside. And you really want to work to make sure your employees understand how to prevent malware and ransomware from even entering a system in the first place. 

Jason G. Weiss: And I can give you a great example. When I was in the FBI, one of the cases we worked on was somebody who had sprinkled thumb drives around the parking lot of a business. And it was very clever - it was the first time I'd ever seen that - because they rely on good Samaritans like me and you to say, oh, somebody dropped a thumb drive; that's probably important to them. So your initial thought is, let me put this thumb drive into the computer and see if I can identify who owns it. But what you're doing is inserting malware directly into the computer from behind the IT configuration firewalls, right? You've already gone behind the Maginot Line, as they say, and you're putting these thumb drives in the machine. 

Jason G. Weiss: And these are the things we have to train employees - especially in the health care industry - to be concerned about, because we've got to prevent unnecessary intrusions from happening. And the way to do that is to train your employees that when they get a spear-phishing email or a phishing email, or even a whaling email to a company executive, if they don't recognize who that email's coming from or what that link is, they shouldn't click it - because once you execute those programs behind the firewall, that's the foundation of both ransomware and malware and a lot of the things that cause the health care industry incredible problems. 
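
As one concrete shape that training can take, here is a toy Python heuristic for the "should I click this link" check - the allowlisted domains are hypothetical, and a real organization would lean on an actual mail-filtering product rather than a script like this.

```python
from urllib.parse import urlparse

# Hypothetical allowlist -- each organization would maintain its own.
TRUSTED_HOSTS = {"example-hospital.org", "mychart.example-hospital.org"}

def red_flags(url):
    """Return the red flags a training exercise teaches people to spot.
    A toy heuristic, not a substitute for a real mail-filtering product."""
    flags = []
    netloc = url.split("//", 1)[-1].split("/", 1)[0]  # text between scheme and path
    host = (urlparse(url).hostname or "").lower()
    if "@" in netloc:
        flags.append("user@host trick hides the real destination")
    if host.replace(".", "").isdigit():
        flags.append("raw IP address instead of a domain name")
    if host not in TRUSTED_HOSTS:
        flags.append("host %r is not on the allowlist" % host)
    return flags

print(red_flags("http://198.51.100.7/reset-password"))           # two flags
print(red_flags("https://mychart.example-hospital.org/login"))   # []
```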

Jason G. Weiss: I mean, I don't know if people realize, but in 2016 alone - I was just doing a little research on this - over 11 million health care records were stolen within the United States. And I think the top three breaches that have taken place in this country since 2016 are all health care industry breaches. So, I mean, obviously, the health care industry is a huge target. 

Dave Bittner: Yeah. 

Jason G. Weiss: And you've got to be very, very careful. And I'll tell you, there's one other area where I think the health care industry really needs to focus its security concerns, and it's kind of a new phenomenon called medjacking. I don't know if you've ever heard of that term. It kind of evolves from what's called the internet of things. You know, like, you put a thermostat in your house or a refrigerator on the internet - the security on those things is terrible. So people could hack into your thermostat or your refrigerator because there's almost no security whatsoever. 

Jason G. Weiss: Now, the problem with medjacking is hackers have figured out a way to hack into prosthetic devices, pacemakers and other devices provided by hospitals and given to people to help keep them alive and make their lives more convenient. And the problem with these devices is a lot of them don't have very extensive cybersecurity measures built into them because, really, they were not designed for that purpose. But can you imagine what would happen if somebody could hack into somebody's pacemaker, and the control they would hold over that person in terms of threats? 

Dave Bittner: Yeah, it's a totally different level of ransomware. 

Jason G. Weiss: Absolutely. I couldn't put it better myself. Because imagine if you wear a pacemaker and somebody is able to hack into your pacemaker and say, if you don't pay me, I'm going to turn your pacemaker off. I mean, that could literally kill you. So medjacking, to me, is going to be one of the big cyberthreats you're going to see coming probably over the next few years, because - especially with all the military prosthetic devices and stuff like that - a lot of these devices have chips with, you know, firmware and BIOS and these types of items already included in there, but the security on them is not very sophisticated. And so I think they're going to open themselves up to potential liability and other problems if we're not able to create medical devices that have some reasonable level of cybersecurity defenses built into them as well. 
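
To give one hedged example of what "defenses built in" could mean for a constrained device - this is a sketch, not how any real pacemaker works, and a production design would use asymmetric signatures rather than a shared key - authenticating firmware before accepting it blocks the simplest form of hijack.

```python
import hashlib
import hmac

# Illustrative only: a real device would hold a vendor public key and verify
# an asymmetric signature, not share a symmetric secret like this.
DEVICE_KEY = b"hypothetical-key-provisioned-at-manufacture"

def firmware_update_ok(image: bytes, tag: bytes) -> bool:
    """Accept a firmware image only if its authentication tag checks out --
    one cheap example of a defense a constrained implant could build in."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

good_tag = hmac.new(DEVICE_KEY, b"pacemaker-fw-v2", hashlib.sha256).digest()
print(firmware_update_ok(b"pacemaker-fw-v2", good_tag))    # True
print(firmware_update_ok(b"tampered-firmware", good_tag))  # False
```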

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: Well, always good to hear from somebody who has a wealth of experience in public and private sectors. 

Dave Bittner: Yeah. 

Ben Yelin: Working over 800 cases that he said have some sort of cyber element to them. It's not often that we get to talk to people who have that sort of experience... 

Dave Bittner: Right. 

Ben Yelin: ...With cyber-related litigation. So very thankful for that. Just a couple of interesting things that stuck out to me. There's this misconception that he talked about, that cyber criminals are sort of immune. They're not going to get caught. It's, you know, an area of criminal law where you can avoid detection. You know, as he said, the FBI catches a lot of people. We don't hear about it as much because, per our earlier segment in this very podcast, we talk about it when they screw up and not when they do something right. 

Dave Bittner: Right. Right. (Laughter). 

Ben Yelin: I think that's very fair to point out. 

Dave Bittner: Yeah. 

Ben Yelin: And then, you know, the other thing that stuck out to me is how damaging cyber incidents can be, particularly in the health industry - and not just the quantity of them. I guess I hadn't seen the statistic that 70% of health systems have been breached in the past several years. I don't know exactly what the timeframe was. 

Dave Bittner: Yeah. 

Ben Yelin: That's a very high percentage. And then, you know, think about what the consequences of that could be when we talk about shutting off people's pacemakers. These are life-and-death decisions. And it's a large sector - I think it's one-sixth of our national economy. 

Dave Bittner: Wow. 

Ben Yelin: And it is uniquely prone to these threats. So I think that was eye-opening. 

Dave Bittner: What do you think about what he was talking about with CCPA and how that could potentially open the floodgates for litigation? 

Ben Yelin: I think it's accurate. As he said, the private right of action is going to be something that bugs the living you-know-what out of the tech companies. They have a great incentive to make sure their systems are secure, because any person who suffers from a data breach at one of these companies is going to have the right to sue under California law, and it really could open a floodgate of litigation. The argument is whether that floodgate of litigation is worth it. 

Dave Bittner: Right. 

Ben Yelin: Is it valuable to the consumer? There are certainly arguments both ways. You are holding these companies accountable, and you are providing them increased incentives to take appropriate security measures. 

Dave Bittner: Yeah. 

Ben Yelin: But there's also, you know - and this applies to basically all torts - there's a tort tax, you know, and the tax is the cost of compliance. So every dollar they have to spend on increasing their compliance is going to end up costing the consumer one way or another. So it's just - sort of depends on your particular persuasion as to whether the costs outweigh the benefits. There is a reason why the other 49 states have not passed versions of the CCPA. I mean, it largely has to do with the fact that it does bear significant costs, whether those costs are worth it or not. 

Dave Bittner: And time will tell. We don't know how this will come out on balance. 

Ben Yelin: No. 

Dave Bittner: We'll have to see. 

Ben Yelin: No, it will be one of the intriguing subplots of the 2020 season of California. 

Dave Bittner: (Laughter) Right. Right. Right. All right. Well, again, thank you to Jason G. Weiss for joining us. We want to thank all of you for listening. That is our show. 

Dave Bittner: We want to thank this week's sponsor, KnowBe4. Go to knowbe4.com/kcm and check out their innovative GRC platform. That's knowbe4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.