Caveat 11.27.19
Ep 6 | 11.27.19

Compliance, regulation and small businesses.

Transcript

Aleksandr Yampolskiy: The No. 1 attack vector for hackers is usually through people. People are the weakest link. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hey, Ben. 

Ben Yelin: Hi. How are you, Dave? 

Dave Bittner: On this week's show, Ben wonders about data collection from big tech companies. I have a story about a GPS tracker case that seems to have spun into absurdity. And later in the show, my interview with Aleksandr Yampolskiy from SecurityScorecard - he shares his views on privacy, legislation, regulations and the crypto wars. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things. You can accept it. You can transfer it. Or you can reduce it. And, of course, you might wind up doing some mix of the three. But consider, risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk. 

Dave Bittner: And we are back. Ben, I'm going to kick things off this week with a story that I think is a bit wacky. This is from Ars Technica, a story written by Timothy B. Lee. The title is "Cops Put GPS Tracker on Man's Car, Charge Him With Theft for Removing It." OK. So some history here - according to this article, back in 2012, the U.S. Supreme Court ruled that you can't just attach a GPS to someone's car without a warrant. 

Ben Yelin: Yes, United States v. Jones. 

Dave Bittner: Thank you. And so (laughter)... 

Ben Yelin: Always good for a citation here, yeah. 

Dave Bittner: Yes. I know. That's why we have you here. So about a year ago, the state of Indiana was interested in someone they suspected was a drug dealer, and so they got a warrant, and they put a GPS tracker on his SUV. He found the GPS tracker and removed it. And the police noticed that the GPS had stopped transmitting to them, so they got another warrant to search his house, where they found the tracking device. And they also found other things. They found some drugs and some drug paraphernalia, so that was evidence in the initial case that they were going after him for. But he's been charged with both drug dealing and with theft of the GPS device. Now, this strikes me as absolutely bonkers. 

Ben Yelin: Right. So if you or I... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Looked under our car and saw something beeping and making noise... 

Dave Bittner: Right. 

Ben Yelin: We'd probably freak out, think maybe it was a bomb, seek to destroy that device. 

Dave Bittner: Yeah, I'd certainly probably remove it and throw it as far as it would go. 

Ben Yelin: Yes, if not, you know, report it to the cops, who'd be like, oh, yeah. That was us. That was us who put that onto your car. 

Dave Bittner: I guess if I were a drug dealer or I - if I had had, you know, run-ins with the law, maybe I would hesitate to call the police. But certainly, an unlabeled device - it does not say, property of, you know, Indiana State Troopers or... 

Ben Yelin: Or county government or whatever, yeah. 

Dave Bittner: (Laughter) Right, right. It's an unmarked device attached to the underside of my car, hidden. I don't know if this is law enforcement. I don't know if this is a rival drug gang. I don't know if this is my daughter's boyfriend. Who knows? It could be many, many things. But I know it's no good. 

Ben Yelin: Right. And the chances that it is the government that has transfixed this GPS device to the bottom of your car, compared to anything else that this device could be, is relatively small. Of course, now we know that it was a GPS device. 

Dave Bittner: Mmm hmm. 

Ben Yelin: So the question is, was it legal for the GPS device to be on the car in the first place? 

Dave Bittner: Right. 

Ben Yelin: And then if it was legal for the GPS device to be on the car, can you criminally charge somebody for theft of a device on their own vehicle? 

Dave Bittner: Yes. 

Ben Yelin: So I guess we can kind of take those questions in turn. 

Dave Bittner: Let's do it (laughter). 

Ben Yelin: So the Jones case says that you need a warrant to attach a GPS to a vehicle. 

Dave Bittner: OK. 

Ben Yelin: And the rationale in this case was somewhat surprising for legal scholars. The generally prevailing standard for Fourth Amendment searches had sort of moved away from a physical trespass standard. Justice Scalia, who wrote the majority opinion in this decision, said that the court never fully abandoned that physical trespass standard. And if the government physically trespassed on your property, that was a search whether it violated your reasonable expectation of privacy or not. So as a result of that case, the government does need a warrant to attach a GPS device on your car. Now, in this case, they were able to obtain a warrant. 

Dave Bittner: OK. 

Ben Yelin: The defendant in this case is questioning the validity of that warrant, which is certainly the defendant's right, you know, to confront the evidence against him. 

Dave Bittner: Right. 

Ben Yelin: If the court first determines that, you know, there was something problematic with the warrant application that led to this GPS device being attached to the car, then the defendant is probably going to walk out of court a free man. You can't be charged with theft of something that was transfixed to your property illegally. If somebody's trespassing on my property in any other context, you have the right to do some pretty severe things under our common law system to that person. 

Dave Bittner: Right, right. 

Ben Yelin: A man's home is his castle. That applies to our effects, so things like our vehicles. Now, it gets a little different if the court were to determine that the government did have a valid legal reason for obtaining that warrant, for affixing this device onto the car. And that's, I think, where this case is going to be won or lost. You know, it's a unique circumstance. Like, the government theoretically does have kind of the equivalent of a license to temporarily use somebody else's property for their own purposes. In this case, the purpose is surveilling where this person goes, whether they're going to their meth dealer's house or whatever. If you read the law literally, that, you know, it is the government's valid license to have a temporary interest in that property, then you possibly could charge somebody with theft for destroying that property. I think that's something I could accept on a theoretical level. 

Dave Bittner: (Laughter) Go on. 

Ben Yelin: Yeah, it seems that the judges in this case wisely are looking at this - the judges on the Indiana Supreme Court are looking at this from a more practical level, that it just seems completely unreasonable for somebody without any context to see this device on their car and not remove it or not seek to destroy it. And it's just, whatever the plain letter of the law is, are we really going to be charging people with theft of devices that are attached - physically attached - to somebody's own property? 

Dave Bittner: Let's play this out together from a really basic point of view so that I'm crystal clear on this. So you come over to my house, and you have a bucket full of dollar bills, and you leave it on my lawn. 

Ben Yelin: (Laughter) Lucky day for you, yeah. 

Dave Bittner: Yes. And let's call it $100 bills. Let's raise the stakes. 

Ben Yelin: OK. 

Dave Bittner: So you leave a bucket full of $100 bills on my lawn, and you walk away. And if I go out to my lawn and I find that bucket full of $100 bills, and I take it into my house, have I stolen the bucket of $100 bills? 

Ben Yelin: Well, no, because I've brought it onto your property and left it there. You have a superior property interest than I do in those $100 bills, unless - you know, you could think of some sort of hypothetical where I didn't know it was your property. I tried to drop my $100 bills on the sidewalk. It technically was within your curtilage or whatever. 

Dave Bittner: Right, right. 

Ben Yelin: You know, then there might be some dispute. But if it was actually your property, yeah, I would be losing my property rights in that money. 

Dave Bittner: And that's where I'm going with this because I would consider that the car is my property, and if you leave something on my property, then you've sort of give - you've given it to me, in effect. You've given up your own rights by leaving it on my property. But I suppose what the prosecution is trying to say here is that, given that it was a warrant, that changes things. 

Ben Yelin: Right. So, you know, I love playing devil's advocate. 

Dave Bittner: (Laughter). 

Ben Yelin: That's part of being a lawyer. Hate to defend, you know, our prosecutors here who are trying to claim that somebody stole something off of their own vehicle, but I digress. Let's think of a different hypothetical when somebody actually does have a license to use your property for something. 

Dave Bittner: OK. 

Ben Yelin: So let's say the cable company has an easement; they can wire your house for cable. It's - the cable equipment still technically belongs to them even though it's affixed to your property. But they have a legal right to be there. 

Dave Bittner: Ah. 

Ben Yelin: And you destroy those cable wires. 

Dave Bittner: Right. 

Ben Yelin: In that case, because they do have a legal right to be on your property, you potentially could be prosecuted or fined or whatever for destroying that property. Or they could pursue you in civil court, at the very least. 

Dave Bittner: I see. Even though - yeah. So it was their stuff that was on my property, but they were legally allowed to have it there. 

Ben Yelin: Right. 

Dave Bittner: There was a prior agreement, I guess, that that stuff was entitled to be there. 

Ben Yelin: This is one of those cases where the government gets somewhat of a special status, and that's something that the supreme - the Indiana Supreme Court justices said in oral arguments. I think one of the judge - they quoted in this article - one of the judges said, if somebody wants to find me to do harm to me, and it's not the police, and they put a tracking device on my car, and I find a tracking device, and I dispose of it after stomping on it 25 times, I would hope they would not go to a local prosecutor and somehow I'm getting charges filed against me for destroying somebody else's property. I think that argument is pretty compelling. 

Dave Bittner: Now, I guess part of what's going on here is that there's kind of an order of operations where if they lose the warrant, then they lose the GPS tracker. If they lose the GPS tracker, then they lose the second warrant, which is why they went in the house and they found the stuff. So there's this sort of cascading collapse of their case... 

Ben Yelin: Yes, so... 

Dave Bittner: ...If they lose this. So that could be a part of the motivation to pursue this as well. 

Ben Yelin: So in the legal world, we call that fruit of the poisonous tree. 

Dave Bittner: Ah. 

Ben Yelin: If any part of the process that led to somebody's prosecution was obtained illegally, if there was a constitutional violation, then everything obtained as a result of that original illegal search is inadmissible in court. So, I mean, this is make or break for the government. You know, set aside the prosecution for theft. If the Supreme Court decides that the warrant was somehow defective, that they didn't actually have probable cause, then all the evidence gleaned as a result of this original GPS tracker device is going to be thrown out, and the case is probably not going to have sufficient evidence to prosecute. So I think that's certainly one element of the case. 

Ben Yelin: But I think it's healthy that, no matter what the outcome ends up being in this case, justices, you know, despite perhaps the letter of the law indicating otherwise, are very skeptical that a person should be prosecuted for finding a tracker on their car and removing it and being prosecuted for theft. I think the instinct there is the right one from these justices, no matter what result they ultimately arrive at. 

Dave Bittner: All right. Well, it's in front of the Indiana Supreme Court. We'll see how it plays out. That is my story this week. Ben, what do you have for us? 

Ben Yelin: So this story comes from The New York Times - actually sent to me by my mother. She thought... 

Dave Bittner: Aw. 

Ben Yelin: I know. She thought it would be... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Perfect fodder for the "Caveat" podcast, and she was right. It was from Farhad Manjoo, an opinion columnist. And he's citing a comprehensive study taken by Pew Research this past summer that looked into how Americans view data privacy. And the findings were sort of eye-opening. The American public sort of accepts that they're in a data privacy hellscape... 

Dave Bittner: Is that your word? 

Ben Yelin: ...For lack of a better word. Yeah, that is my word... 

Dave Bittner: OK (laughter). 

Ben Yelin: ...Although that's sort of the tenor that this op-ed writer is taking. 

Dave Bittner: OK. 

Ben Yelin: In fact, I think he refers to the current landscape as a prison in the last line of this article. Basically, the upshot of this study is that Americans are watched. They know that they're being watched. They're unhappy about being watched. They are extremely skeptical of what the government and private business are doing with their data. And by, you know, massive numbers, they think that the risks of the current landscape, in terms of data collection, far outweigh the benefits, which is pretty eye-opening to me and surprising. 

Ben Yelin: An example of, you know, some of the numbers that the study presented - 81% of individuals think that they have little or no control over the data that they give to companies; 81% of people think that the potential risks of companies collecting their data about them outweigh the benefits - that number drops to 66 when it comes to the government collecting data about them. Eighty percent are very or somewhat concerned about how the data collected on them is used, and strong majorities have little or no understanding about what the private sector and the public sector - the government - does with the data that's collected. 

Ben Yelin: So we're very skeptical. We're fearful. We think that the risks outweigh the benefits. What this op-ed writer pointed out - and this was my natural reaction - is then why do we keep accepting those terms of services and spending our lives online? There seems to be this fundamental disconnect with what we would say to a pollster and our expressed concerns about data collection and our actions. Tech companies are extremely profitable. For the most part, they're very successful. New social media companies are continually popping up. The ones that exist seem like they're going to exist into perpetuity. We are signing over our accumulated data to a greater extent than we ever have before. 

Ben Yelin: The depressing aspect of the story - this is something that the op-ed writer mentions - is that our skepticism doesn't really matter. In his words, we hate the way things are. We've traded our power to stop it, and our worries aren't going to change anything. We're sort of stuck in this system. 

Dave Bittner: Interesting. 

Ben Yelin: Yeah, that sort of jumped out at me. Now, there is a legal connection to this. So as we've talked about, the prevailing test for a Fourth Amendment search is whether it violates somebody's reasonable expectation of privacy. 

Dave Bittner: Yep. 

Ben Yelin: And reasonable expectation of privacy is a societal expectation, so that's something that courts theoretically will have to look at. Do most people think they have a privacy right in, you know, a given piece of data or a given electronic communication? 

Dave Bittner: So does that mean that definition of what is reasonable could change over time and that a different Supreme Court could have a different opinion of it than a previous one? 

Ben Yelin: Absolutely. So you know, reasonableness certainly changes over time, depending on changing technology and the circumstances. The test itself is sort of circular to me. And you know, that's why there's been a lot of criticism of it over the years. For example, this is - comes directly from a Supreme Court case, a dissent. But if the government said, we are now going to open and read every single piece of mail that crosses - you know, that goes through the U.S. Postal Service, you would lose your reasonable expectation of privacy in the mail. And thus, you would not have any Fourth Amendment rights in that mail. That doesn't seem like something we want. 

Dave Bittner: Oh, interesting. Yeah, that is circular. 

Ben Yelin: So it's like the government actually plays a role in setting these expectations. But you know, I think there is sort of a - what this article would reveal to a court, which is somewhat disturbing to me, is that we are sort of aware of how far our data is being spread, how public it is, how what we post on social media, what we put on the internet is going to not only make it into the hands of private organizations but also to the government. And a potential court could look at this data and say - well, this criminal defendant may have had a subjective expectation of privacy in this case, but Pew Research did a poll and, you know, most people actually don't share that same expectation of privacy. 

Dave Bittner: So this polling could shift what is considered to be a reasonable expectation of privacy - or a case could be made, I suppose. 

Ben Yelin: Absolutely. I guarantee you that some enterprising lawyer is going to put this polling in one of their briefs, trying to defend a surveillance program or trying to defend one of the tech companies when they're being sued for some sort of release of personal data. And that's something, you know, that's potentially dangerous. I certainly don't blame Pew for conducting the poll. I think it's very useful information. 

Ben Yelin: And I think, you know, these numbers have all sorts of public policy implications, including the fact that, you know, the American public seemingly has the political appetite for increased privacy protection. What they don't seem to have is the appetite to actually curtail the use of social media, electronic devices, etc. 

Dave Bittner: You know, it strikes me as that old - you know, the beatings will continue until morale improves, right? (Laughter). 

Ben Yelin: Morale improves. Yeah, exactly. It feels very Stockholm syndrome to us. 

Dave Bittner: Yeah. 

Ben Yelin: Like, we've sort of embraced our captors here. 

Dave Bittner: To me, I find it a useful analogy to compare Facebook to smoking. You know, everybody knows smoking is bad for you. But smoking - I've never smoked, but for people who have smoked and who tried to quit, my understanding is it's really hard to quit. 

Ben Yelin: It's very hard to quit. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: And I think it's a similar... 

Ben Yelin: We become addicted... 

Dave Bittner: ...Kind of thing, yeah. 

Ben Yelin: ...Yeah. We become addicted. You know, I think it's less this way, thankfully, than it was 20 years ago. But in order to maintain a social life, you know, oftentimes you'd go out and hang out with the smokers and... 

Dave Bittner: Right. 

Ben Yelin: ...You know, when you're with smokers, you're going to smoke a cigarette. I think that's sort of the way social media has become this day and age. Like, in order to stay connected and to be informed, we sort of have no choice but to be on these platforms. 

Dave Bittner: Isn't that interesting? And it's - because if you think about smoking, I mean, it's the smoking bans in - like where we live, in bars and restaurants and so on and so forth, that's contributed to fewer people smoking. So it's a government role - right? - a regulatory role for the public good, you could argue. 

Ben Yelin: Right. Now, of course, I think we're a long way from there when it comes to social media. But... 

Dave Bittner: Yeah. 

Ben Yelin: ...You know, I could see 30, 40 years in the future, if we start to understand the corrosive effect that the social media landscape has on us, maybe there will be public policy considerations. 

Dave Bittner: Yeah. Well, I wonder about social norms, too - because you know, these days, if you, you know, light up a cigarette around a group of people, chances are a bunch of them are going to give you the evil eye and ask you to go somewhere else. And can you imagine a time... 

Ben Yelin: I would love to... 

Dave Bittner: ...When someone whipping out their phone and starting to go through Facebook would get a similar sort of evil eye from people? 

Ben Yelin: Well, I already give that evil eye to people on Facebook... 

Dave Bittner: (Laughter). 

Ben Yelin: ...'Cause I'm like - the news you're probably reading was, you know, spread by a Russian propaganda outlet or... 

Dave Bittner: OK (laughter). 

Ben Yelin: ...You know, someone who's not savvy in reading literature. But yes, for most people, I think there could really be that - a difference in societal expectations. I mean, I think starting with high-profile data breaches - the Cambridge Analytica scandal, the controversy surrounding the 2016 election - we're starting to get these - a greater public consciousness about the risk of social media, the risk of false information, the risk of unlimited data-sharing. And I think we're still sort of in our infancy in figuring out what the implications of that are. 

Ben Yelin: It's, in some ways, encouraging that a sufficient percentage of the public shares these concerns and is realistic about how much data is being collected, how we really have sacrificed a lot of our privacy for temporary convenience - and how the risk of those sacrifices completely outweighs the benefits of getting better targeted ads in our Google feed. I think that's a healthy skepticism. 

Dave Bittner: All right. Well, those are our stories. It is time to move on to our listener on the line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: Our listener on the line this week is John (ph) from New Jersey, and here is his question. 

John: Hi. This is John from New Jersey. If Facebook gets regulated, will it be the way that radio and TV stations get regulated by the FCC? Are the courts going to treat Facebook and other platforms more like publishers or more like common carriers? 

Dave Bittner: All right - interesting question, Ben. What do you make of this? 

Ben Yelin: So Facebook's kind of caught between a rock and a hard place here. Sometimes they want to be regulated like they are a radio or a TV station - by the FCC. And sometimes they don't. There was this recent controversy where Facebook made the decision to not - or would choose to air all political ads, whether they contained falsehoods or not. And they were criticized by a number of political figures, presidential candidates, including Senator Elizabeth Warren. And Facebook responded that they're just following the guidelines that the FCC has set for broadcast networks. Broadcast networks generally do not have discretion as to what political ads they can air, under FCC rules. Now, when it comes to other aspects of FCC regulation, Facebook is exempt under Section 230 of the Communications Decency Act. I think you've probably heard of Section 230... 

Dave Bittner: Yes. 

Ben Yelin: ...In news stories over the past several years. They are not liable for content that's posted on their website because they are, you know, considered a platform. 

Dave Bittner: Right. 

Ben Yelin: So in this case, I think Facebook sort of has to be careful what it wishes for. There have been murmurs from the chairman of the FCC. Is it Ajit Pai? Is that how it's pronounced? 

Dave Bittner: It's close. Yep, yep. 

Ben Yelin: Yeah, pretty close. He and a bipartisan set of members have introduced the idea that perhaps Facebook should be regulated more like a broadcaster and not as a simple internet platform. And that would have wide-scale implications in terms of Facebook's evolution as a social media platform. 

Dave Bittner: Yeah, it's interesting. I've seen interviews with Mark Zuckerberg where he says he believes Facebook should be regulated. And I can't help wondering if it's Br'er Rabbit asking to not be thrown into the briar patch. 

Ben Yelin: Yeah. They're sort of in this unique public space right now, where they can always invoke Section 230 to shield themselves from responsibility. And I think as we've started to understand Facebook's sphere of influence and as it's become, really, the primary news source for probably a majority of the country - as scary as that is to say - from a public policy perspective, we should perhaps have to bring it under the same regulatory framework that we use for broadcast communications. And you know, that's what the FCC was created for. It was about decency in our broadcast communications, fairness and, you know, a level playing field. 

Ben Yelin: And I think we're starting to understand that Facebook is not just a platform. It's not just a town square, to put it in the physical world parlance. But it's really - it's a curator. It makes editorial decisions about its own content. And you know, in some ways, it's sort of inviting itself to be regulated by the FCC, which, again, it's interesting that Zuckerberg has said that. But you're right, it may be a decision that he would end up regretting. 

Dave Bittner: Yeah. All right. Well, thanks to John for sending in that question. 

Ben Yelin: Thank you, John. 

Dave Bittner: A quick reminder that we would love to hear from you. If you want to send in a question for me or for Ben, our call-in number is 410-618-3720. That's 410-618-3720. You can also send us an audio file. Send it to caveat@thecyberwire.com. Coming up next - my interview with Aleksandr Yampolskiy. He's from SecurityScorecard. But first - a word from our sponsors. 

Dave Bittner: So let's return to our sponsor, KnowBe4's, question. How can you see risk coming, especially when that risk comes from third parties? After all, it's not your risk until it is. Here's step one - know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendor security risk requirements. You'll not only be able to pre-qualify the risk, you'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you get this in an effectively automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Check it out. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Aleksandr Yampolskiy. He is from SecurityScorecard. He's got some interesting thoughts on privacy regulations and the crypto wars. Here's my interview with Aleksandr Yampolskiy. 

Dave Bittner: I wanted to talk about this notion of the need for federal privacy legislation. Where do you suppose we find ourselves today when it comes to privacy legislation, particularly in the United States? 

Aleksandr Yampolskiy: There's no one federal law today in the United States. You have a number of policies that have been passed before, like the GLBA. And most recently, we have the California Privacy Act, the CCPA, which is kicking off in January 2020, which is probably one of the more significant acts similar to what Europe has done with GDPR. But I would say that we're still in the nascent (ph) state. I would say that the U.S. is still in the beginning stages of the privacy legislation and policies. 

Dave Bittner: A point that I know you've made is that when this policy happens, if and when it happens, we need to be sure that there aren't too many loopholes. What are your thoughts there? 

Aleksandr Yampolskiy: Well, I think that - you know, I think that you're absolutely correct. You know, for example, you know, if you look at the CCPA legislation, one of the items in the legislation is that you want to prove that a company has reasonable cybersecurity measures. But exactly what does reasonable mean? What's reasonable for one company might not be reasonable for another company. So I think that you want to strike a gentle balance of not being too prescriptive while giving companies freedom to choose solutions and decide how to fulfill the policy while at the same time not being too vague so that people will just kind of circumvent it and not act on it. 

Dave Bittner: What's your perspective on where things stand today? When it comes to the big platforms like Facebook and Google and Twitter, where do we stand when it comes to their respect for privacy? 

Aleksandr Yampolskiy: Well, I think the bigger platforms, actually, contrary to some public opinion, may actually do a better job on privacy than some smaller companies. I think that we need to make sure not to forget to ask a question, how do companies make money, right? So, for example, Facebook makes money by advertising to people on a platform. So the more information they're able to process - the more information they're able to analyze about what restaurants you go to, which things you like, which information you share - the more powerful and the more curated the advertising becomes. 

Aleksandr Yampolskiy: Similarly, Google is a double-sided platform, and it derives its revenue through advertising. And the more information it's able to collect about what works and what doesn't work from search results from people's preferences, the more money it generates. So I think that there's always a little bit of a tension because the big platforms benefit from having as much data as possible. At the same time, they need to be cognizant of the privacy legislation. So - but I do think that in that case, it creates healthy, proper tension. But for smaller companies, which might not have the financial resources of Facebook or Google and those companies are just struggling to survive, they might be more tempted to circumvent and eschew (ph) some of those policies. 

Dave Bittner: Now, that's an interesting perspective. How do you suppose that privacy legislation will affect companies of different sizes? Will it have an outsized or oversized impact on smaller companies versus larger companies or maybe the opposite? 

Aleksandr Yampolskiy: I think that you can actually have some historical lessons, which you can compare when you look at how smaller and larger companies deal with certain compliance regulations. You know, for example, if you're a retailer and you take credit cards, then you need to be PCI DSS compliant. And depending on your size, you either do a self-attestation or you have to hire a professional auditor to come and verify your controls. And so I can tell you from just being in the industry and my experience that many small companies, they just check off the boxes. 

Aleksandr Yampolskiy: They say that they PCI DSS compliant, but they really are not. And the reason for that is they just don't have the money. They don't have the time. They don't have the resources to do a good job. And so my fear is that the same could happen with CCPA and other privacy policies. You know, more mature companies will go an extra step in anticipation of CCPA. They will start encrypting personal information. They will make sure that there's two-factor authentication enabled on their tech service. They will make sure that they do proper vendor risk management. But smaller companies are not really going to have time, money or resources to do some of it. Some will, but many won't. 

Dave Bittner: Do you think those companies then are at risk of - should regulators come along and take a look at what they're up to, of falling under large fines or potentially being shut down? 

Aleksandr Yampolskiy: Well, I think that there's always a constraint on resources. Regulators are not going to be able to go take a look at every small company. I think that they're probably going to focus their attention on bigger companies or companies that store larger amounts of personally identifiable information and do spot checks. You know, in general, what's missing from a cybersecurity and also from a privacy industry is a quick measure, like a quick KPI, like almost like a credit score similar to a security score or a privacy score that a company can go compute on somebody and figure out very quickly if they're doing a great job or not. Right now, unfortunately, too many things are happening through very rigorous tests, which take too long to complete. 

Dave Bittner: I want to switch gears a little bit and talk about some of the policy issues when it comes to cryptography. You have a Ph.D. in cryptography from Yale. Where do you come down on the crypto debate, the crypto wars of where we stand in terms of law enforcement's efforts to basically have backdoors? 

Aleksandr Yampolskiy: I think that whenever you start introducing a backdoor into encryption technologies, there's always going to be an opportunity for that backdoor to be misused, right? I mean, imagine that every single house has a special door with a lock in a backyard. And the key to that door, even you don't have it, but it's kept in a local police station. How confident are you going to be that somebody is not going to go steal that key? Like, how confident are you going to be that somebody cannot go and kind of lock pick that back entrance into your house even when you feel like it's closed? And I think the same with the encryption technologies and the same with cryptography. Whenever you start messing with it and you start introducing backdoors, it only makes it less secure. It creates opportunity for abuse by bad guys. So I think personally that it's a mistake from my point of view, again, as a former cryptographer. 
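His house-key analogy maps directly onto the classic objection to key escrow: the escrowed copy is a single point of failure. Here is a toy model of that failure mode - not a real cryptosystem, and every name in it is invented for illustration:

```python
import secrets

# The duplicate key "kept at the local police station."
ESCROW_KEY = secrets.token_hex(16)

class Door:
    """A backdoored lock: it opens for its owner's key or the escrow key."""

    def __init__(self) -> None:
        self.owner_key = secrets.token_hex(16)

    def opens(self, key: str) -> bool:
        return key in (self.owner_key, ESCROW_KEY)

doors = [Door() for _ in range(1000)]

# A single theft of the escrowed key compromises every house at once,
# even though each owner's own key still only opens their own door.
stolen = ESCROW_KEY
assert all(door.opens(stolen) for door in doors)
```

The design point the toy makes: without the escrow key, an attacker must steal keys one house at a time; with it, one breach scales to the whole population.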

Dave Bittner: And do you suppose that, because of the easy availability of strong encryption, a backdoor mandate really won't slow down the bad guys who want to encrypt things - that they'll still have access to basically unbreakable encryption? 

Aleksandr Yampolskiy: I think you're exactly right because a lot of the encrypt - well, actually many, many encryption algorithms, especially in public-key cryptography, are published. So, you know, their algorithms - their source code - are widely known. There are open source implementations of encryption algorithms which are very difficult to break. So even if you mandate everybody to go use a particular encryption algorithm and you try to sneak in a backdoor in it, many people, if they really want to communicate something, they will just use a lot of the open source software and still find a way to transmit those messages and information. 
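To make the "widely known algorithms" point concrete, here is a minimal sketch of the one-time pad - a long-published scheme that is provably unbreakable when the key is random, as long as the message, and never reused - in a few lines of standard-library Python (the function names are ours, not from the interview):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR the message with a fresh random key of equal length.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so applying the key again recovers the message.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

As long as sketches like this circulate freely, mandating a backdoor in one product cannot stop determined parties from encrypting on their own.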

Dave Bittner: Do you think the law enforcement people are talking about something here that's technically unfeasible? I mean, I wonder if it's almost a strawman argument - the idea that you could have both a workable backdoor and strong security. It just seems like the policy people and the technical people are at odds here, and they're almost talking past each other. 

Aleksandr Yampolskiy: Well, it is true. I think that the policy people are coming from a good perspective. They're asking a question: What should legislation look like? What should the requirements be in an ideal state? But the problem is, when you get to the technical implementation, there's rarely an ideal state. So, you know, even if you tell everybody you have to use a particular encryption with a backdoor, there are always methods to transmit information - for example, through steganography, right? 

Aleksandr Yampolskiy: I mean, if you and I agree that every Monday I'm going to put up a personal ad in The New York Times and I'm going to embed some type of secret message within that personal ad, nobody else is going to know it, but I'm still going to be able to transmit messages to you. That's how messages were secretly transmitted in newspapers during World War I and then World War II, in many cases. So I think that policy people come from good intentions, but there needs to be more collaboration and more dialogue with the technical and practitioner community to make it most effective. 
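The personal-ad trick he describes is a null cipher, one of the oldest steganographic techniques: the secret rides in an agreed-upon position, such as the first letter of each word of an otherwise innocent ad. A toy sketch, with an invented wordlist standing in for the pre-arranged convention:

```python
def hide(secret: str, lexicon: dict[str, str]) -> str:
    # Null cipher: choose an innocuous word starting with each secret letter,
    # so the resulting "ad" reads normally to anyone not in on the scheme.
    return " ".join(lexicon[ch] for ch in secret)

def reveal(ad: str) -> str:
    # The recipient simply reads off the first letter of each word.
    return "".join(word[0] for word in ad.split())

# Hypothetical wordlist the two parties agreed on in advance.
lexicon = {"r": "rustic", "u": "upright", "n": "nice"}
ad = hide("run", lexicon)          # "rustic upright nice"
assert reveal(ad) == "run"
```

Nothing about the channel is encrypted, which is the point: a backdoor mandate on encryption software does nothing against messages hidden in plain sight.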

Dave Bittner: What's your advice to people who are in charge of security and privacy for organizations who are looking at the things that are coming down the road, looking toward the horizon at what possible legislation we may be faced with? What sort of preparations can they do now so that when these things come, they're in a good position to respond to them? 

Aleksandr Yampolskiy: Yeah. So if you look at compliance and if you look at cybersecurity, I look at it almost as two circles that overlap in some area with each other. So there's overlap, but there are also differences. You could be secure and not be compliant. And similarly, you could be compliant but be absolutely insecure. There's a treasure trove of different types of regulations. 

Aleksandr Yampolskiy: But, for example, I think that a lot of the time, you're well served if you follow the best principles to improve cybersecurity. Those best principles include using two-factor authentication everywhere - because passwords get reused, and they get stolen by hackers. So make sure there's multifactor authentication enabled everywhere. Make sure information is encrypted and you have proper role-based access controls. Make sure that you do security awareness training. 

Aleksandr Yampolskiy: The No. 1 attack vector for hackers is usually through people. People are the weakest link. So spend the time in security awareness training. And finally, last but not least, we've all moved into the cloud. We're all part of this ecosystem where you have a proliferation of third-party vendors. And so establishing a mature vendor risk management program and figuring out how to actually rate the third parties that you do business with, how to vet that their security controls are adequate - those would be my top recommendations for following best cybersecurity practices. 

Aleksandr Yampolskiy: You know, I think it's a good question to ask, what are the trends that are contributing to global insecurity? Why is it that there's so much money being invested in the venture community worldwide - billions and billions of dollars into security startups - and yet the number of data breaches and the amount of information being stolen and leaked keep increasing? And I think it's important for listeners to know that. The reason for this is, really, there are three trends that are affecting global insecurity today. 

Aleksandr Yampolskiy: The first problem is very simple - companies spend money on the wrong things. Eighty percent of the budget goes towards reactive solutions that wait until an attacker gets into your organization and then react. Instead, people need to reallocate the budget towards proactive solutions like intrusion detection, deception technology, threat intelligence feeds, security ratings and other types of things. That's reason No. 1. Reason No. 2 is that we operate in a much more complex environment. Information is digitized. There's a proliferation of third-party ecosystem risk. The market is moving to the cloud. So we deal in a much more complicated environment. That's reason No. 2. 

Aleksandr Yampolskiy: And the third and final reason is that attackers now have access to cyberweapons which they did not have access to 10 years ago. Just with a stroke of a keyboard, given an email address of a person, you can find out all the leaked passwords that an individual reused on different types of websites. With a stroke of a keyboard and $50 paid on a dark web hacker site, you can order a denial-of-service attack against your competitor. And so all these tools are now becoming available. And so investment in the wrong things, an attack surface becoming exponentially complex and a proliferation of cyberweapons are really what's contributing to the situation that we're in. 

Dave Bittner: All right, Ben, what do you make of all that? 

Ben Yelin: Well, it was eye-opening. Certainly, the last part of his analysis there is kind of a warning shot to all of us about the state of global cybersecurity. And, you know, I think it would behoove all of us to listen to his prescriptions, both from a policy level and from a personal business level. I also really enjoyed hearing his perspective on federal data privacy legislation. I share his doubt on whether that's feasible politically in the near-term future, as I know we've talked about. 

Dave Bittner: Yeah. 

Ben Yelin: One thing we haven't talked about - and he brought up sort of the effect that data privacy legislation has on smaller businesses - you know, we think that CCPA and GDPR are geared towards the Googles and Microsofts of the world. 

Dave Bittner: Right. 

Ben Yelin: And they largely are. But Google and Microsoft and Facebook can hire the best compliance officers in the world. They can build the best encryption methods. They have the time, money and resources to do that, and smaller companies often don't. So they're faced with some difficult choices. Either you do not comply and you risk some of these severe penalties, or you do comply and, you know, it cuts into your profit margin. 

Dave Bittner: Right. And what if my competitor down the street doesn't comply, and I'm investing all this money in complying? It's a sort of a wait-and-see kind of thing, I suppose. 

Ben Yelin: It is, yeah. That's some - sounds like a prisoner's dilemma. 

Dave Bittner: (Laughter). 

Ben Yelin: I can never remember what a prisoner's dilemma actually is. But yeah, I mean, I think there is sort of a bizarre incentive problem there. And you see this often happen in other regulatory areas, where sometimes a regulation will seem geared towards the big guys, towards the big corporations, but they are the ones who can hire the expensive lawyers, the expensive compliance officers. It's easier for them to actually adapt their practices to comply with a regulation, and it makes things more difficult for their smaller competitors. 

Dave Bittner: Right. 

Ben Yelin: So I think that's, you know, something that's certainly important to consider. 

Dave Bittner: Yeah, unintended consequences. 

Ben Yelin: Absolutely. In terms of his thoughts on the crypto wars - we're going on - what? - six episodes of "Caveat." 

Dave Bittner: Yeah. 

Ben Yelin: And we've had relatively broad agreement from a lot of stakeholders, including the former secretary of the Department of Homeland Security, Michael Chertoff, that there's a lot of danger in creating backdoors and duplicate keys - danger not just into protecting our data privacy but in those keys getting into the hands of a bad actor. And the last thing I'll say is I enjoy getting a little history on steganography. 

(LAUGHTER) 

Dave Bittner: Yes, yes. 

Ben Yelin: Not as well-versed in it as I probably should be, and I'm going to go home and read some articles about how it was used in World War I and World War II. 

Dave Bittner: Yeah. 

Ben Yelin: I'd love to see what they hid in newspaper classified ads. 

Dave Bittner: Yeah, I love listening to Aleksandr speak. His accent adds a little intrigue to it, doesn't it? (Laughter). 

Ben Yelin: It sure does, yeah. 

Dave Bittner: He's quite eloquent there in his tone. So our thanks to Aleksandr Yampolskiy from SecurityScorecard for joining us. That is our show. We want to thank you all for listening. 

Dave Bittner: And of course, we want to thank this week's sponsor, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Request a demo, and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.