Ben looks at the cozy relationship between Ring and local law enforcement, Dave shares a story about DNA tests and search warrants. Our listener on the line wonders about deleted emails. Our guest is Michael Chertoff, former US Secretary of Homeland Security, now head of the Chertoff Group.
Links to stories:
- Ring Gave Police Stats About Users Who Said ‘No’ to Law Enforcement Requests
- Your DNA Profile is Private? A Florida Judge Just Said Otherwise
Got a question you'd like us to answer on our show? Leave a message at (410) 618-3720.
Michael Chertoff: [00:00:10] The jihadi groups moved to smaller scale attacks or simply inspired attacks, where they use social media or other kinds of communications to induce people to carry out an attack with a gun or an automobile or a homemade bomb.
Dave Bittner: [00:00:29] Hello, everyone, and welcome to Caveat, the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: [00:00:39] Hello, Dave.
Dave Bittner: [00:00:40] On this week's show, Ben looks at the cozy relationship between Ring and local law enforcement. I'll share a story about DNA tests and search warrants. And later in the show - my interview with Michael Chertoff, former U.S. secretary of Homeland Security. We want to remind you that while this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be back.
Dave Bittner: [00:01:06] But first, a word from our sponsors, KnowBe4. And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things - you can accept it, you can transfer it, or you can reduce it. And of course, you might wind up doing some mix of the three. But consider - risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk.
Dave Bittner: [00:01:48] And we are back. Ben, why don't you start things off for us this week?
Ben Yelin: [00:01:52] Sure. So there have been a series of articles on the website Gizmodo relating to Amazon's Ring technology, which is that security camera smart device that you post on the front of your house. You can surveil your neighbors while they walk by.
Dave Bittner: [00:02:05] (Laughter) Well, that's one use for it.
Ben Yelin: [00:02:07] That's one use of it. Obviously it has very legitimate law enforcement and personal security...
Dave Bittner: [00:02:13] Right.
Ben Yelin: [00:02:13] ...Purposes.
Dave Bittner: [00:02:14] Right.
Ben Yelin: [00:02:14] First and foremost, if I want to know who's at my door when they literally ring the doorbell.
Dave Bittner: [00:02:18] There's the name.
Ben Yelin: [00:02:19] There's the name. That's where it comes from.
Dave Bittner: [00:02:20] Yep.
Ben Yelin: [00:02:20] So this series of articles has talked about the evolving relationship between Ring, which is now owned by Amazon, as it has been for the past year, and local law enforcement departments. The article that piqued my interest was about how Ring had provided the police in a Florida jurisdiction with statistics about the users who said no to various law enforcement requests. So there's this app called Neighbors. It's completely voluntary. If you own a Ring device, you download the app, and you can voluntarily help police solve crimes by sharing your Ring surveillance video.
Dave Bittner: [00:02:54] Yep.
Ben Yelin: [00:02:55] And what Ring had done is give information - or this is at least what's alleged in the article - they gave information to local law enforcement as to whether users had responded to Ring's request to submit that data. Now again, this is entirely voluntary. Users can opt out of participating in the Neighbors app, although I can imagine there's some sort of social pressure that's involved in that.
Ben Yelin: [00:03:18] This sort of opened a Pandora's box for me in terms of Ring's close relationship with various law enforcement agencies across the country. They talked about one jurisdiction in California where Ring started a program where they were providing Ring devices free of charge to - as a promotion through the city to some of that city's residents. They've also sort of been coordinating their messaging. So as part of Ring contracts with various local police departments, the police departments are not allowed to describe Ring devices as a form of surveillance. And I think that's important for their branding purposes. But, you know, it's interesting to me that law enforcement has been willing to abide by that.
Ben Yelin: [00:03:58] This article that I have sort of gave us a window into just how ubiquitous user-submitted data is. So they did a study of Fort Lauderdale, Florida. They said that the police had a 3.5% success rate when requesting footage. So there were 319 videos that were sought by law enforcement. They were only given permission to view 11. So that data was submitted by Ring to law enforcement.
Dave Bittner: [00:04:24] That's interesting. That's actually lower than I would expect.
Ben Yelin: [00:04:27] It was eye-opening for me, too. Part of it is people might just - might not be plugged in enough to respond to these requests.
Dave Bittner: [00:04:33] Yeah.
Ben Yelin: [00:04:33] But then there are obviously very legitimate civil liberties concerns. It can feel very invasive, even for a voluntary program.
Dave Bittner: [00:04:40] Right.
Ben Yelin: [00:04:41] And there might be a perverse effect of seeing that law enforcement has requested the data because that might trigger in somebody's brain, oh, like, the police can actually get a hold of this. Like, what happens if they discover something that I've done that's been captured on my Ring security camera?
Dave Bittner: [00:04:56] Right. Right.
Ben Yelin: [00:04:57] And so that actually might be a deterrent effect. So even though the data that's being sent to the police department is not individualized, it still sort of gives an indication to Ring customers that law enforcement can potentially have access and has an ongoing, persistent relationship with Ring and with Amazon. And so I think, you know, that might awaken some people to potential privacy and civil liberties concerns.
Dave Bittner: [00:05:23] Yeah, this is really interesting to me. I have a lot of neighbors who have Ring devices. I do not. And I have no plans to get one. But you don't have to be a Ring customer to install this app on your phone. And I have the app on my phone. And so I will regularly get alerts from neighbors or people in my - not my immediate neighborhood, but the area around where I live, and they're posting videos of strange people walking by their houses. And some of them are legitimate things that you would be concerned about - people going through cars late at night, you know, if someone leaves their car unlocked. But other ones are, like, somebody walks by in front of the house and...
Ben Yelin: [00:06:01] Looks suspicious, yeah.
Dave Bittner: [00:06:03] Yeah, looks suspicious is in air quotes, right? (Laughter).
Ben Yelin: [00:06:06] Yes.
Dave Bittner: [00:06:06] But I agree with you that this cozy relationship between the police and Ring gives me a little bit of pause. At the same time, I can understand how this could be a good thing for law enforcement to try to establish a case. If something happens in a neighborhood and you have this sort of net of these video devices all over the neighborhood and you can establish where someone was at different times throughout the day or night, well, that's a law enforcement officer's dream, isn't it?
Ben Yelin: [00:06:35] It is. And the CEO of Ring - and maybe this is a statement that he or she - it's Jamie Siminoff - regrets at this point. But they said the company's goal is to have every law enforcement agency on the police portal. So I mean, it's an indication that although Ring is not intended to be a surveillance system, that could be one of its practical effects.
Ben Yelin: [00:06:52] Now, I've seen it used in our neighborhood, you know, Facebook group to solve crimes, and you have to recognize the value that it would provide to law enforcement. It's crowdsourced. Its coverage has become broader as more people get these devices. It goes places that traditional, like, blue light security cameras wouldn't be able to access, like just in front of a person's house, as opposed to, like, an overhead view of the street or something like that. So you understand the law enforcement value.
Ben Yelin: [00:07:19] From a perpetrator's perspective, you obviously have no rights over whatever video surveillance is taken as you walk by people's houses. You are out in public; just as it would be with any surveillance camera, you've relinquished your reasonable expectation of privacy. But for the users, I think there are significant concerns - even in a program where it's voluntary to share information. The company's reiterated that it's voluntary. Law enforcement has reiterated it. There might be a chilling effect to how much you share with law enforcement if you are aware of the close relationship between the manufacturer of the technology and the law enforcement agencies that you might not trust to begin with.
Ben Yelin: [00:08:01] The other element that I think is particularly alarming to people is Ring's now owned by Amazon, and Amazon has developed the most sophisticated facial recognition technology in the industry. And there's sort of this fear that over time Ring will start to integrate facial recognition. So not only can law enforcement request surveillance taken from a Ring device, but they would also be able to use facial recognition software to potentially identify that suspect. This is something that's sort of in its early development stages, but you can obviously see why that would pose a big red flag. There are obviously (laughter) many civil liberties problems with facial recognition software.
Dave Bittner: [00:08:39] Right.
Ben Yelin: [00:08:39] Like all sort of artificial intelligence, it's amazingly even more racist than we are...
Ben Yelin: [00:08:47] ...Which, you know...
Dave Bittner: [00:08:47] That's an interesting way to put it.
Ben Yelin: [00:08:48] ...Given humanity's history, is a difficult feat to accomplish. So, you know, that's when the relationship between Ring and law enforcement might lead to something that's more problematic.
Dave Bittner: [00:09:00] I wonder, too, about if, for example, law enforcement has an idea of who within a neighborhood does or does not have one of these Ring cameras. And that could be as easy as walking the neighborhood and looking at everybody's front doors because they have a distinctive look to them.
Ben Yelin: [00:09:15] Sure. So that would be the harder way to do it. The easy way to do it, which is what a lot of cities have done, has been to set up sort of Ring discount program. So sign up. Give us your name and address. We'll give you a discount on this Ring device, you know.
Dave Bittner: [00:09:29] But then are you obligated to share your footage in exchange for the discount?
Ben Yelin: [00:09:32] So you're not obligated to share your footage with them, but they know who has Ring devices.
Dave Bittner: [00:09:37] I see.
Ben Yelin: [00:09:37] So they would know who to contact, and they have - you know, you're on the proverbial list.
Dave Bittner: [00:09:42] Right, right. And I just imagine a police officer knocking on my front door and saying, hey, we're trying to solve a crime, and we really need your help here. That it's - that could be a hard request to turn down, even if you are someone who, you know, generally is not interested in sharing your information with the police. A one-on-one request like that, I could feel like they have influence on me to turn over that footage.
Ben Yelin: [00:10:08] Absolutely. It's always a very fine line between what's voluntary and what's being compelled, and this, I think, is very illustrative of that line. Technically, it's voluntary. But, you know, by refusing to comply with this request, you're potentially making enemies out of your neighbors because I've been on my neighborhood Facebook group. Everyone says, have you seen a guy with a blue hat and a gray shirt? Like, there's sort of this expectation among the community that if you have a piece of information, you'll bring it forward.
Dave Bittner: [00:10:36] Right, right (laughter).
Ben Yelin: [00:10:36] So there's sort of that communal pressure. And then you also potentially don't want to be on law enforcement's bad side.
Dave Bittner: [00:10:43] Right. And meanwhile, you're sitting on your couch in your blue hat and gray shirt, wondering what everybody has against you (laughter).
Ben Yelin: [00:10:48] Yeah, exactly. And why did I have to walk on that block where everyone had a Ring device?
Dave Bittner: [00:10:52] Right, exactly (laughter). You're just walking your dog.
Ben Yelin: [00:10:54] I was just walking my...
Dave Bittner: [00:10:55] Minding your own business.
Ben Yelin: [00:10:56] Exactly. I would say the broader lesson we can take from all of this is surveillance is so pervasive now it's sort of infested its way into all aspects of our life. And for each instance where we have this new, digital style of surveillance, where law enforcement could potentially gain access to it, you're going to run into these problems where the information is so valuable, and the reason it's valuable is because it's broad, it's comprehensive, it has a very wide reach. And so there's going to have to be this balance between giving law enforcement the information they need to solve crimes, which is obviously of the utmost public interest...
Dave Bittner: [00:11:32] Right.
Ben Yelin: [00:11:32] ...And making sure that users of these devices have their privacy protected, and it's a really, really difficult line to straddle.
Dave Bittner: [00:11:39] Yeah. It's interesting to watch it play out in real time. I can't help wondering how it will settle down by the time these sorts of things affect our young kids.
Ben Yelin: [00:11:49] Absolutely. You never know what technology is going to be in 30 years.
Dave Bittner: [00:11:52] Right.
Ben Yelin: [00:11:52] It's probably going to be something we couldn't even dream up in our wildest imagination. But, you know, that's what makes this area of law and policy so interesting, you know. It's not like we have a lot of settled case law related to smart devices that people use as doorbells and security cameras. It's - the devices are relatively new, the technology's new, and these relationships are going to have to be defined going forward.
Dave Bittner: [00:12:18] Yeah. All right, well, it's an interesting story and certainly one that's developing. We'll keep our eye on it.
Ben Yelin: [00:12:24] Absolutely.
Dave Bittner: [00:12:25] My story this week comes from The New York Times. This is an article by Kashmir Hill and Heather Murphy. It's titled "Your DNA Profile is Private? A Florida Judge Just Said Otherwise." Now, Ben, you and I have talked on the CyberWire about DNA tests. They've been used to solve some serious crimes. There was the case back in 2018 where they tracked down the man that they believe was the Golden State killer.
Ben Yelin: [00:12:50] Right.
Dave Bittner: [00:12:50] And they were able to...
Ben Yelin: [00:12:50] That's one of the - that was a remarkable story just because it had been one of those unsolved mysteries you'd see on, like, an old 1990s episode of "Dateline." I mean, it was so compelling. And then all of a sudden, it comes forward because of these genealogy databases.
Dave Bittner: [00:13:05] Right. And they were able to use DNA that had been submitted by this alleged perpetrator's relatives to connect back to this person.
Ben Yelin: [00:13:16] I've already called all my relatives and told them, do not use any of these genealogy databases; you're going to get us all arrested.
Dave Bittner: [00:13:23] (Laughter) I fear it's too late. And we'll get to the value proposition for different folks in a minute.
Ben Yelin: [00:13:27] Right.
Dave Bittner: [00:13:27] But what this story is about is that a Florida detective, recently, in the midst of a presentation at a police convention, shared that he had obtained a warrant to penetrate one of these DNA databases and search its full database. And they had nearly a million users on it. This was one of the smaller DNA database companies. This is GEDmatch, is the name of this one. And what's interesting to me about this is that the judge gave him a warrant, and this allowed him to search even the data of people who had opted out of sharing their information with people like law enforcement.
Ben Yelin: [00:14:04] Yes. So that particular company, GEDmatch, has an opt-out provision. Some of the other companies that you've probably heard of that do this genealogical work, like Ancestry.com and 23andMe, have even stricter policies, where they've pledged to - no matter what - keep users' genetic information private. It's one of those things where you can only keep it private as long as law enforcement does not get involved, and when they're able to obtain a warrant, even if you've made it clear to your customers that you are protecting their data, your hands might be tied.
Dave Bittner: [00:14:35] So it's going to be interesting to see where this goes. The detective who did this search, who got this warrant, his name is Michael Fields. He's from the Orlando Police Department. He says that this will likely lead to them requesting warrants from the larger DNA companies - the 23andMes, the Ancestry.coms. And my opinion when it comes to these sorts of things has been pretty much along the lines that I'm OK with you selling that to a judge, and a judge saying to the police officer, OK, we think this is worth doing, and here's your warrant. But I think it's really important for the police officer to have to make that case, that the police officer doesn't have just open access to the whole database.
Ben Yelin: [00:15:15] Yeah. I mean, that's what's particularly problematic to me because the perpetrator that they're going to find did not voluntarily submit their DNA, or at least most likely did not voluntarily submit DNA. It was submitted most often by very distant relatives who the alleged perpetrator probably had no relationship with.
Dave Bittner: [00:15:33] Yeah.
Ben Yelin: [00:15:34] So even though they're getting a warrant to search the database for evidence for solving a particular crime - and that would meet the traditional definition to satisfy the warrant requirements - you get into this hazy gray area where there's been no voluntary submission to these third parties, to these genealogy companies by the person who is the subject of the warrant. And so, to me, that's very problematic.
Ben Yelin: [00:15:59] The other thing that's problematic is users of these services will often see language in the agreements that says something like, we would never share your data under any circumstances, unless compelled by a law enforcement agency, you know, through a warrant. And most of us read that and say, that means they're never going to share our data. Now we have this one case where the police did the legwork. They were able to secure the warrant, although we don't know exactly what went into that warrant application.
Ben Yelin: [00:16:24] And so there might now be an increased awareness that just because they say they are going to protect your data, once the police come with a warrant, there's very little they can do as companies to stop it. And it'll affect their reputation in terms of protecting personal privacy, and that will probably end up affecting their bottom line.
Dave Bittner: [00:16:43] Yeah. And you know, that's a really interesting point because after this story broke, I saw a lot of people, particularly information security people, on Twitter and other places saying, hey, everybody, just a reminder - please, please don't submit your DNA to these databases because these are the things that can happen. Now, I understand that impulse, and I certainly respect anybody who doesn't want to submit their DNA for all sorts of good reasons.
Ben Yelin: [00:17:11] But, hey, you know what? If my great uncle's third cousin didn't want to get caught, they shouldn't have committed that crime in the first place, right?
Dave Bittner: [00:17:17] Well, but I think - yeah, but even beyond, you know, criminal stuff, it's the idea that your biometric information is forever.
Ben Yelin: [00:17:25] Yes.
Dave Bittner: [00:17:25] And it can't be altered. You can't - like, you can change a password; you can't change your DNA.
Ben Yelin: [00:17:29] You cannot.
Dave Bittner: [00:17:29] But I - just to sort of give everyone a friendly reminder that there are groups of people for whom these tests have a different value proposition. I have family members who are adopted and have found family through these things that they otherwise would not have found. And so...
Ben Yelin: [00:17:47] That's interesting. I mean, one of the things I was going to say is that mostly in third-party records cases, you're dealing with something where the user doesn't really have meaningful choice. So, you know, we've talked about call detail records. Like, in modern society, we generally don't have any agency as to whether we use a telephone. We have to for modern communications.
Dave Bittner: [00:18:07] Right.
Ben Yelin: [00:18:07] In order to carry on in modern life, we have to use some sort of financial services. So we're going to generate bank records. What I was potentially going to say is, at least for most people, submitting your saliva for DNA testing sounds like something you would do out of curiosity; it's not something that would be a necessity. But what you bring up, I think, is a very valid point, you know. Even if 99% of people who submitted their saliva for a DNA test were just curious about their genealogy, wanted to know, you know, what percentage Irish they are, what percentage Eastern European they are...
Dave Bittner: [00:18:41] Right.
Ben Yelin: [00:18:41] You're going to get those cases where it's for something more meaningful. And so I think that's something that we really have to keep in mind as people who are trying to balance these privacy concerns.
Dave Bittner: [00:18:51] Yeah. I think I saw another statistic today. It was something along the lines that - we're at the point now with the number of people who have done these tests that they can identify about 60% of the population, they think, by this sort of web of DNA tests that have been done. And you know, every...
Ben Yelin: [00:19:08] Of the world population?
Dave Bittner: [00:19:09] Of the U.S.
Ben Yelin: [00:19:10] Of the U.S. population, yeah.
Dave Bittner: [00:19:10] Of the U.S. And it seems to me that, you know, every holiday season, you get - more and more of these gets purchased as gifts. And so you sort of get a surge right after the holidays where that web gets filled in a little tighter, and you get more connections made every year. So I wonder, are we going to approach a point where it doesn't really matter if you submit yours or not because so many other people have that it can be determined who you are just by inference of all the DNA tests that are already in there?
Ben Yelin: [00:19:39] Right. Now, the counter to that is, what if there is a reverse movement because of these high-profile cases? So now we have this California case in 2018 where they solved this Golden State killer mystery through the use of this genealogy website. As more of these cases become high-profile, maybe people in the information security world and elsewhere might think twice about submitting their DNA, and it won't be as exponential as we would have expected. Probably just the fact that this article was printed in The New York Times made people think twice for the first time about the potential consequences to themselves and to society writ large about taking this test. And I think that itself could have a huge impact.
Dave Bittner: [00:20:20] Yeah. Yeah. All right, fascinating stuff, for sure. It is time to move on to our listener on the line.
[00:20:26] (SOUNDBITE OF PHONE DIALING)
Dave Bittner: [00:20:30] This week, we've got a listener who calls in with a question about email and being a small business owner. Here's the call.
Jessica: [00:20:38] Hi, guys. This is Jessica (ph) calling from Denver, Colorado. I'm a small business owner, and I'm wondering what my liabilities are when it comes to deleting or saving emails. Am I better off deleting messages that I no longer need, or should I keep them as documentation of exchanges with customers and clients? Can I get in any legal trouble for deleting these emails? Thank you.
Dave Bittner: [00:20:56] Interesting stuff. What do you think, Ben?
Ben Yelin: [00:20:58] So it depends on the - it's a great question. It certainly depends on the content of the emails. If it's one of those, hey, just want to get your attention about this meeting we're having next week, I think in the vast majority of circumstances there will be no consequences for deleting those emails. Where it starts to get more complicated is if emails contain certain records that are protected under federal law. So emails pertaining to business and financial records fall under a variety of federal statutes that require retention for a certain period, and that will depend on the type of financial records.
Ben Yelin: [00:21:31] There's something like the Income Tax Assessment Act. So anything relating to tax records, for example, has to be retained for five years. Something like employee records - so even, like, the initial communications between an employer and a potential employee, all those probably have to be maintained under either federal law or contractual agreements between the employer and the employee, or if the employee is a member of organized labor or member of a union, that might be a contractual obligation. Anything relating to, like, corporate records, that's obviously going to need to be retained for tax purposes and for legal liability purposes.
Ben Yelin: [00:22:08] So it really is going to be heavily dependent on what's contained in those emails. As a rule of thumb, anything that wouldn't constitute a record, something that couldn't be interpreted as something that would be useful to, I would say, the government or anybody else who might want to get access to something for tax purposes, corporate structure purposes, probably OK to delete it. Anything that has that type of information in it should be retained. And it's one of those things - the laws around data retention are complicated enough that it's probably a good idea to have an attorney on retainer, if you can get one, to give you exact advice on compliance.
Dave Bittner: [00:22:46] Yeah. I suppose it's safe to say that if you find yourself in some sort of legal situation and there are emails that might be relevant to that, that the thing not to do is to start frantically deleting things.
Ben Yelin: [00:22:57] Yes. It's always...
Ben Yelin: [00:22:59] I mean, I would say do not delete mindlessly because it's always better to have those records, especially, you know, if you are subject to an IRS audit...
Dave Bittner: [00:23:07] Right.
Ben Yelin: [00:23:07] ...And there's going to be some evidence contained in those emails. Your life is going to be extremely, unnecessarily difficult if you just decided to purge emails for no reason.
Dave Bittner: [00:23:15] Right. And they could hold that against - I mean, in terms of getting in trouble, I suppose some people could see - could say that the very fact that you started frantically deleting things points to the fact that you - maybe you had something to hide.
Ben Yelin: [00:23:27] Yeah. Certainly, in a civil proceeding, that could be an inference that you might have something to hide.
Dave Bittner: [00:23:31] Right.
Ben Yelin: [00:23:32] And that would probably reflect poorly on you.
Dave Bittner: [00:23:33] OK.
Ben Yelin: [00:23:34] But your everyday communications, your communications with staff, like, communications on things that are nonessential to the operating functions of your small business, so like...
Dave Bittner: [00:23:44] Right, right. Hey, I'm getting lunch. Anybody want anything? Yeah.
Ben Yelin: [00:23:47] Exactly. Like, there aren't going to be federal data retention policies on things like that.
Dave Bittner: [00:23:51] All right.
Ben Yelin: [00:23:52] So it's all about paying attention to the contents of the email, which would be the determining factor as to whether you should make a priority to retain it.
Dave Bittner: [00:24:01] All right (laughter). It's complicated.
Ben Yelin: [00:24:03] It is complicated. I say, when in doubt, don't delete it. Most of our email servers now have the capacity to hold a large number of emails.
Dave Bittner: [00:24:12] Yeah.
Ben Yelin: [00:24:12] So unless you're really trying to hide something, probably the safest bet is to always retain your records as a small business owner.
Dave Bittner: [00:24:19] Yeah. All right. Well, again, thanks to our listener for sending that in. We would love to hear from you. If you have a question for us, you can call and leave your question at 410-618-3720. That's 410-618-3720. You can also email us an audio file of your question. Just give us your name, where you're calling in from and your question, and we may answer it on the air. Coming up next, we've got my interview with former U.S. Secretary of Homeland Security Michael Chertoff.
Dave Bittner: [00:24:47] But first, a word from our sponsors, KnowBe4. So let's return to our sponsor KnowBe4's question - how can you see risk coming, especially when that risk comes from third parties? After all, it's not your risk, until it is. Here's Step 1 - know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk-management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendor's security risk requirements. You'll not only be able to pre-qualify the risk; you'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you get this in an effectively automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Check it out.
Dave Bittner: [00:26:10] And we are back. Ben, I recently had the pleasure of speaking with Michael Chertoff. He is the former U.S. secretary of homeland security, and now he heads up the Chertoff Group. Here's my conversation with Michael Chertoff.
Dave Bittner: [00:26:23] As secretary of homeland security, you had a unique perspective on the terrorist threats that our nation faced. How do you think those threats have evolved in the time since you left office?
Michael Chertoff: [00:26:34] We still have, of course, violent jihadi groups that want to carry out and do carry out terrorist attacks. We've done, I think, a good job of preventing large-scale attacks and particularly attacks launched by having people come from around the world into the U.S. as part of a big plan. That's not to say that we're out of the woods yet on that, but I think we've done a fair bit to reduce the risk. As a consequence of that, the jihadi groups moved to smaller-scale attacks or simply inspired attacks where they use social media or other kinds of communications to induce people to carry out an attack with a gun or an automobile or a homemade bomb. Those are harder to stop. Obviously, the scale is less than a 9/11, but it still can result in death and destruction.
Michael Chertoff: [00:27:26] Add to that the fact that we've seen a significant increase in extremist ideologies that are engaged in terrorist activity. We see it in shootings in churches, mosques and synagogues. Again, a lot of this has been promoted over the internet with recruiting and inspiration and, again, very difficult to prevent because often the people involved are not dealing with a lot of others. They simply have an online relationship with somebody, or they're inspired by a video on YouTube, and then they go off and carry out an attack. And we saw that, for example, in New Zealand, or we saw it at the synagogue in Pittsburgh. And this is, I think, an increasing problem.
Dave Bittner: [00:28:07] What about the efforts we've seen when it comes to influence operations? I mean, obviously, we had the interference with the 2016 elections. We see reports that the bots and the folks who are looking to have those sorts of influences are spinning up again for the next round of elections. Were those kinds of things on your radar when you were homeland security secretary?
Michael Chertoff: [00:28:28] Well, actually, when I was there, which was 2005 to 2009, we weren't really dealing with disinformation at the scale of what we have now. You go back 50 years or 75 years, the Soviet Union engaged in disinformation, what they used to call active measures, but that was, of course, in the pre-internet era, and it was kind of clumsy and almost laughable in many respects, although sometimes, in some parts of the world, they were quite successful.
Michael Chertoff: [00:28:54] What we've seen, though, since I left government is a significant increase in these operations, partly reflecting the increased tension between the West and Russia, partly a result of the 2014 Ukraine Maidan Revolution, which then led to the occupation of Crimea and the escalation of conflict between Russia and Ukraine, and partly dealing with the explosion of social media, which has created more opportunities for targeted, sophisticated information operations designed to create dissension, confusion and distrust.
Dave Bittner: [00:29:33] Where do you come down on the social media platforms? What sort of insights do you have when it comes to things like content moderation and the control of bots?
Michael Chertoff: [00:29:41] Well, I think, first of all, platforms ought to be working hard to control botnets that are used to manipulate search engines or manipulate whether something is popular or not because those are just efforts at deception. And frankly, they even undercut the interests of the social media platforms themselves. Content moderation is a more complicated issue because we do have free speech. It's an important value, and with the exception of some very narrow categories, our general approach is to say, look - the answer to falsehood is truth. So I think we want to be careful to avoid a regime of censorship with respect to content. But I would say that disclosure of who's actually putting up content, identifying when people are misrepresenting their identity and dealing with these artificial manipulations are something that the platforms ought to take responsibility to do.
Dave Bittner: [00:30:41] Where do you suppose we are when it comes to that balance between security and privacy? I'm thinking about technologies like facial recognition and some of those things that are on the horizon.
Michael Chertoff: [00:30:52] You know, facial recognition can be valuable, for example, useful when you try to open up your phone, and your face appears and the phone opens up. The question is, what happens with the data? And I think, increasingly, we need to think about the issue of privacy not just in terms of what gets collected but how the data is controlled. There may be uses for facial recognition that are perfectly appropriate, but you want to make sure they don't migrate over to something that would be very inappropriate or threatening. You might want to have facial recognition, for example, to get you access into your apartment or into your place of business, but you wouldn't necessarily want that to be transmitted to the government and be used as a way of surveilling what you do out on the street every single minute of the day.
Michael Chertoff: [00:31:42] So this is about making sure that there is a degree of control over data that's generated so that people aren't put in an all-or-nothing situation where either they don't participate at all in these internet activities or they wind up basically surrendering their private interests to commercial interests or government.
Dave Bittner: [00:32:03] Where do you come down when it comes to the so-called crypto wars, the whole notion of warrant-proof encryption?
Michael Chertoff: [00:32:10] Well, warrant-proof encryption, what it really means is this - the entity which receives the warrant doesn't have the key to decrypt the communication. And I understand that that's a heartache for law enforcement because often they'd like to see the content of a conversation. But the only way to make that generally available under current technology would be to weaken the encryption itself or to create a back door or some kind of duplicate key. And the problem is that would be a weakness - not just limited to people who are engaged in bad activities; it would be for everybody. And so if you wound up with a duplicate key getting stolen or there's a weakness getting discovered, all of a sudden, all your encrypted data would now be available to criminals or foreign adversaries. And I think the security downside of that exceeds the upside of being able to decrypt a particular conversation.
Michael Chertoff: [00:33:07] I'd also point out this. The government can have laws against encryption or very good encryption on, you know, well-known commercial platforms, but there will always be, available in the dark web, tools that you can use to encrypt conversations that will not be breakable, and those will be the ones that criminals go to. So in the end, all you'll do is weaken security for everybody, and you won't really have much of an impact on any smart criminals.
Dave Bittner: [00:33:37] What is your sense of how well we're doing as a nation in response to these threats? Are we in a situation where we're nimble enough to respond to them?
Michael Chertoff: [00:33:46] I think we are slowly awakening to some of the challenges we've talked about on privacy, on balancing security with encryption, on disinformation campaigns. So you're beginning to see legislation being passed in some of the states. Congress is beginning to propose things. But I will acknowledge that we've been somewhat slow off the mark, and it took a pretty dramatic set of events - like, for example, what happened in the 2016 election - for people to wake up and say, we better get on top of this problem.
Dave Bittner: [00:34:19] All right. Ben, what do you think of that?
Ben Yelin: [00:34:21] Well, first, major thanks to the secretary for coming on our humble podcast.
Dave Bittner: [00:34:26] (Laughter) It's very nice of him to make the time for us.
Ben Yelin: [00:34:29] Absolutely. He is obviously someone with a distinguished career, and I'm really glad that he was able to join us. A couple of things stuck out to me. The first is how he talked about the evolution of terrorist groups from the type of large-scale attacks we had on 9/11 and post-9/11 - they were perpetrated by groups with a very top-heavy structure, and instructions were given sort of in a hierarchical manner - to now these smaller-scale attacks where people are using social media or other types of communications to induce people, people who are susceptible to being influenced, to commit smaller-scale attacks - so mass shootings, knife fights, et cetera.
Ben Yelin: [00:35:07] What's interesting about that is a lot of our counterterrorism tools were developed to counter the threats that existed after 9/11, particularly some of the surveillance tools. When we've talked about call detail records, that was really important when we wanted to do call chaining to figure out who was talking to whom because we want to figure out whether there was a connection between a relatively low-level person and the senior leadership of al-Qaida. That's going to be very valuable information in that context.
Dave Bittner: [00:35:34] Right.
Ben Yelin: [00:35:34] But now, you know, those connections are less important, where it only takes, you know, one encrypted communication, one YouTube video to inspire extremism across the globe. So I think it's going to - and I think he spoke well to this - it's going to take a while for our legal regimes to adapt to the new reality, the new threats that we face in the present day.
Dave Bittner: [00:35:56] What about his take on the crypto wars? I have to say I think I agree with his perspective.
Ben Yelin: [00:36:01] Yeah, I was actually sort of surprised. You know, generally, somebody who's been in the federal government - you know, obviously, his service at the Department of Homeland Security predated some of the major crypto war cases we've seen, like Apple's battle with the FBI in the wake of the San Bernardino terrorist attack. But it was a pretty well-spoken and nuanced take on the danger of giving the government sort of a duplicate key or a back door into these encrypted devices and how the costs of doing that both in allowing the government to access private data and potentially opening that door for bad actors, cyber criminals or foreign adversaries, that ends up outweighing the benefit of getting information that's on a single device.
Ben Yelin: [00:36:44] And I think, to be honest, that's not something I would have expected to hear from somebody who led the Department of Homeland Security. I would be more surprised perhaps once you're out of the federal government - you've had a few years to digest your service. I think he's - you know, he was secretary up until 2009. So we're all getting old here. It's been 10 years.
Dave Bittner: [00:37:02] (Laughter).
Ben Yelin: [00:37:03] You might have more latitude to sort of take a step back and think about what the broader consequences of these policies are because, I mean, the official position of the Justice Department in both the Obama and Trump administrations has basically been as many back doors as we can get, we want them. We don't want encryption to prevent us from being able to solve either terrorist threats or garden-variety crime. So I thought it was really interesting to hear him have that perspective.
Dave Bittner: [00:37:28] Yeah.
Ben Yelin: [00:37:28] I think another point that stuck with me is how we are slowly awakening to a lot of these challenges. I think it did take something like the 2016 election and the Mueller investigation and everything we've found out since then to recognize this challenge of disinformation on social media. I think it just was sort of off of our radar, even though the problem, as he said, has existed for a long time and has existed in its current capacity for several years, dating back to at least 2014, 2015. So I thought that was interesting as well.
Dave Bittner: [00:38:00] It really struck me that it seems like a lot of his views are really coming from a practical point of view.
Ben Yelin: [00:38:06] Yeah. He has a unique perspective because he's been on the inside, but he's also spent the last 10 years as an analyst. So I think he can come at it from, you know, wearing those two different hats. But I'm cool with whatever he says as long as he says it on our podcast.
Ben Yelin: [00:38:20] But I was particularly pleased to hear that he had more civil-liberties-friendly takes than perhaps I would've anticipated.
Dave Bittner: [00:38:28] Right.
Ben Yelin: [00:38:28] So I think it's sort of eye-opening. And you know, the broader lesson here is that I think there's more widespread recognition that cuts across political ideologies of the evolution of threats against us. We are no longer dealing with the threats that existed in the post-9/11 world, even though, you know, technically, we still have the authorization for the use of military force in Afghanistan. Technically, the Patriot Act is still in effect. But just because those policies are there doesn't mean that the threats have remained static.
Ben Yelin: [00:39:00] And so I think sometimes it's important for all of us to step back and be like, OK, what is really a problem now? And I think he was really apt in talking about smaller-scale terrorist attacks inspired online, inspired by extremists using online platforms, using algorithms, using social media manipulation and the spread of disinformation. And this is the kind of stuff that keeps me up at night.
Dave Bittner: [00:39:24] Yeah.
Ben Yelin: [00:39:24] That and my baby. But those two things - the things that keep me up at night.
Dave Bittner: [00:39:29] (Laughter) Right, right, right. Well, again, we want to thank former U.S. Secretary of Homeland Security Michael Chertoff for joining us. He is now the head of the Chertoff Group. And I also want to send out a thanks to everyone on our staff here at the CyberWire for making that interview happen and putting together all the details, dotting the i's and crossing the t's to make that possible - a really great crew here. And that is our show.
Dave Bittner: [00:39:52] We want to thank all of you for listening, and of course, we want to thank this week's sponsor, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: [00:40:12] Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: [00:40:34] And I'm Ben Yelin.
Dave Bittner: [00:40:35] Thanks for listening.
Copyright © 2020 CyberWire, Inc. All rights reserved. Transcripts are created by the CyberWire Editorial staff. Accuracy may vary. Transcripts can be updated or revised in the future. The authoritative record of this program is the audio record.
KnowBe4 is the world's largest security awareness training and simulated phishing platform that helps you manage the ongoing problem of social engineering. Their new-school security awareness training platform is user-friendly and intuitive. It was built to scale for busy IT pros who have 16 other fires to put out. Learn more at KnowBe4.com.