The practical concerns of data encryption.
Tommy Ross: The way that encryption is being used today is different than the way it was used 10 years ago, but also, the types of encryption that are out there are different and more diverse.
Dave Bittner: Hello, everyone, and welcome to another episode of "Caveat." This is the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hi, Dave.
Dave Bittner: On this week's show, Ben follows developments on the Clearview facial recognition story that The New York Times recently broke. I've got a story about the government compelling some tech giants to help find a WhatsApp drug dealer. And later in the show, my conversation with Tommy Ross - he serves as senior director of policy at BSA, the Software Alliance, and we're going to be discussing encryption and law enforcement access to data. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors.
Dave Bittner: And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent the recurrence of a single nonrepeatable event. Others say it's a way the suits play CYA. Still others say it's whatever happens to reside on those binders the consultants left behind them right before they presented their bill. How about a more positive approach? As KnowBe4 can tell you, better policy means better security, and getting the policies right is a big part of security. So is setting them up in ways that your people can actually follow. We'll hear later in the show how you might approach policy.
Dave Bittner: And we're back. Ben, why don't you kick things off for us this week? The continuing story of the folks at Clearview got some new developments this week. What do you got for us?
Ben Yelin: Yes, the never-ending saga of Clearview AI.
Dave Bittner: (Laughter).
Ben Yelin: We're on episode three of discussing this story, and I think that's a worthwhile thing to do because it's a pretty big development in digital privacy. So a couple of weeks ago, The New York Times reported on this company, Clearview AI, that was scraping data from - or currently is scraping data from - a bunch of social media platforms and other websites like Google. It allows the matching of, you know, potentially, photographs that law enforcement or anybody else takes on the street to publicly available images that have been scraped from those sites.
Ben Yelin: And this set off a controversy. We had federal lawmakers writing letters to Clearview AI and proposing that, you know, new federal privacy statutes get passed in the wake of this expose. And I know we talked about it last week. The first lawsuit against this company was filed in a federal court in the state of Illinois. The newest development is that a couple of these companies - particularly Google and YouTube, but also Twitter - have sent cease-and-desist letters to Clearview AI and its CEO. And the cease-and-desist letters claim that Clearview AI's practice of scraping violates these websites' or these applications' terms of service. And it was very interesting to hear the CEO's response to that, which we actually have in an audio clip from an interview he did on CBS News.
(SOUNDBITE OF TV SHOW, "CBS THIS MORNING")
Errol Barnett: Twitter has sent you a cease-and-desist letter, and they've gone as far as to say they want you to delete the data you've collected and to pull it back from any third parties who have received it. I've seen Twitter results in what we just searched. So are you aiming to comply with Twitter, or what is your response to what they're saying?
Hoan Ton-That: You know, we've received their letter, and our legal counsel has reached out to them and are handling it accordingly. But there is also a First Amendment right to public information. So the way we have built our system is to only take publicly available information and index it that way. So that's all I can say on the matter.
Errol Barnett: So as far as Twitter is concerned, despite their request that you stop using their platform for your service, you're dealing with it legally.
Hoan Ton-That: Yes.
Errol Barnett: And you believe you have a First Amendment right...
Hoan Ton-That: Completely.
Errol Barnett: ...To access what's on their...
Hoan Ton-That: Yes.
Dave Bittner: All right, Ben. First Amendment right here - unpack that, please.
Ben Yelin: So this is sort of like when somebody gets in trouble at school and - you know, for mouthing off to the teacher, and they say, oh, well, I have a First Amendment right to do that. You can't...
Dave Bittner: Right.
Ben Yelin: ...Put me on timeout.
Dave Bittner: (Laughter) OK. I see that argument come up on Facebook a lot too. You know, someone deletes a post or something, and people will say, this is my First Amendment right to...
Ben Yelin: First Amendment rights, yeah. So your first reminder is that the first words of the First Amendment are that Congress shall make no law. That means that this is a right against the government curbing your free speech. So in terms of Twitter sending a cease-and-desist letter, from a First Amendment perspective, there's nothing that's binding Twitter or any of these other tech companies from trying to prevent these scraping practices.
Dave Bittner: This is a B2B exchange. The government doesn't have really much to do with it.
Ben Yelin: It does not.
Dave Bittner: OK.
Ben Yelin: So the government's not involved at this point. Now, beyond a cease-and-desist letter, we could get to the point where some of these companies try and get an injunction to, you know, get the courts involved and get Clearview AI to cease scraping this data.
Dave Bittner: OK.
Ben Yelin: And things would actually get a little bit more complicated. I wouldn't necessarily call it a First Amendment issue. I think that's sort of an absurd framing that he's using because people know what the First Amendment is.
Dave Bittner: OK.
Ben Yelin: I think it gets to an interpretation of a federal statute - the Computer Fraud and Abuse Act. So that law basically prevents unauthorized access to data. It's an anti-hacking law. So it prevents me from trespassing into somebody else's digital data and stealing it.
Dave Bittner: Right.
Ben Yelin: There was sort of a lack of understanding on how the Computer Fraud and Abuse Act in general would apply to scraping - you know, this practice of taking publicly available information from these sites. And we actually got a little bit of clarity on that in a case in - that we actually talked about on this pod, I believe - hiQ Labs v. LinkedIn.
Dave Bittner: Yeah.
Ben Yelin: And in that case, the 9th Circuit Court of Appeals based in California ruled that automated scraping of publicly accessible data probably does not violate the Computer Fraud and Abuse Act. Now, I'm not sure how widely applicable this decision is. First of all, let's say, it's an appeals court decision, so there wasn't any sort of nationwide injunction involved. This isn't the Supreme Court talking.
Dave Bittner: Right.
Ben Yelin: But this potentially could be precedent-setting if it's adopted by other courts. Now, the court in that case did not consider something of the scale or the magnitude of what Clearview AI is doing. Yes, that case addressed automated scraping but not to sort of the degree that we're seeing it here. So that's where Clearview AI might run into trouble.
Dave Bittner: How much do these tech companies like Twitter and Google - how much legal backup do they have when it comes to their terms of service? In other words, if they're saying, if you use our service, you agree not to scrape our data, is that all they need to say, and then that's that?
Ben Yelin: Yes. So for the most part, that's absolutely true.
Dave Bittner: OK.
Ben Yelin: Now, because these are - have become public platforms, you know, there's a little bit of wiggle room there that we've talked about on previous podcasts. More interestingly, those are user terms of service. So scraping gets into that sort of nebulous territory where the company doing the scraping is not necessarily using the service. They're not Twitter users themselves, just as that hiQ company who was scraping information off LinkedIn - they never signed on to or agreed to LinkedIn's terms of service because it's not like they were setting up a LinkedIn account.
Ben Yelin: I think that's where things get a little bit nebulous. And if our only guidance is this past case in the 9th Circuit, then I sort of think the CEO of this company, while he was misguided in using First Amendment language, might have a point. He might be able to plausibly claim that he can't be prosecuted under the Computer Fraud and Abuse Act simply for engaging in scraping of publicly available information even if that violates the user terms of service, if that makes sense.
Dave Bittner: Yeah. What are the odds that this comes down to an our-lawyers-are-bigger-than-your-lawyers situation, where the might of Google and Twitter goes up against a comparatively small company like Clearview and comes at it that way?
Ben Yelin: Clearview AI is not going to be alone in answering this lawsuit. A couple of advocacy groups have stood up for companies like hiQ and probably will stand up for companies like Clearview AI because they believe in the value of public information, and they believe that there could potentially be a chilling effect on researchers, on media if there's sort of a blanket ban on scraping.
Dave Bittner: I see.
Ben Yelin: So for instance, in the hiQ case, the Electronic Frontier Foundation wrote an amicus brief, a friend of the court brief, arguing on behalf of hiQ. And I certainly think that they would get involved in this case in some capacity, as would the ACLU. And so while, you know, Clearview AI, I'm sure, you know, with some of their investment capital and, you know, considering how well this company's done, could probably afford a decent lawyer - even though they're going up against these tech giants, they're going to have a pretty robust team on their side. And there are a lot of good data privacy lawyers out there. So I think it's not going to be one of those cases where Twitter can just bully them out of court.
Dave Bittner: I see. Well - and then I suppose also there's the factor that this could potentially spark some regulation on this sort of thing because of all of the attention that it's getting in the - that court of public opinion.
Ben Yelin: Absolutely. So we've already had two United States senators write letters, as I mentioned, complaining about Clearview AI. One thing that they could do immediately, if Congress was inclined to act, is amend the Computer Fraud and Abuse Act to outlaw this type of scraping. You know, a federal statute would go a long way. We've had the 9th Circuit claim that in their interpretation of the current Computer Fraud and Abuse Act, this type of scraping of public information is legal. Congress can say, no, it's not, and then that case would come out very differently. Now, if it got to constitutional claims, of course, there are limits on what Congress can do.
Dave Bittner: Right.
Ben Yelin: But because I'm doubtful of the First Amendment arguments, I think Congress could actually quite reasonably step in here and add some clarity to the Computer Fraud and Abuse Act.
Dave Bittner: Well, it's an interesting continuing story, and we will continue to follow it. I think it's fascinating. So stay tuned. We'll see how this one plays out.
Ben Yelin: Yeah. We might have to do our Clearview AI part four update on the next podcast.
Dave Bittner: (Laughter) Right. Right. All right. Well, my story this week comes from Forbes, written by Thomas Brewster. And it's titled "Feds Order Massive Number of Tech Giants to Help Hunt Down One WhatsApp Meth Dealer." This is very interesting. It has to do with the government - I believe specifically the DEA - making requests of Google, WhatsApp and a number of these large tech companies. And they call them pen traps, which is where they're looking for broad metadata on the communications.
Dave Bittner: Let me back up a little bit. So there's this person that the DEA is trying to track down who they allege is a meth dealer, and they're trying to use this person's use of social media, asking for all kinds of information to try to get more information on this person of interest, right? But a lot of the information they're asking for would be outside of the normal types of things that these tech companies would be able to provide. What's your take on this, Ben?
Ben Yelin: So it's a very interesting case. So there's this very high-profile Mexican meth dealer. The government has been very interested in this person. And as you said, the government went to WhatsApp and, using a subpoena, demanded that WhatsApp hand over all of these subscriber details. And, well, this is all metadata. Sometimes this metadata can be revealing, and sometimes it can be almost as useful as content. If you're finding out which phone numbers this person is calling, that can obviously implicate other people...
Dave Bittner: Right.
Ben Yelin: ...In the commission of a crime.
Dave Bittner: So just for clarity's sake, the metadata is, in other words - not the contents of the call, not the contents of the message, but all that surrounding information...
Ben Yelin: Exactly. So it's...
Dave Bittner: ...The time, the duration...
Ben Yelin: Right.
Dave Bittner: ...That sort of stuff.
Ben Yelin: It's data about data.
Dave Bittner: Yeah.
Ben Yelin: And you know, the way we usually try and think about it in a non-digital sense is that's the routing information that would be on the front of the envelope rather than the letter that is in the envelope.
Dave Bittner: OK, sure.
Ben Yelin: The DEA is being very aggressive. So not only did they issue that demand from WhatsApp; they also sent a letter to WhatsApp asking what WhatsApp would not provide. And, you know, WhatsApp, which prides themselves on protecting encrypted information, gave the DEA and the federal government a list of digital items that they would simply not provide because that's information that they do not collect.
Dave Bittner: Oh.
Ben Yelin: And the government, in some ways, probably knows that it's asking for data that WhatsApp doesn't have and it's data that they will never actually receive. Now, not only did the DEA try to get this information from WhatsApp, but they also went to Google and a bunch of other telecommunications providers...
Dave Bittner: Yeah.
Ben Yelin: ...And even used language as broad as to say, any other provider or any wire or electronic communications services, to get sort of any data they could obtain, any details on some of the accounts tracked by Facebook's encrypted Messenger application.
Dave Bittner: How does that possibly work?
Ben Yelin: Yeah, good time to step back there. Yeah.
Dave Bittner: (Laughter) How does it possibly work? If the DEA says any - basically, hey, anybody, anybody out there, is it my responsibility as a provider of these services to be constantly scanning for potential DEA orders that don't call me out by name?
Ben Yelin: Well, presumably, you'd only begin to search, as one of these companies, if you received a lawful subpoena.
Dave Bittner: I see (laughter).
Ben Yelin: But yeah. I mean, I thought that language was hilarious in how overbroad it is. At least one speculated reason this search was so overbroad is that the DEA doesn't want to have to go back - you know, once they try and piece together some of this meth dealer's contacts, his whereabouts, they don't want to go back and separately ask for new data in a bunch of new legal filings. I think that would be burdensome to them. They want to have sort of the legal authority to collect all the information they can, including information that they don't yet know that they want or need.
Ben Yelin: The standard for probable cause is based on a decades-old Supreme Court case, Illinois v. Gates. And in order to get a probable cause determination, you have to have something that's substantiated based on the totality of the circumstances. Generally, that doesn't mean the information you're seeking as part of a search warrant or any application for information has to be exact, or even that it's more likely than not that you're going to find something.
Ben Yelin: It's a relatively low standard, and I'm just not sure that's well-suited for the digital age. I'm not sure the standards are high enough because the amount of information that could potentially be available on a single user is so limitless that it's hard to imagine getting, you know - a court approving such a blanket order when there's no specificity alleged within each particular application. It's not like the government knows that he said X in Y Facebook communications.
Dave Bittner: Right. Right.
Ben Yelin: So it's sort of one of those throwing stuff at the wall to see what sticks and covering all their bases before they know exactly what information they need to obtain, and I think that could potentially be dangerous. I mean, if you applied this in other cases, you could see these types of blanket orders, where there's sort of an open tab on information that can be collected about a user, and it doesn't require the government to do real law enforcement work, which is actually alleging that there's an important piece of information contained in a discrete communication.
Dave Bittner: Right. A couple of interesting notes in this article - they point out that it's not the first time that the government has asked a bunch of tech providers to supply people's information in one fell swoop. There was another of these pen traps that Forbes covered, which a San Francisco court signed off on last year, and it ordered over 50 telecom companies to hand over user data - but several of the telecom companies listed were no longer in business.
Ben Yelin: Yes.
Dave Bittner: Right.
Ben Yelin: They mentioned Adelphia, I think, which is a company I haven't really heard anything about in the last 10 years.
Dave Bittner: Right.
Ben Yelin: And I actually didn't realize until I read this article that they no longer existed. But yeah.
Dave Bittner: Kind of hurts your case when there's a lack of attention to detail in your request, I suppose.
Ben Yelin: Yeah. And that's what's generally problematic about these future-oriented broad search requests, is, it's potentially limitless to the extent that you could be requesting information from companies that no longer exist in the next, you know, two or three years.
Dave Bittner: Yeah.
Ben Yelin: And that, you know, goes beyond, potentially, the life of this crime and, if this person is ever apprehended, the life of this investigation.
Dave Bittner: The other thing that this reminded me of - we actually covered a story over on the "Grumpy Old Geeks" podcast which was about Google instituting a nominal fee to provide certain types of information to law enforcement. And what we were reading between the lines was that basically, this was Google trying to put some rate limiting on this sort of thing, to say, OK, law enforcement, you know, this is a burden for us to be able to provide this information; it costs us money to do it, so we're going to charge you not a whole lot of money, but some money. So you have to think twice before just making these blanket requests because it's going to hit you in the pocketbook.
Ben Yelin: Yeah. I mean, it's sort of like - you hear policy proposals for a financial transaction tax, and those aren't designed to prohibit people from engaging in financial transactions. It's designed to protect against automatic transactions that just go on in perpetuity and go undetected.
Dave Bittner: Right.
Ben Yelin: And I think that a similar principle is at play here. Now, I don't know, you know, if law enforcement would ever go for something like that. I mean, certainly, in the course of normal police investigations, any third-party holder of data is probably not going to be able to charge law enforcement to gain access to that data. That's sort of inherent in doing business. So I don't know the particulars of that, but I don't see that being a, you know, broader solution to this problem of overbroad search demands of these tech companies.
Dave Bittner: Another interesting one we'll have to keep our eye on to see how this plays out. It is time to move on to our Listener on the Line.
(SOUNDBITE OF PHONE DIALING)
Dave Bittner: This week's Listener on the Line actually sent us an email. This is from a listener named Evan (ph), who wrote in with some interesting information that they wanted to share. They said, hey, y'all, longtime fan of the CyberWire - been loving the content of the "Caveat" podcast. Thanks for all your hard work. Well, Evan, you're welcome. He says...
Ben Yelin: We love you too, Evan.
Dave Bittner: (Laughter) Evan says, one of the comments by the co-author of The New York Times article made me reflect on something I've been battling in the tech industry for years, and that's proper documentation. His comment on more transparency from companies and data flow diagrams is key to progression in this aspect. One thing I would like to note is that the health care industry is taking this step forward. Have you ever heard of the MDS2 document, Ben?
Ben Yelin: I had not until I read this email to us, and Evan helpfully included links.
Dave Bittner: Yeah. It's a series of questions for medical device manufacturers to let their customers know the state of cybersecurity of the device they have in the market. One of the documents they reference is a data flow diagram for the system. The MDS2 documentation helps bring data to the forefront of the manufacturers' minds and places more responsibility on them to let their customers know how patient data is handled. And Evan helpfully sent us a link to this document, which we will also include in our show notes.
Dave Bittner: Evan says the central theme here, though, is that this documentation comes from industry and is not federally instituted. If Congress is to make it anywhere with privacy laws, they need to take note of what the private sector has been doing for years and collaborate to create legislation that makes sense. If you have congressmen that aren't tech-savvy, these types of bills will end up in the recycle bin by lunch.
Ben Yelin: Congressmen can be not tech-savvy?
Dave Bittner: (Laughter) Hard to imagine.
Ben Yelin: I'm floored.
Dave Bittner: (Laughter) Hard to imagine, I know.
Ben Yelin: How dare you suggest that, Evan?
Dave Bittner: Well, yeah (laughter). Ben, what do you make of this? I found this pretty interesting.
Ben Yelin: It's very interesting and very helpful. The private sector usually does end up taking the lead, largely because they're forced to. Health care organizations, hospitals, doctors' offices - they need their information protected, and they need to keep patient information private. You know, that's their obligation under HIPAA laws. And in the absence of any sort of federal regulation, in order to make sure that their information is protected, they're going to be relying on innovation in the private sector.
Dave Bittner: Right.
Ben Yelin: And the private sector is sort of always going to be leading the way here, largely because of Congress's inertia. And it's not necessarily Congress' fault. I mean, it's difficult to pass legislation. It's difficult to anticipate problems that might arise in the future. And that's what's so good about having a dynamic and competitive private sector that can come up with these solutions in the first place. And I think just having this level of transparency so that companies know exactly how their data is protected and is going to be protected is extremely valuable.
Dave Bittner: Right. And having it standardized in that way - in other words, the - this comes from NEMA, the National Electrical Manufacturers Association. So I suppose for them to be able to say to their members, hey, everybody, we're all in agreement; this is a standard we're going to abide by in terms of how we communicate what we're up to - that's a good thing.
Ben Yelin: Right. And we see that all the time as it relates to standard-setting. The government itself has standard-setting organizations like NIST.
Dave Bittner: Right.
Ben Yelin: But while NIST, you know, is more efficient at developing standards, including the NIST framework, than the federal government is at passing data privacy legislation...
Dave Bittner: Yeah.
Ben Yelin: ...The private sector oftentimes is ahead of NIST, and they're developing their own standards that can be adopted voluntarily across the industry, and I think that's something that's very valuable and useful. And, you know, sometimes, because of its institutional expertise and because companies can outcompete one another to have the best, most robust standards and to offer the best services to their customers, the private sector's solutions can be better than the government's.
Dave Bittner: And also, I guess, important to note that this is also in their self-interest, that they may be doing these sorts of things to head off some sort of government regulation.
Ben Yelin: Absolutely. And that's frequently what private standard-setting organizations do in the first place.
Dave Bittner: Yeah.
Ben Yelin: Sometimes, you know, they think that any potential congressional statute or federal regulation is going to be more burdensome on them than voluntary standards that they might already be abiding by. So you're right. It's - this really could be a way to preempt what Congress might do in the future. And I think it would behoove Congress to look at some of these private standard-setting companies and organizations and take their guidance when they try and come up with their own data privacy regulations.
Dave Bittner: All right. Well, again, thank you, Evan, for sending that in. We do appreciate it - interesting stuff, indeed. And we would love to hear from you. We have a call-in number for "Caveat." It is 410-618-3720. You can call and leave your question. We would love for you to send us an audio file. You can email that to email@example.com. Also, if you'd like to email us your question, we will answer that on the air as well. Coming up next - my conversation with Tommy Ross. He is the senior director of policy at BSA, the Software Alliance. We're going to be discussing encryption and law enforcement access to data.
Dave Bittner: But first, a word from our sponsors - and now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there, you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound, but that they're also distributed, posted and implemented. That's where the policy management module of their KCM platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that - KCM does the job at half the cost in half the time. It's your policy, after all, implemented in a user-friendly, frictionless way.
Dave Bittner: And we are back. Ben, I recently had an interesting conversation with Tommy Ross. As I mentioned, he is the senior director of policy at BSA, the Software Alliance. And we talked about encryption and law enforcement and how they access data. Here's my conversation with Tommy Ross.
Tommy Ross: We're kind of stuck in a cycle that has repeated itself several times over the years in which the conversation is really focused on how law enforcement can either mandate or persuade industry to voluntarily implement exceptional access to encryption in order to help law enforcement access digital evidence. And obviously, that's a proposition that has proved to be unattractive in the past and I think has generally just resulted in a stalemate, and I think that's what we're seeing now. And I think it ultimately misses a big opportunity because digital evidence as a forensic discipline is, No. 1, evolving very quickly. And No. 2, it takes into account a lot of considerations that go well beyond encryption. And we're missing the opportunity to make progress in helping law enforcement address a lot of those considerations while we're focused on that stalemate.
Dave Bittner: Yeah. It seems to me like within this stalemate, the two sides are almost talking past each other, like there really isn't much give or take on either side. Is that accurate?
Tommy Ross: I think that's absolutely right, yes.
Dave Bittner: And why do you think there's no movement there? What's holding them back? Are they just standing their ground?
Tommy Ross: Well, I think there's a lot of things going on. I think there is some misunderstanding about the nature of the technology. I think you've seen in some of the proposals coming out from different government leaders, you know, relating to responsible encryption and that kind of thing, where there's this belief that industry can just sort of wave a magic wand and create exceptional access or backdoor access to encryption without somehow weakening encryption.
Tommy Ross: And I think many experts over the years have looked at that and determined that that's really not the case. But there is this kind of belief that industry could do it if it really wanted to. So I think that's part of the problem. I think another part of the problem is that the law enforcement community, in fairness, has looked at trends in relation to the adoption of encryption across a number of different communications platforms over the years and has seen some troubling signs about their ability to continue to access things that they've gotten really used to accessing.
Tommy Ross: I think, in addition to all that, there are political reasons that, you know, color this debate, including that, regardless of how much a priority from a technical standpoint addressing encryption should be, there are a lot of law enforcement officers around the country who are cheering the Department of Justice on when they take on this debate because it's something that there's some emotion around. So they have some supporters across different communities. And of course, this is seemingly a good time to focus criticism on the tech community, so I think there are those political undertones as well.
Dave Bittner: Do you see any movement from either side? I mean, where do you suppose this is headed?
Tommy Ross: I am actually optimistic that there is an opportunity for progress even in spite of the stalemate because I think even while the focus in the interagency process has been on encryption, there are efforts to start to look at other parts of the puzzle here. Representative Val Demings and a host of other members of Congress, both Republican and Democrat, introduced legislation two weeks ago, I think, that would establish some infrastructure in the Department of Justice to tackle digital evidence challenges more effectively. And I think that's really promising. It's really encouraging to see that it's bipartisan legislation. And I think it's an opportunity to dig into some of those challenges that I've been alluding to.
Tommy Ross: Also, I'll say that members of BSA, some of the companies that are part of BSA's membership, have been engaged in a series of dialogues with law enforcement officers at the state, local and federal level about some of these issues in ways that are not designed to be public or in the headlines but are really sort of working sessions behind the scenes. And I think we've seen a lot of interest.
Tommy Ross: I mean, first of all, those kinds of sessions illustrate how diverse the challenges are for law enforcement in terms of trying to access digital evidence. And they illustrate how many opportunities there are to move forward in - you know, with kind of win-win solutions that would make the process for law enforcement requesting access to digital evidence in the possession of technology providers run much more smoothly. The dialogues also show that there is not only an appetite but also an ability for both of these communities to get together behind closed doors and roll up their sleeves and really, you know, sit down and make progress on hashing out practical issues.
Dave Bittner: You know, I think, certainly, the high-profile cases are the ones that gather our attention and the ones that we hear about. Are there issues in the day-to-day operations, for the folks who are doing day-to-day law enforcement tasks? Can you give us some examples of where they're finding themselves frustrated by things like encryption?
Tommy Ross: Yeah, I'd be glad to. Encryption is sort of one challenge, but I think there are all these other challenges out there that, based on admittedly limited survey data from law enforcement officials around the country, seem to be bigger problems than encryption itself. And one of the big problems is that many law enforcement officials don't know where to go to get certain types of data or even what kinds of data to access.
Tommy Ross: And part of that challenge is not just a lack of training, but it really reflects the rapid evolution of the technological landscape. You know, you think about all the different platforms that are out there collecting data with the advent of the internet of things.
Tommy Ross: I mean, we're not just talking about cellphone records anymore. We're talking about home assistants. We're talking about GPS devices. We're talking about - you know, we've even seen court cases where law enforcement have been seeking data from smart energy meters. There is this wide array of data-collecting devices out there. The types of data they collect are often obscure, or at least obscure to the law enforcement community. And even if you know that there is data that you might want, it can be very difficult to understand how to frame the request to get just the data that you want instead of a lot of data that might be irrelevant and might actually slow down your investigation.
Tommy Ross: So, for example, if you have a suspect for a crime and you want to understand their communications history with a certain group of people and you want to get their text messages and that kind of thing, if you go and you ask for their entire cloud backup of their phone, that's going to give you a lot of data that you have to sift through that is completely irrelevant to the specific communications you're looking for. And so law enforcement is really wrestling with not only how to train the law enforcement personnel across the country to understand what the data is and how to access it, but also trying to put in place mechanisms to sort of streamline that process and improve communications with providers so that they can work out the requests in real time, in ways that are consistent with legal protections and safeguards but also that ultimately speed up investigations.
Dave Bittner: Is part of this also having the tools and the knowledge to make your case in front of a judge - if you're trying to get a warrant to get some information, to put it in terms that a judge can understand?
Tommy Ross: Yes, absolutely. And I think there is a need for the kind of training and education that I'm talking about, but also access to technical support, not just among criminal investigators but also prosecutors and judges and, really, everyone that's working throughout the criminal justice system. But certainly the development of warrants by criminal investigators - you know, putting in front of judges warrants that outline the specific data being sought, why it's relevant, the probable cause and all of that in clear and specific terms that judges can understand - is a big challenge. And I think that is one of the challenges that we see most prominently causing frustration around some of these investigations.
Tommy Ross: I should also say, you know, it's not just a matter of - I don't want to oversimplify it. It's not just understanding what the data is and being able to access it better. There are a lot of legal and procedural issues - you know, the Carpenter decision from the Supreme Court last year suggested that stored geolocation information was content information for the purposes of the Stored Communications Act and would require a warrant. That's going to have ramifications for a lot of different types of information. And that's something where courts and the Congress will have to work through some additional challenges. Cross-border access is one that Congress has been working on with the CLOUD Act, but that is still being implemented, and it's very important because it involves another set of challenges.
Tommy Ross: And I think there are a range of others as well that sort of fall on the spectrum from the in-the-weeds investigative procedure to, you know, big legal and relationship issues between the technology providers, the law enforcement community and the community of consumers and users of the technology that are impacted by these laws.
Dave Bittner: So I know that recently you and your team there at BSA have actually introduced a framework for encryption and law enforcement access to data. What's in there? What do we need to know about that?
Tommy Ross: Yeah, we did, about a month ago or so, and it's available on our website. It lays out best practices for law enforcement, for policymakers and also for the tech community. And we thought it was important not just to focus on law enforcement but to recognize that all the stakeholders involved - policymakers, law enforcement investigators and others involved in the criminal justice system and the technology providers themselves - have important responsibilities to make the process involved in enabling investigative access to digital evidence work as smoothly and as quickly as possible, again, within the bounds of the safeguards that have been put in place to protect people's privacy and due process.
Tommy Ross: So we laid out a number of different best practices. When it came to law enforcement, there were best practices that were intended to get at some of the challenges that I was discussing earlier about, you know, developing warrants with as great a degree of specificity as possible, really targeting the requests in a way that not only is consistent with due process requirements but also enables technology providers to act on them quickly. Also, you know, looking at how law enforcement should approach cross-border cooperation, how law enforcement should ensure transparency around the processes that they use to access digital evidence and that kind of thing.
Tommy Ross: I think when it came to the technology providers, we feel like technology providers - you know, they have to meet a lot of obligations to their customers, many of whom are deeply concerned about the privacy of their information. And we think that serving their customer interests really is paramount for them, and that's appropriate. And so doing things like publishing transparency reports and notifying users when their information is requested within the bounds of the law, you know, those things are really important to maintain those relationships with their customers and to maintain that trust. On the other side, though, they can do more to enable law enforcement to act as quickly as they need to act in the case of a lot of investigations, including by doing things like establishing clear points of contact for law enforcement to be able to reach out when they really need to get ahold of people. That's often a big challenge, you know, when you're talking about law enforcement investigations of missing persons cases. There are really horrible statistics out there about how short the window is that law enforcement has to act if they hope to crack a missing persons case after a person is abducted. And so they need to be able to get ahold of someone at a technology provider, in many cases, on a really rapid timeline.
Tommy Ross: And so that point of contact is important. Posting clear information about their policies with regard to what data is available and how to access it, how to request it. And doing things like participating in trainings of law enforcement - that's something that a number of our members have been involved with at various points. It can make a huge difference when law enforcement gets training directly from technology providers about what evidence is available, what information might be available and how to request it and how to do so in line with the policies that the company has put in place to protect the integrity of their relationship with their customers. We have heard anecdotal evidence of that kind of training directly impacting the ability of law enforcement to crack open really significant cases that they've worked on after receiving that training.
Tommy Ross: And I do think it's important to talk a little bit about encryption. And what I would say is that we've talked about there being a stalemate, and that's kind of accurate because in many ways, the broad contours of the policy discussion don't change. But I think it's also important to recognize that encryption, like the rest of the technology landscape, is really dynamic, and it evolves very quickly. And so the way that encryption is being used today is different than the way it was used 10 years ago, but also, the types of encryption that are out there are different and more diverse.
Tommy Ross: So, you know, we're seeing end-to-end encryption used for a lot of communications. There's been work on lightweight encryption for IoT devices, which is different than the types of encryption that you use on more sophisticated devices or on internet communications. There's the encryption for things like smartphones. But we're now looking at things like quantum-proof encryption to get ahead of the challenges that quantum computing will present to keeping information secure. We're looking at a really important role for encryption in 5G networks as they get put in place around the world. Because of the way that data is processed within the 5G environment, but also because of the massive volumes of data that we expect to be processed in 5G environments, ubiquitous encryption is going to be a cornerstone of security in that environment.
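Dave Bittner: For listeners who want a concrete picture of the core property end-to-end encryption provides - only the communicating endpoints hold the key, so an intermediary relaying the ciphertext can't read it - here's a deliberately minimal Python sketch. It uses a one-time-pad XOR as a stand-in cipher purely for illustration; real messaging apps use far more sophisticated protocols (key exchange, ratcheting and so on), and this toy is not how any production system works.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    # With a random key as long as the message (a one-time pad),
    # the ciphertext reveals nothing about the plaintext.
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and recipient share a secret key; the relaying provider never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))  # key must match message length

ciphertext = xor_bytes(message, key)   # all the provider ever handles or stores
recovered = xor_bytes(ciphertext, key) # only a key holder can reverse it

assert recovered == message
```

The point of the sketch is the trust model, not the cipher: whoever sits in the middle holds `ciphertext` but not `key`, which is exactly the position an end-to-end-encrypted service provider is in when served with a data request.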
Tommy Ross: And so I think what I would just say to end is that it's really important that we not just sort of file away this conversation and call it a stalemate and move on, but that we are constantly looking at the shifting technological landscape and understanding all sides of the equation - recognizing that law enforcement has legitimate needs to access digital evidence for criminal investigations, but also understanding that encryption plays a lot of different roles across the security landscape and that those roles are changing. And we need to pay attention to all these things and try to work towards solutions, whether they involve changing how we do encryption or, you know, changing how we do other aspects of criminal investigation relating to digital evidence. We need to constantly be on the lookout for new solutions.
Dave Bittner: All right. Interesting conversation, huh, Ben?
Ben Yelin: Absolutely. Yeah, one of the more interesting interviews I think we've done - not to say all the interviews haven't been interesting.
Dave Bittner: (Laughter).
Ben Yelin: But one thing that stuck out to me is that the encryption wars are pretty ideological, and they play out in the political arena. So you have the Justice Department and politicians who want to give law enforcement these backdoors; you have the tech companies and privacy advocates on the other side. I think what this interview illuminated for me is a lot of law enforcement concerns at the grassroots level are very practical. They just want access to information that's going to help them solve crimes, help them in criminal investigations.
Dave Bittner: Right.
Ben Yelin: They don't really have a huge ideological commitment to getting the government to have this type of access to encrypted communications. And, you know, I think they largely have a healthy respect for both their legal obligations and the privacy rights of consumers. It's just about their practical needs to get data, and sometimes they don't realize - because they're immersed in other law enforcement work - exactly what it will take to obtain that data. And so I think that's what's sort of eye-opening to me. It's - we generally only hear about this problem from the perspective of broader federal policy or high-profile online smackdowns between the president and the CEO of Apple.
Dave Bittner: Right. Right.
Ben Yelin: And, you know, I think your interview here just sort of made it more real and talked about some of the practical concerns facing law enforcement agencies across the country...
Dave Bittner: Yeah.
Ben Yelin: ...Especially those that don't have the type of resources where they can have expertise on the force, you know, who know how to deal with digital data.
Dave Bittner: Yeah.
Ben Yelin: So I thought that was definitely something that was eye-opening and valuable to hear.
Dave Bittner: Yeah. Well, our thanks to Tommy Ross for joining us. And that is our show. We, of course, want to thank all of you for listening.
Dave Bittner: And we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.