Generating more data with value we don't understand.
Andrew Burt: We generate data that has value that we just do not understand. And we are committed to generating more and more of that data.
Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, I have a story about websites tracking your every online move. Ben describes concerns over allegations that Homeland Security was tapping protesters' phone lines in Portland. And later in the show, my conversation with Andrew Burt - he's a visiting fellow at Yale Law's Information Society Project. He's a former FBI policy adviser and chief legal officer at Immuta. We're going to be discussing Google geofencing among other things, so be sure to stick around for that. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right. We've got some interesting stories this week. Ben, why don't you start things off for us?
Ben Yelin: Sure. So I will admit at the outset, the way I was alerted to this story was from the Twitter feed of Edward Snowden.
Ben Yelin: So you can judge me accordingly. But he linked to an article on The Nation website - "Federal Agencies Tapped Protestors' Phones in Portland." We've talked about some of the investigative tools that federal law enforcement has used as it relates to the protests over this past summer in Portland, Ore. I think this article is the first one that really lays out the entire story, including some things we didn't know.
Ben Yelin: One thing that really caught my eye is that the Department of Homeland Security engaged in what's called cellphone cloning. So that involves taking a phone's unique identifiers, copying them to another device and then intercepting the communications received by the original device. (Laughter) It's hard to believe that you can get any more intrusive than that in terms of invading somebody's privacy if you can see what applications they're using, who they're communicating with, et cetera, et cetera.
Dave Bittner: Now, does this require a warrant?
Ben Yelin: It does not require a warrant. I think this is actually going to get to what you discuss in your interview on the show today. But this is such a gray area of the law. Our legal system hasn't figured out how to deal with technology like this. And so even though this is in the realm of getting content - this is not just metadata - to my eye, it doesn't appear that this would require a warrant. It's also a new technology, so I'm not aware of any case law around cellphone cloning.
Ben Yelin: And the other thing is they - you know, (laughter) whether they need a warrant or not, they did it. And you know, I think that's one thing that's somewhat disturbing about this story. As we've talked about, the government generally has more leeway when they are trying to protect us against foreign threats. And so what happened is the Department of Justice, under Attorney General Barr, declared antifa a terrorist organization. Generally, to get that categorization, you have to have some nexus with a foreign terrorist organization or power. And it seems like they found something very tangential - like somebody tweeted something in a different country about antifa - and that justifies this designation.
Ben Yelin: But once this designation has been put on a group, that potentially unlocks a bunch of surveillance tools to be used legally that otherwise would not have been available. And so I think that designation seemed silly to people at the time just because antifa is a group that's largely disorganized. There's no senior leadership. To my mind, it does not seem like it's a centralized organization. It's a bunch of people who are largely anarchists who happen to call themselves antifa. But once that designation was given to them, it unlocked, as I said, a lot of different intelligence-gathering tools. And all of those were being deployed this summer during these protests.
Dave Bittner: How convenient (laughter). Of course I'm laughing in a gallows humor kind of way.
Dave Bittner: So here's what I'm trying to understand. Protesting is legal.
Ben Yelin: Yes.
Dave Bittner: Peaceful protesting is legal.
Ben Yelin: As of this recording, that's true. We'll see...
Dave Bittner: It is enshrined in the Constitution. It is overtly named (laughter) in the Constitution, right?
Ben Yelin: Yes. Yes, it is.
Dave Bittner: OK. So obviously, looting is not. Property damage is not. I'm trying to understand the leap here. And is it merely a leap of convenience, in your view, for law enforcement, for Homeland Security to designate protesters as terrorists to be able to amp up the tools that they're using against them?
Ben Yelin: I mean, it is - it's certainly a legal thing for them to do. Oftentimes, peaceful protests will be hijacked by certain people who are going to commit violent acts. And once those acts are committed, then, you know, you're no longer in territory where somebody is being targeted solely because of their involvement in a First Amendment-protected activity.
Ben Yelin: Now, it's unfair because most of the people who do attend these protests are participating peacefully and are not involved in looting and rioting. And yet, the actions of the looters and the rioters are justifying the federal government coming in and using tools that are going to infringe on these First Amendment-protected activities. But that's not unusual. I mean, it happens all the time. We have racketeering laws where, even if you're loosely affiliated with an organization, you're criminally responsible for everything that happens as a result of that organization.
Ben Yelin: So you know, this certainly is not uncharted territory. And I think it's one of those things that's very dangerous precedent to set because it's so easy to attach these arbitrary designations, like, you know, a terrorist organization, to justify things that not only offend our Fourth Amendment rights but also offend our First Amendment rights. This doesn't require any consent on the part of Congress. This is a decision that can be made by the Department of Justice, by the attorney general, generally not something that can be challenged in court because it's mostly a - what we call a political question, which is something that has to be resolved between the two branches of government. It's just one of those things that, you know, I think should make people rather fearful, if I'm being perfectly candid.
Dave Bittner: Interesting element to this article in The Nation - they describe the way that some agents were asked to participate in this. They speak to a former intelligence officer, who says (reading) they asked for volunteers and nobody wanted to go. The fact that they asked for volunteers shows that it was outside the scope of their duties. You only do that if you don't have the ability to order someone to go, probably because it's illegal.
Ben Yelin: Yeah. That certainly was eye-opening to me. I mean, you generally don't see a call for volunteers for law enforcement operations, especially when there is some sort of counterintelligence aim here, as there seems to be. And we've seen other stories this past summer from other protests where unrelated federal agencies have had agents dispatched to these protests. They've been deputized to carry out the orders of the executive branch. And that's something that's not very healthy to see in a democratic system.
Ben Yelin: You know, we read stories earlier this summer - and this is mentioned in this Nation article - that the Drug Enforcement Administration, the DEA, was used to help hack into protesters' phones. And the DEA, because of their work in prosecuting drug crimes, has that capability. And so there have been allegations that they'd been deputized to deal with that as it relates to these protests.
Ben Yelin: And you know - so once you start to have a whole-government effort where involvement by various agencies is secretive, it's largely classified and then you start to hear anecdotes about conscripting volunteers to join this organized effort, it all seems to be very dark in my view. I mean, this is just not something that's healthy in a small-d democratic system. And it's certainly something that should worry me - or that should worry you, rather.
Dave Bittner: It should worry us.
Ben Yelin: It already worries me, yeah.
Dave Bittner: Right (laughter). The - yes, the Us with a capital U. What are some possible ways that we could have better oversight on something like this, a better control to rein this in?
Ben Yelin: You know, Dave (laughter), it's a tough question to answer. Our mechanisms of democratic accountability are pretty thin right now. I mean, if you're unhappy with how the executive branch is conducting these types of operations, theoretically, Congress is supposed to be a check on that behavior. Congress has various tools at their disposal, and one of them is subpoena power. So you know, you can subpoena the records of federal agencies and get them to appear in front of congressional committees and tell the full story. That's much harder because during the Trump administration, there's been a reluctance on the part of the administration to respond to lawful subpoenas. And they've had success challenging these subpoenas in court, at least in some cases.
Dave Bittner: People just aren't showing up. They're just refusing to show up.
Ben Yelin: Yeah. They just won't show up to these hearings. And...
Dave Bittner: Time was, you'd get dragged in (laughter) - right? I mean, is it - they're legally - I don't want to go down that rat hole. All right. Continue, please (laughter).
Ben Yelin: Yeah. And then - you know, the other tools for Congress - you can hold somebody in inherent contempt, which means you physically restrain them until they comply with a subpoena. That's a tool that at least members of the House of Representatives have not been willing to use. The other tool is impeachment. We've already played that card, and it seems like, you know, there would not be the votes to convict any administrative official, even if there was a clear pattern of lawlessness. And so we're in a very difficult place. There is only so much that the court system can do to protect us from overzealous law enforcement.
Ben Yelin: Oftentimes, if the courts do get involved, it's after the fact. Maybe during the actual criminal prosecution, the defendant could seek to suppress evidence, and then maybe there'd be some accounting of the government's surveillance behavior. But there's very little that can be done in advance of these tools being deployed. And I just think that the Justice Department is getting more and more ambitious in terms of countering these protests.
Ben Yelin: This article mentions that, just this past week, the attorney general urged federal prosecutors to consider charging violent protesters with sedition, which is a very rarely used criminal charge - essentially accusing people of conspiring to overthrow the government. And despite what some of these people might say, they don't appear to me to be a legitimate threat to overthrow our duly elected government.
Dave Bittner: Well, I saw Florida - the governor of Florida this week said that he wants to increase the penalties for crimes committed during protests, which, again, could have a chilling effect.
Ben Yelin: Absolutely, yeah. I believe that those - that new statute in Florida will be challenged in court. There's some things in that statute that certainly raised my eyebrows, including allowing people to act in self-defense if they're feeling threatened by protesters. Obviously, that could have dangerous implications.
Dave Bittner: (Laughter) Yeah, that's come up in Florida before, hasn't it?
Ben Yelin: It sure has, yeah.
Dave Bittner: (Laughter).
Ben Yelin: History has a way of repeating itself. And then, you know, the last thing I'll say is, just this past week, we've had the attorney general designate American cities as anarchist jurisdictions, which unleash - I mean, it's a designation that's pretty much never been used. And it unleashes a torrent of federal powers. This is being challenged by those jurisdictions themselves, including New York City. You know, the fact that New York City was declared an anarchist jurisdiction by our federal government this week doesn't seem to have pervaded our news environment too much. And you know, I've seen a lot of people posting joke pictures on social media about anarchist New York, you know, where you see two people drinking a cup of coffee in the park or something.
Dave Bittner: Right, right (laughter).
Ben Yelin: But you know, this is a level of escalation on the part of the Department of Justice, using these tools, using these rarely invoked authorities, that I think require us all to be extremely vigilant, if I'm being frank.
Dave Bittner: All right. Well, you know, we often say this is something worth keeping an eye on here. I would say in this case even more so - right? - as you say, vigilance.
Ben Yelin: Yeah, absolutely. Stay informed, you know, especially if you are intending to participate in these protests. You know, make sure that you are using all of the privacy tools you can to encrypt your own communications. But also, you know, from a political perspective, if this is something that bothers you, certainly use your tools in our democratic system to seek change.
Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes, as always. My story this week comes from an organization called The Markup, which is a nonprofit news organization. And it's titled "The High Privacy Cost of a 'Free' Website," written by Aaron Sankin and Surya Mattu. It builds off of a tool that this organization has built called Blacklight, and it's a fascinating tool. If you enter a website into Blacklight, it goes through and scans the site, and it'll give you a list of all of the trackers that are active on that website. And what's remarkable about it is how surprising the results are - just try it out. I don't know if you've tried it out or not, Ben, but try it out. And you'll see most websites have more trackers than you probably thought they did (laughter).
Ben Yelin: Absolutely. I mean, you should read the statistics in this article - and we'll post this in the show notes, as well. I think you would be surprised at how much tracking is going on even at websites where they allow you to accept cookies and that sort of thing. They're still doing things like session recording and fingerprinting users. It was certainly eye-opening to me.
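(For the curious, here's a toy version of what a scanner in the spirit of Blacklight does at the simplest level: listing the third-party hosts a page loads scripts from. This is only an illustrative sketch - a real scanner like Blacklight runs the page in a headless browser and watches live network traffic, while this regex approach only catches statically declared script tags. The page snippet and hostnames are hypothetical examples, not findings from the article.)

```typescript
// Toy tracker census: list third-party hosts referenced by <script src="..."> tags.
// A real scanner (like Blacklight) executes the page and observes network
// requests; this sketch only sees scripts declared statically in the HTML.
function thirdPartyScriptHosts(html: string, firstPartyHost: string): string[] {
  const hosts = new Set<string>();
  const scriptSrc = /<script[^>]*\bsrc\s*=\s*["']([^"']+)["']/gi;
  let m: RegExpExecArray | null;
  while ((m = scriptSrc.exec(html)) !== null) {
    try {
      // Resolve relative URLs against the first-party origin.
      const host = new URL(m[1], `https://${firstPartyHost}/`).hostname;
      if (host !== firstPartyHost) hosts.add(host); // keep only third parties
    } catch {
      // Ignore malformed URLs.
    }
  }
  return [...hosts].sort();
}

// Hypothetical page embedding a first-party script plus two third-party ones.
const page = `
  <script src="/app.js"></script>
  <script src="https://connect.facebook.net/en_US/fbevents.js"></script>
  <script src="https://comments.tracker.example/embed.js"></script>`;
console.log(thirdPartyScriptHosts(page, "example.org").join(", "));
// → comments.tracker.example, connect.facebook.net
```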
Dave Bittner: Yeah. And one thing that the article points out is that a lot of these trackers make their way onto websites through the website's use of free tools. So for example, if you are using the free version of Disqus, which is a tool that allows you to have comments on your website - if you're using free tools from Facebook or from Twitter or even Google, with those tools comes the ability for advertisers, what they refer to as piggybacking trackers, to be installed on your website. And they can be very intrusive in terms of the things that they track and how they track you around the web. Some of the chilling things that they found in their research for this article - SunTrust Bank was sending usernames and passwords to a third party.
Ben Yelin: That's not good.
Dave Bittner: Not good, not good.
Ben Yelin: Yeah.
Dave Bittner: The third party said that they encrypt and discard the data. But after the folks here at The Markup contacted SunTrust, that functionality went away (laughter), as it so often does. They scanned hundreds of sites. For example, 80 U.S. abortion providers loaded third-party trackers, some of them sending data to Facebook that ended up in user profiles.
Ben Yelin: Yeah. I mean, this lists some of the most personal websites somebody would visit - a hundred websites serving undocumented immigrants, domestic and sexual abuse survivors, sex workers, LGBTQ individuals, government websites, health websites like WebMD, Mayo Clinic. I mean, these are sites where you have an expectation that your information will remain private.
Ben Yelin: What's striking to me is that this is all generally a matter of convenience. You know, obviously it's nice to have Disqus commenting features on a website - it does add a level of convenience. Integration with things like Facebook and Twitter is very helpful. Being able to log into, you know, your GrubHub app via your Facebook profile, where they say they're only, you know, collecting your username and a very limited amount of information - it really does aid in your convenience. It saves you time. But as a result of this convenience, you're getting these piggybacking trackers that are pretty invasive. So it's not just Facebook that you're granting information to when you allow Facebook to access your profile so that you can order your dinner. You're giving access to several, if not hundreds, of piggybacking trackers that come with the Disquses and the Facebooks and the Twitters of the world.
Dave Bittner: Yeah. One of the ones that caught my eye was from the Mayo Clinic. They use keylogging to capture information, and they can send that off to marketers for tracking. What's interesting about keylogging is that, when you're filling out a form on a website, I think most people probably think that you fill in the form - let's say you type in your name, right? So you type in, you know, Ben Yelin, and then you hit the return key, and that information gets sent off to the website. But with a keylogger, they're logging every key you type as you type it. So as you're typing in your name, that information is being sent off. And you can imagine someone perhaps being timid about some sort of medical condition they have - right? - and they're wondering, should I search for this? And they type it in, and maybe they think twice about it and erase it. Well, too late.
Ben Yelin: You can't change your mind. Yeah, you can't change your mind when it comes to this type of tracking because it's logged - it's keylogged before you press submit.
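(As an aside for the technically inclined: the mechanism Dave describes can be sketched in a few lines. This is a hypothetical simulation, not code from any real tracker - it just models the key point that each keystroke is forwarded as it happens, so text the user types and then erases has already left the browser before any submit. In a real page, `onKey` would be wired to an input's keydown event listener.)

```typescript
// Simulated session-recording tracker: every key event is forwarded
// immediately, not on form submission, so even deleted text is captured.
type KeyEvent = { key: string };

class KeystrokeTracker {
  // Everything the tracker has "sent" to its collection endpoint (simulated
  // here as an array; a real tracker would POST each event over the network).
  sent: string[] = [];

  // In a real page: input.addEventListener("keydown", ev => tracker.onKey(ev))
  onKey(ev: KeyEvent): void {
    this.sent.push(ev.key); // forwarded as it happens, before any submit
  }
}

// Simulate a user typing a sensitive query, then thinking twice and erasing
// part of it before ever pressing submit.
const tracker = new KeystrokeTracker();
for (const key of [..."migraine", "Backspace", "Backspace"]) {
  tracker.onKey({ key });
}

// The visible form field now reads "migrai", but the tracker already holds
// all ten events, including the two erasures.
console.log(tracker.sent.join(" "));
```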
Dave Bittner: Right. Right. And I suspect that most people are unaware of that.
Ben Yelin: I think they are. And it's not just users who are unaware of it. What this article says is, while some sites knew about these trackers, many others said that they were unaware of the pervasiveness of these trackers or what happens with the data once it's collected. And a bunch of these organizations didn't respond to calls for comment, meaning, you know (laughter), they probably don't have answers to these questions as well.
Dave Bittner: Not good answers, anyway (laughter).
Ben Yelin: Right, right. They don't have answers that will satisfy the individuals working for Blacklight or The Markup. And that's probably why they didn't respond. So yeah, I mean, this is - we're talking about potentially personally identifiable information or sensitive medical information that's captured as a matter of course and sent to third-party tracking services without people being aware that it's happening. And it's more than just not being aware. You think that you've protected against this by disallowing cookies or taking other protective measures. But this type of collection still happens. And you know, I think that's something that's going to rub people the wrong way.
Dave Bittner: Yeah. Well, this is an extensive article. It goes into many of the different ways that these tracking technologies work. I highly recommend it - it's worth a read. But then also, if there is a site that you're considering visiting and you're concerned about sensitive information being shared, you know, maybe a thing to do is to run it through this Blacklight scanner that the folks at The Markup have built to see exactly what's going on behind the scenes, to give you a better window into what you may be inadvertently sharing.
Ben Yelin: Yeah. I mean, a lot of this is on the users themselves, which is why I think this Blacklight tool is particularly important. The laws in this area really will not protect you much. The only federal law - and this is mentioned in the article - requiring websites in the U.S. to disclose user tracking applies to websites primarily serving children. So when my kids watch their Cocomelon and Little Baby Bum YouTube videos, maybe it applies then.
Dave Bittner: (Laughter) Right.
Ben Yelin: Those parents out there will know what I'm talking about.
Dave Bittner: Yeah, OK (laughter).
Ben Yelin: The Federal Trade Commission can also go after companies for deceptive practices - you know, for companies that claim that they're not doing any tracking but they actually are. But otherwise, the law provides you very little protection. So this is one of those instances where it's incumbent upon all of us to use the tools that are available as a prophylactic for this problem. And you know, sometimes we're going to get those scenarios where the government or our legal system is not going to protect us and we're going to have to protect ourselves.
Dave Bittner: Yeah. All right. Well, again, we'll have links to that article in the show notes. Highly recommended - check it out. Really an eye-opener there. We would love to hear from you. If you have questions for us, you can call in. Our number is 410-618-3720. You can also send us email. It's email@example.com.
Dave Bittner: Ben, I recently had the pleasure of speaking with Andrew Burt. He's a visiting fellow at Yale Law's Information Society Project. He's also a former FBI policy adviser. And he is the Chief Legal Officer at Immuta. And among other things, we're discussing Google geofencing. Here's my conversation with Andrew Burt.
Andrew Burt: Geofencing, kind of writ large, is this practice of the government believes a crime occurred. And they'll go to very large technology companies, like a Google, and they'll say, we want to know information about all of the different devices and all the different people that might have been within this location - so, like, think about it as a geographical fence, hence the name geofencing - anyone who's been in that location for this specific window of time. And so the government will go directly to a company like Google, and then they'll get information from Google that might then lead to an arrest or further kind of criminal investigation.
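(Conceptually, the provider's side of such a request is just a filter over stored location records: keep every device whose coordinates fall inside the fence and whose timestamp falls inside the window. The sketch below is a hypothetical illustration - the field names and the haversine-distance helper are assumptions of ours, not Google's actual schema or process.)

```typescript
// A location "ping": one device observed at one place at one time.
type Ping = { deviceId: string; lat: number; lon: number; ts: number };

// Great-circle distance in meters between two lat/lon points (haversine formula).
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius, meters
  const rad = (d: number) => (d * Math.PI) / 180;
  const dLat = rad(lat2 - lat1);
  const dLon = rad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(lat1)) * Math.cos(rad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// The "geofence" query: every device seen within radiusM of a point
// during the time window [t0, t1].
function geofenceQuery(
  pings: Ping[],
  lat: number, lon: number, radiusM: number,
  t0: number, t1: number
): Set<string> {
  const hits = new Set<string>();
  for (const p of pings) {
    if (p.ts >= t0 && p.ts <= t1 &&
        distanceMeters(p.lat, p.lon, lat, lon) <= radiusM) {
      hits.add(p.deviceId);
    }
  }
  return hits;
}
```

Note that nothing in the query names a suspect: it sweeps in everyone who happened to be nearby during the window, which is precisely the breadth that makes the practice contentious.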
Andrew Burt: And the reason why this is so kind of controversial is that our current legal system and our current laws are really not designed for these types of technology companies to be the arbiter of what types of data the government gets to use for criminal investigations. And so there are really serious privacy concerns. It just is kind of like - it's a new framework, frankly, when the government can go directly to a company rather than needing to go before a judge to determine probable cause to try to get some of this information.
Dave Bittner: And what is the obligation that a company like Google has to provide this information?
Andrew Burt: Right now - and this is - I mean, frankly, it's partially what makes this so interesting and I think why a lot of legal scholars are spending a lot of time thinking through these issues. But it's also why it's so hard because it's really discretionary. There are a bunch of different laws and regulations that govern, you know, what Google might be able to do and how they can give this data to the government.
Andrew Burt: It is true that the way that Google decides what data the government gets to have is just not the same as if the government were going, you know, directly to a judge in the judicial branch and saying, you know, we want to take out a warrant. We want to look at, let's say - you know, the equivalent would be, say, like, the physical files in someone's office or in their cabinet. Just for a host of reasons, many of them kind of historical - and frankly, and I'm happy to talk about this - I think there's been just kind of a large-scale failure in the policy community and the legal community to conceive of data the right way. But because of that, our expectations and the types of data we generate and the value of that data just really don't correspond to the types of protections that attach to that data when companies like Google collect it and house it.
Dave Bittner: Let's dig into that some. What are some of your concerns there?
Andrew Burt: I have many. I mean, how long do you have? I have many concerns.
Dave Bittner: (Laughter).
Andrew Burt: I'll just - I'll kind of list some. And I think this was 2011 or 2012. There was a very famous - and this doesn't have anything to do with criminal investigations, at least on its face. But there was a famous case where Target began to target one of its customers, a teenage girl, with pregnancy products before her family even knew she was pregnant. And so this was basically almost a decade ago. And really what it stands for is this proposition - and this was based on her shopping patterns. So I don't know exactly what it was that she bought that alerted Target to the fact that she was likely pregnant. But she went, she bought, you know, X, Y and Z. Target said that means that she is pregnant with this probability and began to send her things to her actual house. And her family - apparently, that's how they learned that she was pregnant.
Andrew Burt: And so that story, which is a decade old - and since then, there have been many more advances in AI machine learning. The iPhone was only 5 years old at that time. We generate a huge amount more data now. And so that little kind of anecdote, I think, just stands for, in my mind and in others, this proposition that we generate data that has value that we just do not understand. And we are committed to generating more and more of that data. In fact, I wrote a white paper that, if folks are interested in, it was for the Hoover Institution at Stanford with Dan Geer, who I - frankly, I think is the smartest person alive on cybersecurity. So I recommend to anyone who can to spend time reading some of what he said.
Andrew Burt: So anyway, we wrote this paper together, and we called it "Flat Light". And flat light is the name for - it's kind of like white-out. It's a name pilots give for when a pilot has lost basically all orientation for - you know, are they going up? Are they going down? What direction are they moving? And our real thesis is that, in the world of data, we live in a state of flat light. We just - we don't know how to orient ourselves because it's hard to understand what we're generating. It's impossible, frankly, for us to understand what the value is.
Andrew Burt: And so I'll just list kind of some of the tensions that this new state of data and machine learning has kind of placed us in. So I think one of the things that our current legal system is struggling with is, what makes something invasive? There's a famous Supreme Court case called Carpenter about whether or not the police could track someone they're investigating through their historical cellphone location records. What came out of the Supreme Court there was that - and lawyers, forgive me for kind of summing this up too coarsely - but the longer you are tracking someone, the more invasive it is. And so the Supreme Court basically looked at time as a key measure of kind of how invasive, you know, whatever the method is. I think it's really interesting, but I don't think that's entirely right because as we generate more data and as, you know, machine learning techniques become more powerful, time is less important. And frankly, we can have really small bits of data that generate kind of insights about us that we don't know.
Andrew Burt: There's also this distinction between public and private space, which is kind of, like, the core of the Fourth Amendment. It's the core of most privacy frameworks. And that is quickly kind of going out the window. What counts as private - if I'm in my home on my phone doing something, rendering one webpage, which might load and take content from, like, a dozen different third-party servers outside my house?
Andrew Burt: So anyway, I can go on. But there are all these basic conceptions that make sense for a physical world, you know, for frankly most of our existence up until the end of the 20th century. But they start to break down when we live all of our lives and more and more of our lives, especially in the pandemic, online.
Dave Bittner: You know, one thing that my co-host, Ben, and I were discussing when it comes to some of these geofencing cases was the difference between, for example, law enforcement going to Google and saying, hey, we want, you know, all the information for everyone that was in this area between this time and this time - just give it all to us, and then we're going to sort through that and see what's interesting - versus, let's say, law enforcement having done a bunch of work ahead of time and then going to Google and saying, hey, we are interested in this one person. We want to know, did this single person enter this area during this time? And it seems to me like there's a fundamental difference there. First of all, I mean, do you agree that that's a difference that's worth discussing?
Andrew Burt: Yeah, I mean, absolutely. There's a really, I think, landmark case kind of working through the Eastern District of Virginia that really focuses on exactly what you said, which is basically, what does the actual interaction need to look like when the government goes to Google and says, we want this data? Or frequently, it's like, we believe a crime occurred during this period. What data can you give us that's helpful? And then there's this kind of negotiation.
Andrew Burt: And I think what is unsettling right now, frankly for both people in the privacy community, you know, folks in the law enforcement community, some folks like myself who would say that, you know, they straddle both - what's unsettling is a lot of that back-and-forth is just, frankly, up to Google. Like, it's kind of - they are hashing it out. They are figuring out what the standards are. And I think that needs to change. But I think certainly - I think that kind of the future that I think we need to get towards is where there's kind of, like, different phases of kind of particularity. And of course, evidentiary requirements for each would get stricter and stricter and stricter until, you know, the government's actually asking for information about one particular individual.
Andrew Burt: I'd also just make one broader point, which is that when we're looking at the literature and we're reading headlines about this and we're thinking, you know, like, how does this impact me, and kind of even outside of the criminal context, like, why is this such a big deal? I do think it's important just to highlight - this is the exact same debate we're having with COVID and contact tracing. And you know, sitting in my shoes, where I'm focusing on, you know, the risks of using data at scale for value in a security context and a privacy context, you know, what have you, it's really, really, really similar. Like, it's the same issue.
Andrew Burt: And so when we're thinking about COVID and we're thinking about, how can kind of we collectively, which means the government - how can we collectively get the data we need to keep us all safe? How can we collectively, like, have a good picture of what's going on, versus, how on an individual level can we protect our privacy? How can we protect our individual data? There's this kind of big give-and-take. And so that give-and-take that I think we're seeing in the criminal context is very similar to a lot of the debates about contact tracing.
Andrew Burt: And at the end of the day, it's really, you know, a body acting on behalf of the collective, which is the government, trying to figure out, did something bad occur here that could pose a threat? And so it's just these issues - because we spend so much of our time in the digital world, they bleed into each other. And so I would just make the point, like, even though it seems like this is only about criminal investigations, it's really about something broader.
Dave Bittner: Now, is it the case that law enforcement will do some - I guess you could call it shopping around? You know, in other words, if Google said, hey, you know, we're not comfortable giving you this information, could they then go to Verizon or AT&T or - you know, see who's most willing to give them the most information from what they're looking for?
Andrew Burt: I guess I would have two responses to that. The first is, I don't know what would stop them from doing that. I can't think of a reason why they wouldn't be able to do that. And I think that that is really the core concern. And if you look at how Google is handling a lot of these requests, you know, there's always, like, a creepy factor. And just the amount of information that Google has on any of us is just wildly creepy and, in some cases, upsetting. And so that - you kind of always have to, like, separate, like, how much of this is just - it's so creepy how much information we generate and how much, you know, certain companies collect.
Andrew Burt: But I think kind of outside of that, if you look at how they've been handling a lot of these requests, I think they're doing a very good job. It's not as clear as I think it needs to be. I think lawmakers need to step in to make the standards clearer. So I think Google is doing a good job. But the biggest concern is that other smaller companies with fewer resources and less kind of, you know, legal expertise and skills are probably not going to hold up to the same standard unless the standards become clearer. And just - if you look at the trends, as we generate more data, as we interact with more and more companies who are collecting our data and using our data, I think that type of kind of shopping around that you just described is just going to be more and more likely. And again, if nothing is stopping, you know, our investigative agencies from doing it, it's what they're going to do. Again, like, I think we are in this era where we're just lacking a lot of structure, and we're lacking a lot of understanding. And I don't think any of these organizations or actors are acting with ill intent. But I think we're just kind of - we're waiting, and we're really desperately needing some more clarity.
Dave Bittner: What would you like to see? What sort of standards would put you at ease, would make you comfortable?
Andrew Burt: You know, that's a really good question. And I'm not sure I have a great answer because I'd like to give you as detailed an answer as possible. I mean, I think in general, what I would like to see is some type of legislative requirement that spells out directly kind of the standards that the government needs to meet that are reviewed by the judiciary, by the actual branch that's supposed to be a third party, thinking about individual rights rather than trying to kind of expedite investigations. And I would like them to be involved in some way. So I guess the big, high-level answer is, rather than this being a conversation between, you know, law enforcement and technology companies, I would like the judiciary to be more involved. And it doesn't mean that they need to be, you know, slowing things down because I think the priority also needs to be on getting law enforcement the information it needs as quickly as possible to solve these crimes. But I think the idea that our judiciary is kind of passive as a lot of these requests are happening - I think that is a future we definitely want to ensure kind of doesn't arise.
Dave Bittner: Yeah. I suppose if we were able to lay out what the standard is that you - what is expected when you make your case in front of a judge, if it is seeking a warrant or, you know, something like that - that at least we know what the rules are. We know what kind of playing field we're on.
Andrew Burt: Right. Exactly. And honestly - and this is - this will benefit, I think, law enforcement, as well because I think the question also needs to be, like, when can Google or whatever tech company - when can they say no? When do they say, no, you can't see this? So there's some clarity. And when do they have to say, yes, you can see this? And what exactly is the type of data that they can show?
Andrew Burt: I would also just plug - you know, I'm closely involved in the use of privacy-enhancing technologies. These are things like differential privacy or federated learning. There are a whole host of different kind of technological solutions that can both kind of preserve utility of data on the one hand and also protect privacy and security on the other. And it's always a tradeoff. So there's no silver bullet. But I think there is room for some of these types of solutions where we can have something like the judiciary set, you know - here is the standard for what the data needs to look like. Here's the standard level of anonymization that that data needs to kind of be held to. And then I think there's a future where, like, for the initial query, if that standard is met, law enforcement agencies could query that data directly but while still protecting kind of a large amount of the privacy.
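To make the differential privacy idea Burt mentions concrete, here is a minimal sketch of how a law enforcement query against location data could return a useful aggregate answer without exposing any one person. All names and data in this example are illustrative assumptions, not from any specific system or library; it shows only the standard Laplace-noise mechanism for a count query.

```python
import math
import random

def dp_count(records, predicate, epsilon=1.0):
    """Differentially private count of records matching `predicate`.

    A count query has sensitivity 1 (adding or removing one person changes
    the true answer by at most 1), so adding Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy. Smaller epsilon means
    more noise and stronger privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Inverse-transform sample from a Laplace(0, 1/epsilon) distribution.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical query: roughly how many devices were near a scene,
# answered without revealing whether any particular device was there.
pings = [{"device": i, "near_scene": i % 7 == 0} for i in range(1000)]
noisy = dp_count(pings, lambda p: p["near_scene"], epsilon=0.5)
```

The tradeoff Burt describes is visible in the `epsilon` parameter: a court or legislature could fix the privacy budget, and the initial query would return only a noisy aggregate, with stricter evidentiary showings required before any individual-level data is unlocked.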
Andrew Burt: So I know, I mean, that's a lot to say, and there are a million little details that need to be filled in. But I think the major point is that, I think if we're smart about how this works and we think about existing solutions and existing technology, like, there is a way that we can increase access to data while also ensuring it's protected so that this is not just kind of a binary conversation. Like, either law enforcement gets everything and they get it immediately, or they don't get anything, or they get a very little bit of data, and it takes a really long time.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: I loved this guest.
Dave Bittner: (Laughter).
Ben Yelin: We have a lot in common in terms of our perspective and our areas of interest. So it was of particular interest to me. And geofencing is such a novel technological issue and legal issue that it's very interesting to hear somebody else's perspective. I think one thing he highlighted that we've talked about before is the tech companies themselves are unfamiliar with the legal territory here. He had that great metaphor about - I forget what the term is. Was it whitewashed? Where - remind me if I'm quoting this correctly. But you're flying, and you can't see where you're going. You have no sense of direction.
Dave Bittner: Right. Right. Like a white out, yeah.
Ben Yelin: A white out. That's what it was.
Dave Bittner: Yeah. Yeah.
Ben Yelin: So, I mean, I think our legal landscape has a white out when it comes to this type of new technology. And so the responsibility ends up falling on companies like Google, where they have to determine for themselves what information they're willing to give to law enforcement. And that's a lot of power that we're giving to Google. So, you know, I think it's just a very pervasive problem that he points out, and it's something that's so common with so much of what we've talked about on this podcast.
Dave Bittner: Yeah. Well, again, our thanks to Andrew Burt for joining us. We appreciate him taking the time and sharing his perspective.
Dave Bittner: And that is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.