Caveat 8.5.20
Ep 40 | 8.5.20
The evolution of the internet.
Transcript

Max Kirby: We're really at the point now where the web is mandatory to survival. You need an email address and a credit card to participate with the economy now. We're never going back to a world where that's not true.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin. He's from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, I've got the story of the Google geofencing location requests that have been coming under scrutiny, Ben describes how federal officers are using social media streams to track down protesters and, later in the show, my conversation with Max Kirby. He's the leader of cloud solutions at Publicis Groupe. We're going to be talking about the evolution of the internet. How did we get to this point? 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some good stories this week. Why don't you start things off for us here? 

Ben Yelin: Sure. So my story comes from Recode. It's another story emanating out of these protests in Portland, which I know we discussed on last week's episode. And this is about how the federal government has used a YouTube livestream in an affidavit in support of the arrest and prosecution of one of the protesters. So there are a lot of different YouTube livestreams of the protests in Portland. There's obviously a lot of public interest in it. And there's one site that aggregates a bunch of different YouTube livestreams. It's branded Live from the End of the World, which I guess is only appropriate. 

Dave Bittner: (Laughter). 

Ben Yelin: And it's also co-hosted on Twitch. This is something where, you know, if a citizen is starting a livestream, that livestream might be picked up by this aggregator. 

Dave Bittner: I see. 

Ben Yelin: And what happened here is the livestream caught three individuals committing arson. They caught sort of the person who wasn't committing the most egregious part of the crime. One person was actually setting the fire. Another person was kind of in the background, removing bricks and stones and kind of aiding and abetting the crime. Unfortunately for that individual, his was the face that showed up on this YouTube livestream. The other person, luckily for him, was able to conceal himself. So they did some investigative work, figured out who this person was. He has been arrested, and he is going to face prosecution. 

Ben Yelin: I think this is another example of how protesters, especially in these high-profile political powder kegs that we have right now, particularly in Portland, need to be very conscious about the fact that they are being broadcast, no matter where they are. There are too many potential surveillance methods out there; when you are taking an action that could expose you to criminal liability, you should expect that you are being surveilled. What we've been hearing from the federal government, from both President Trump and from the Department of Homeland Security, based on their words and actions, is that they are going to take every action necessary to quell these protests. I'll note that those actions so far have not been successful. But, you know, this is just part of that effort. 

Ben Yelin: So, you know, I just think this is another lesson for people who are exercising their First Amendment rights to protest that you are being watched. Whether it's a YouTube livestream, whether it's security cameras, you know, whether it's cameras embedded inside Lyfts or Ubers that are surveilling what's going on on the streets outside of them, somebody is watching. Somebody is patrolling the city. And, you know, I just think that's something that people really do need to be aware of. 

Dave Bittner: Well, you know, something that strikes me about this is that a lot of these livestreams are coming from the protesters themselves. You know, they're trying to hold these federal officers, whatever they are - it's hard to know what they are because they're... 

Ben Yelin: We really don't know, yeah. 

Dave Bittner: (Laughter) Yeah. These federal enforcers - whatever you want to call them - they're trying to hold them accountable. One of the ways they're doing that is through the livestreaming. So it kind of strikes me that if some of these streams are coming from the protesters themselves, and they're posting them publicly, I suppose it's not out of bounds for law enforcement to use them as a source. If someone is behaving illegally and setting fires - it would certainly be out of bounds if we were talking about peaceful protesters - is it unreasonable for law enforcement to use that to try to track someone down? 

Ben Yelin: It's not. And, actually, Vox Recode interviewed the individual who's been responsible for aggregating these videos, and he said, you know, there's no legal authority that can stop law enforcement from using the footage, whether it's taken by members of the media or other protesters. These are people in public view. They've lost their reasonable expectation of privacy by being out in public, especially during a high-profile protest such as this. 

Ben Yelin: So this aggregator said, you know, we denounce the practice by law enforcement. We want them to develop standards so that they're not exploiting people who are trying to legitimately document what's going on at these protests. But, you know, he really does not have any legal recourse. 

Ben Yelin: Now, individual users who are taking these videos, they can request to have their videos taken down by this aggregation service. But, oftentimes, you know, an individual who is, you know, streaming a protest is not going to know that it's their feed that's the one that's going to identify criminal suspects. 

Dave Bittner: Right. Well, let's unpack - I mean, just to be super clear here, let's unpack the legal aspects of this. I mean, is it right to say that, by virtue of streaming this live, I'm putting that out in the public forum, out in public, and so there is no expectation of privacy? This is a - it's a public place being streamed live to the world, so law enforcement wouldn't need any kind of warrant to view this or use this or request this because it's basically that - the point in livestreaming is to share it as widely as possible. 

Ben Yelin: Absolutely. I mean, the minute you basically get yourself in a place that's viewable by the public, whether that's law enforcement or any other member of the public, you have forfeited your reasonable expectation of privacy. And the Fourth Amendment only applies when there has been a search. According to our precedents, there is only a constitutional search if somebody's physical property has been intruded on or if there has been a violation of that person's reasonable expectation of privacy. And since neither of those standards apply in these circumstances, the government does not need a warrant to obtain that information. 

Ben Yelin: Now, obviously, this is the legal doctrine. We have to live in the world that we've set up for ourselves. You know, whether that's the correct doctrine is certainly up for interpretation. I mean, are people really forfeiting their expectation of privacy simply by stepping onto a public street? Possibly. But as we've talked about a million different times on this podcast, when this doctrine was developed, we didn't have the type of pervasive surveillance state we have now. You know, maybe at a large protest, especially when you're burning stuff, you might be expected to be videotaped. 

Ben Yelin: But in other circumstances, when you're just walking down the street, taking a stroll and, you know, somebody else is doing some birdwatching and catches you on their YouTube livestream committing a burglary or something, it seems at least questionable that you do not have a reasonable expectation of privacy. You know, that's not something that courts have agreed with, but I just think it's something that we're going to have to contend with, especially as surveillance becomes this pervasive. 

Dave Bittner: Yeah. It's interesting, the panopticon that we've built for ourselves here. 

Ben Yelin: Yeah. I mean, the other thing is there is obviously an interest among the protesters that the media document what's going on, and the primary reason for that is that they want to hold law enforcement accountable for law enforcement's own actions. You know, the reason there's been public outrage about people getting arrested in the streets by unnamed federal agents and being put into unmarked cars is because that has been documented by both the media and by individual protesters. 

Dave Bittner: Right. 

Ben Yelin: So there's obviously a huge benefit in terms of police accountability of getting things on camera. 

Dave Bittner: Yeah. I think to myself, you know, how many times, as a person of extreme privilege just by nature of who I am, what I am, where I was raised, where I live and all the advantages I've experienced throughout my life, I can totally imagine myself saying, well, that could never happen. You know, hearing a story of something - well, that - there must be an explanation behind that. But then I see the video, and I'm like, yeah, well, OK, that happened. Time to recalibrate my own beliefs, my own (laughter) - you know? 

Dave Bittner: Like, so to me, it's been - it's really been a lesson for myself in my own expectations and my own biases from the life I've lived, and video has made that possible, you know? This - I feel like I've bettered myself to have better empathy and understanding for people who've been saying for years, no, really, these things happen, and I've thought, well, that can't happen. Well, no, they happen (laughter). You know, they're really happening. 

Ben Yelin: Right. Exactly. And so I don't think any of us would want to discourage people from trying to record what happens. First of all, obviously, it's good for historical purposes. We want to document what's gone on in these protests. It is a political movement. 

Dave Bittner: Right. 

Ben Yelin: It is, frankly, a wrought political moment. I mean, I would like there to be documentation of the time that the president, despite the contrary wishes of local officials, sent in federal agents to patrol a city using a variety of surveillance and law enforcement techniques. I want that to be documented so that I can... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...Teach classes about it and warn my grandchildren about it. But there is another side of this equation. It doesn't seem that morally objectionable because, in this particular circumstance, people were actually committing crimes; they weren't just protesting. 

Dave Bittner: Right. 

Ben Yelin: But, you know, you had an example of a story I saw yesterday of an individual who was placed into an unmarked car by nonuniformed police officers in New York City, and the New York Police Department released a statement saying that this was a suspect wanted for defacing police video cameras. That to me does not seem like a justification for executing an arrest warrant on a street from an unmarked van with a squad of, you know, six or seven plainclothes police officers. 

Dave Bittner: Right. 

Ben Yelin: But you're right. Now we know this is going on, and we have all the information we need to try and effect change and to protect people's constitutional rights and, particularly, their right to protest peacefully. 

Dave Bittner: Yeah. All right. Well, boy, what a thing to be in the midst of. What an interesting time we are in right now - to be tracking all this stuff in real time. It's that old curse - right? - may you live in interesting times. 

Ben Yelin: Yes, that is a curse. You know, when we decided to do a podcast on legal issues related to cybersecurity and privacy, we probably had no idea that so much would happen in the world of law enforcement surveillance during this time of great unrest. So... 

Dave Bittner: Right, yeah. 

Ben Yelin: I can't say it's only happened because our podcast started. But... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Let's just say I'm not ruling it out. 

Dave Bittner: OK. Oh, all right, Ben. You go ahead and believe that (laughter). 

Ben Yelin: Sure. 

Dave Bittner: Well, let's move on to my story this week. This is from The Wall Street Journal, written by David Uberti. And the title is "Police Requests for Google Users' Location Histories Face New Scrutiny." You and I have spoken in the past about police using these geofencing warrants where they will go to an organization like Google - and it seems like Google is the No. 1 source for these... 

Ben Yelin: Yeah. 

Dave Bittner: ...Sorts of things because of the amount of data that they collect and also the, I suppose, the accuracy of the data that they collect in terms of location data. 

Ben Yelin: Right. 

Dave Bittner: So they will go to Google, and they'll say, listen; a crime was committed in this area. Let's say somebody robbed a convenience store. We would like to have the information on every device that came by this location within a two-hour period on this date. And then Google will provide them with that information. Now, before we dig into some of the details of the pushback here, there's some really interesting details here from Google about how they go about responding to those requests. 

Ben Yelin: Yeah. I found that fascinating. 

Dave Bittner: Yeah. I mean, to their credit, Google is taking this very seriously and not just turning over all the information willy-nilly without any scrutiny on their own part. It says they initially provide anonymized information to authorities, and then police can come back and say which particular devices are of interest to them, and then Google may turn over the information on specific users. So in other words, Google isn't just handing over all the information about everyone who passed through an area. They're going back to the police and saying, OK, here's some information about the folks who were here, and now you need to tell us who you're interested in, and then we may provide you with that information. 

Ben Yelin: Right. 

Dave Bittner: That's more scrutiny than I think I had imagined in my mind, you know, when thinking about how these sorts of things played out. 

Ben Yelin: Yeah. So in one sense, that was very encouraging. Google does seem to have a process where they're committed to protecting user privacy. My other thought when reading that was, why is Google the one making this decision? Who died and made them king? I mean, shouldn't this be something that's done through the democratic process, where we have laws and regulations on geofencing? And we don't have that. So in the absence of that - and, you know, there has been some proposed legislation in New York state, for example, that would prohibit this type of geofencing and geofencing warrants. 

Dave Bittner: Right. 

Ben Yelin: But in the absence of that, it is up to these companies, our Google overlords, to protect our privacy for us. And I'm not sure that that's what we want. 

Dave Bittner: (Laughter) Well, I mean, the other part of this story that's relevant is, as you say, some New York state senators have put forward a bill that would outlaw warrants and informal requests for these sorts of data. And the state senator, Luis Sepulveda, I believe his name is - he's a Democrat. He chairs the state Senate's Crime Victims, Crime and Correction Committee. And he is concerned that geofence warrants would be used disproportionately in communities of color the same way that some other police tactics, like stop-and-frisk, have been used. So again, you know, some of these surveillance things, like we've talked about with facial recognition, can skew towards people of color not in a good way. 

Ben Yelin: Yeah, absolutely. And you can see why the same type of rationale that's used for stop-and-frisk could be used to justify some sort of geofence warrant. This is a dangerous neighborhood historically. Obviously, that contains some not-so-subtle racial undertones. And if that's the motivation for doing geofencing, then that absolutely could reinforce some of the problems we've seen in law enforcement. 

Dave Bittner: Yeah. 

Ben Yelin: And then I know we'll get to this, but this is really a fascinating Fourth Amendment question. 

Dave Bittner: Right, right. Well, and that's the other part of the story here is we've got a couple cases - one in San Francisco, where some public defenders are trying to quash the use of a warrant like this, and there's another one in Virginia, also. Defense lawyers are making a similar argument. They're saying that this violates their defendants' Fourth Amendment rights against unreasonable search and seizure. So what's the approach they're taking here, Ben? 

Ben Yelin: So there are really two questions when it comes to any Fourth Amendment analysis. The threshold question is whether there's been a search in the first place. Here, that's actually very complicated. So in the Virginia case in particular, the parties are battling over interpreting the facts of this case in light of the Carpenter decision, which I know we've talked about. That decision says that monitoring somebody's individual cell site location information over an extended period of seven days is a search and requires a warrant. 

Ben Yelin: Now, what the government is saying is, well, this is a much shorter period, first of all. This isn't seven days of surveillance. And geofencing is not, at least initially, monitoring one individual device. It's collecting information on all of the devices that are in a given geographical area. So they feel that that precedent doesn't apply. 

Ben Yelin: Now, the defense has responded, maybe you take that interpretation from Carpenter, but the reason we have a Fourth Amendment in the first place is to protect against what our Founding Fathers saw as something deeply offensive. And that's the notion of general warrants, where we'll issue a warrant to go search something, even if we have no suspicion that anybody did anything wrong. We'll give you permission to go into somebody's house and see what you can find. 

Ben Yelin: You know, I think the government would push back against that in saying, well, there's - based on how Google conducts this process, there is some level of specificity. We are looking for an individual or a set of individuals. We're not just kind of geofencing every single location and following the physical movements of every single device over a long period of time. We are actually working off some individualized suspicion. 

Ben Yelin: So, I mean, they - really, courts are going to have to settle that question first and foremost as to whether there has been a search. And then the second question is whether the search has been reasonable. And that comes down to whether you think these tactics are overbroad. 

Ben Yelin: And frankly, there are - I hate to be this wishy-washy, but there are compelling arguments on both sides of the issue. I think if we institutionalize the practices that Google has placed on itself, where you have an initial collection of anonymized data followed by, you know, some indication from law enforcement that they have information on a particular device, followed by, you know, then Google deanonymizing the data, that process to me is more likely to be deemed constitutionally reasonable than something that's more arbitrary, where, you know, if there was another company that didn't take these privacy protections and they gave nonanonymized data to law enforcement, then we'd be talking about something that was overbroad for Fourth Amendment purposes. 

Ben Yelin: So this is something - I'm going to be following this litigation very closely in both Virginia and San Francisco. And I will say the San Francisco case is about a robbery that took place in the Sunset District. Shoutout to that district, which is where I grew up. So... 

Dave Bittner: (Laughter). 

Ben Yelin: I'm sorry it's only becoming notorious for a crime. 

Dave Bittner: That's right. It's a bad part of town, Ben. It's a tough neighborhood, you know (laughter)? 

Ben Yelin: I will say that one of the Patty Hearst heists took place in that neighborhood, as well. 

Dave Bittner: Is that right? 

Ben Yelin: So we have a rich history. 

Dave Bittner: (Laughter) OK. I don't know if I've ever actually set foot there, so I'm just, you know, poking fun at you. 

Ben Yelin: It's wonderful. 

Dave Bittner: I'm sure it's a lovely place to grow up, Ben. 

Ben Yelin: Yes, it is. 

Dave Bittner: (Laughter). All right. Well, again, that story is from The Wall Street Journal, written by David Uberti. And we will have a link to it in the show notes. 

Dave Bittner: We would love to hear from you. If you have a question for us, you can call in. We have a call-in number. It's 410-618-3720. You can call in, leave your question, and we may answer it on the air. You can also send us an email. It is caveat@thecyberwire.com. 

Dave Bittner: All right, Ben. I recently had the pleasure of speaking with Max Kirby. He is the leader of cloud solutions at Publicis Groupe. And we covered a lot of ground here, talked about the evolution of the internet, sort of how we got to where we are today and some of the issues that we're dealing with. Here's my conversation with Max Kirby. 

Max Kirby: Up until now, the internet has kind of been one of two things. In the beginning, it was something that was largely based around content that couldn't interact with us. And then we entered this age where the internet is interacting with us. And now it's getting so advanced that we're actually talking to it, and it's talking back, right? It's emulating humans as opposed to us having to emulate machines. 

Max Kirby: Where we're going, as we start to enter what's being called the decentralized web, is a turn away from the advertiser-supported internet that we have today. And that means that private entities that are largely fueling the need to have sites or footprints of any sort - apps, as well - are not going to be funding those purely by advertising. And you're already starting to see this sort of happen. 

Dave Bittner: How are you tracking that? 

Max Kirby: Well, so there's this thing that I think is becoming a buzzword of sorts, but it's worth understanding, called the customer data platform. And you'd think that, you know, if you've been in the web for a while or in this industry in general, you know, obsessed with computers - I guess we could categorize it broadly - you'd think that a CDP, as it's called, is sort of an obvious thing to have. But for a lot of people today who aren't familiar with the web, who didn't grow up necessarily as technologists or data people, their jobs, their livelihoods, their worlds are just now starting to touch how to use customer intent and the data around it. 

Max Kirby: And if you think about Google and Facebook and Amazon and companies of this nature - you know, the big economic titans of today - they've really based their business around collecting all of the data about individuals they can and coming up with different ways to monetize it, you know, as well as exchange value back to those who they're collecting the data from. 

Dave Bittner: What is your take on that? I hear a lot of people sort of describe the decision to make so much of the internet driven by and funded by advertising. It could even be described as being the internet's original sin because so many things came of that, good and bad. But I think in this case, they're talking about the bad. What are your thoughts on that? 

Max Kirby: Well, you see it kind of playing out with the privacy movement today. I think that if advertising was the web's original sin, it might have been something that was necessary to get to this point. If you imagine - you know, go back in time to the dot-com bust. There are all these wonderful ideas about how to use the web. But the problem, of course, was monetization - connecting the business drivers to the technological drivers. Ads made that connection very close. It made it visceral and easy to understand. You know, how many clicks can you get was an easy way of understanding it. And it, of course, also shaped our understanding of the web in that direction and away from the other possibilities because it was the thing that worked, right? 

Max Kirby: So if it's the original sin, I think it's going to give birth to a different way of looking at the internet. I'm an optimist about these things, but we haven't figured out exactly what that thing is going to look like yet. And our behavior as users, as customers, as just people - I mean, we're really at the point now where the web is mandatory to survival. You need an email address and a credit card to participate with the economy now. We're never going back to a world where that's not true. 

Max Kirby: That shift in the default - right? - it's no longer a specialist thing; it's a generalist thing - has created a spotlight. And that spotlight is showing up on all the different problems with our evolution of the web as a shared force, if you like, within our society, you know, between companies and governments and people. And it's really a new thing for us to think about it as, what would the web look like if it wasn't purely supported by advertising? 

Dave Bittner: When you consider the spectrum of directions in which we might go from here, what do you suppose is possible? 

Max Kirby: Well, we're running into this pretty often, the question at the micro scale, which is I'm running an enterprise company of some sort or a division within it. I want to start using technology as an advantage. And up until this point, what I'm really just doing is paying for clicks or running ads on a site or trying to get traffic, right? Any of the folks that are running roles like that tend to have that default behavior. It was digital media. And they grew up in digital media, right? A lot of them used to work at AOL, the leaders of those roles today around the market - or companies like AOL. 

Max Kirby: And so now they're asking, what else can we do with it? And some of this is taking the form of subscription-based revenue, where you don't really need a monetizing instrument like advertising or a proxy for the value. You're just paying directly for the value. 

Max Kirby: You know, the news is trying very hard to figure out how to get that to happen, and it's sort of mandatory for their survival as they start to contend with the fact that if you're only running content, then the generation of content can become a commodity - which is why customers today just don't want to pay for news, even though paying for it is totally reasonable. 

Max Kirby: And before the web, what would you do when you wanted to go and get a newspaper? Well, you would probably go to a newsstand, and you would pay for it. I mean, you wouldn't go to a newsstand and say, I want to look at all of this for free, maybe even take it home with me, right? 

Dave Bittner: (Laughter) Right. 

Max Kirby: And so it's this strange thing that the large platforms ingrained in our behavior by going with ads because ads ultimately are a way of capturing user traffic and getting the payment for it post hoc as opposed to at the moment when you receive value, so a lot of people are moving towards subscription. 

Max Kirby: The second category, I'd say, is - in the commerce world - right? - this is where the term direct to consumer is coming up a lot. If you're a consumer packaged goods company or an auto company or anyone who sort of sells through proxies, be that retailers or dealerships, other companies that distribute your product for you, you've realized that, you know, in the world where everything is being described by data, you don't have any because the people who own that last mile with the customer are getting that data about people. And that leaves you as a supplier for someone else's technologically nascent business. And that's the second thing they're trying to figure out: you know, if you're not going subscription-based, you're trying to figure out how to open up a shop on Shopify or go direct to consumer in some meaningful way. 

Max Kirby: And then the third way that we see people figuring out - what else can I do other than get traffic to my site via ads? - is actually the reverse of the proxy conversation. For those in travel or, you know, technology, I'm talking about, like, the carriers or the electronics companies. What they're actually doing is trying to instrument around the customer so you get the connected hotel room or yet another use of the fact that you have a supercomputer in your pocket now and your ISP owns that and the electronics company who sold it to you owns that. 

Max Kirby: And whether it's via bloatware or just the fact that they are, you know, the rails over which those cars run, they're trying to figure out what else they can do with it, and that usually becomes data monetization. So everyone's trying to move away from just pure ads. And the customer data platform conversation looks different in any one of those industries, but they all have the same thing in common. 

Dave Bittner: The organization that you work at is one of the oldest in the world, one of the largest marketing and communications companies in the world. So there's a lot of history there. And I'm curious - how does an organization with that history and of that size and that influence go about putting guardrails on itself - you know, what you're capable of doing versus what you should do, setting rules for yourselves versus navigating regulations? How does that come to be? 

Max Kirby: There's about a million ways to answer that question, but I'll tell you a couple that I think are pertinent to this term called digital transformation - right? - helping our clients become tech companies themselves, right? The first is you need to think about the principles of data privacy, regardless of any of the specific laws that are applicable to your data hegemony. And what I mean there is respecting consent is largely about making the things that you do with data and the intent of the people who gave you that data line up in some meaningful way. 

Max Kirby: I mean, we're doing some research right now that is showing that people are willing to share data with you, but they want it to be sort of orbiting the value that you would give to them, right? So if you sit down at a restaurant and the waiter comes up to you and says, do you have any dietary preferences at the table or any restrictions, you're probably going to answer that question, right? Because you understand that he needs to know that information in order to bring you food that isn't going to cause any health problems for you. 

Dave Bittner: Right. 

Max Kirby: If that waiter were to come up and ask for your Social Security number... 

Dave Bittner: (Laughter). 

Max Kirby: ...You probably wouldn't because it has nothing to do with the value that you expect from the waitstaff at a restaurant. And that principle is playing out across the entire economy. I mean, it's very easy for retailers to get body sizes from customers. They'll do it. It's near impossible for any financial services institutions to collect meaningful biometric data from anyone, and it's because we sort of understand as a culture that your bank shouldn't necessarily need to know your shoe size for any reason. And there's this quantification of the term creepy that we're observing... 

Dave Bittner: Right. 

Max Kirby: ...Play out. And so what we do to help our clients understand and kind of do the right things with data and also avoid doing the wrong things with data - right? - those two sides of the coin. Data privacy is what you shouldn't do. But of the things that you can do, what should you do? That can be dictated by the information that your customers would expect you to sort of have a right to in order to serve them. But it's about serving them; it's not about becoming a surveillance institution. And so we try to push our clients to think about it that way, which brings up a bigger question about - what about those who are in the business of just collecting data for its own sake? 

Dave Bittner: Right. Well, and to me, I mean, that comes down to the issue of trust - that for the organizations who are trying to do the right thing and handle data responsibly, it only takes users having a few experiences with some of these other organizations who aren't doing the right thing - finding out that, for example, my flashlight app is tracking my every move or something like that. And it takes a long time to build that trust back when it's been lost. 

Max Kirby: It does. And I think the thing about data breaches is that everyone understands that there's this effect on your brand. There's this effect on your customers' trust. We don't quite know how long that lasts, how to fix that and - I mean, from a behavioral psychology perspective here, right? But, you know, my colleagues in cybersecurity, the clients I have that are engaged in that, they are constantly trying to prove to executives or peers, the C-suite, that there's value in what they're doing and that their budgets should go a little higher in the stack. And it's only when you don't fund those initiatives or support that in your business that you really pay for it. But the unfortunate thing about cyber is that, just like any function that's constantly trying to sort of justify itself, if you don't have a breach, then you don't notice it. 

Dave Bittner: Right. 

Max Kirby: And when businesses start to collect customer data, if they don't proportionally increase their attention towards security, they're going to eventually become a target because the more customer data that you can collect, the more valuable your business is. And it can be attacked. And once people know about that, that it works, I mean, you're only going to expect more attacks, right? 

Max Kirby: I'll just give you one other example, which is look at the acquisitions that are happening in the market today - Visa tucking in Plaid, as an example. What's going on with Mint? The things that are out there to try to quantify your financial future and help you predict what would, you know, be best for you, but also try to bring you the best credit card offer or whatever you have - those valuations are so high because of the data that resides behind those walls. 

Max Kirby: If you're trying to become a data-as-a-service business or embrace data or collect customer data, you need to proportionally increase your investments in cyber because the value that you are collecting and storing is proportionately more worthwhile to attack, right? And it's no surprise that, you know, the clouds generally have some of the most revered security teams in the market. And one of the reasons why people are adopting cloud so much is because you get that expertise in the cloud to some extent - right? - notwithstanding how you actually adopt the cloud, which is a thing we're helping clients do every day, but the storage of data just as such, right? You're renting Azure's security team or Google's security team or Amazon's security team. And the thing is those security teams got so powerful because the data they're collecting is about us as individuals. 

Dave Bittner: Where do you suppose we have to go? When we think about the evolution, the next step, to get where we want to be, what sort of changes do you think are going to be necessary? 

Max Kirby: Well, the first thing I'd tell you is that if you're not centralizing, it's going to be hard to decentralize. What I mean is that if you aren't sort of laddering up in the evolution stages towards centralization of certain assets, you know, namely customer data, then it's going to be very hard when the decentralized web arrives. I mean, things happening on the web on Mastodon or, you know, the home hosting movement, things that techies love to experiment with. You know, give it 20 years, it becomes the way that business is done generally. 

Max Kirby: And so you can't be caught flat-footed, and that means that you also have to understand how to translate to different parts of any organization because not everyone is a technologist, right? I mean, this takes a combined skill set. And I think the most valuable minds in the market today are those who understand technology, understand data, understand security but can also make it very simple for someone who has absolutely no background in the category because that's how you actually push change forward and that's how you become a leader. 

Dave Bittner: All right. Ben, what do you make of it? What do you think - interesting conversation? 

Ben Yelin: Absolutely. I would never say that your conversation was not interesting, but this one... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Was particularly interesting. I mean, I think we have seen the internet evolve. And as he talked about, there are risks to the user from this evolution, particularly when we're talking about subscription services where you're giving websites a bunch of very personal information. It's a successful financial model for the websites, but it's something we sort of do as a force of habit. And I think it's sometimes important to step back and realize how much we're giving up just to purchase these subscription services. 

Dave Bittner: Yeah. It's so interesting to me to sort of think about how much of this is an accident of history. You know, if things had gone a little bit different in a different way, suppose, for example, newspaper subscriptions hadn't been so much of a force to make things free, had we started out with more things being subscription service from the get-go, could our news organizations, would they be in better shape than they are today? You know, if something like Facebook were subscription based rather than being free and ad based, where would we be today? How different would it be? 

Ben Yelin: Well, anything that would make Facebook different than it is now probably would have been better for mankind writ large. 

Dave Bittner: (Laughter). 

Ben Yelin: But you're absolutely right. I mean, I think what newspaper publications were trying to do in an earlier age of the internet is, you know, find a reliable base of online users. And the best way to do that initially was to have free content with advertising, and that's kind of how they sucked us in. 

Dave Bittner: Yeah, a land grab. 

Ben Yelin: It is. And then we all got used to it. You know, I wanted to read more than my three allowable New York Times articles every month, and I purchased a subscription - only to the crossword puzzles, by the way. The rest of it... 

Dave Bittner: (Laughter). 

Ben Yelin: No - but, yeah, I mean, I think it's something that really is the result of a long evolution in how we've used the internet, the expectation of some of these companies and, you know, how we've adapted as circumstances have changed. 

Dave Bittner: And some of these messes we find ourselves in today are a result of policy decisions and - as you and I talk about a lot - the natural lagging nature of legislation catching up with what's going on in the real world. 

Ben Yelin: Yeah. It really does take forever. I mean, we're always reacting to the problems of three or four years ago. It's a long lag time. And that's what's so frustrating about this field is that there isn't a lot of instant gratification. You can identify a cybersecurity issue. You can identify some sort of policy issue that's violating people's constitutional rights or just an inherent sense of privacy. And it just takes a long time to effectuate change for a lot of different reasons. Yeah. It really can be frustrating. 

Dave Bittner: All right. Well, again, our thanks to Max Kirby for joining us. We appreciate him taking the time and sharing his thoughts with us. That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.