Twitter's data privacy battle.
Chris McLellan: Here's a business reason to do the right thing, and you get this - you get the ownership and control of data along with it.
Dave Bittner: Hello everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today, Ben discusses a recent Maryland case dealing with particularity requirements for cellphone searches. I've got the story of the Federal Trade Commission suing an online data broker over surveillance concerns. And later in the show, my conversation with Chris McLellan from the Data Collaboration Alliance. We're discussing the hefty fine Twitter is facing in the battle over data privacy. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, before we jump into our stories, we have a little bit of follow-up here. A listener named Casey wrote in and said, great show, guys. Questions - once law enforcement obtains data subject to a geofence warrant, for what other purposes, if any, can they use that data - other investigations, general search inquiries? Also, is that data subject to public disclosure, such as in response to a FOIA request? Ben, what do you think?
Ben Yelin: So the answer to all of those questions is generally yes. And I can see why that would be somewhat disturbing. If you have a valid geofence warrant that's signed by a judge, anything obtained pursuant to that warrant is going to be fair game for law enforcement. You don't have to obtain a separate warrant to get information. So you have a geofence warrant, you discover something inherent in that search - maybe XYZ's device was at a particular location at a particular time. At least as it pertains to that information, you don't have to get any sort of separate judicial authorization. Now, if you wanted to search their device, that's a separate question. And you'd need, as we're going to talk about in our next story, a search warrant.
Dave Bittner: OK.
Ben Yelin: In terms of whether this is discoverable via FOIA, there are certain limitations - I am not a FOIA expert by any stretch of the imagination. There are limitations on what qualifies under FOIA that can be released as public information. I think unless this jeopardizes an active criminal investigation or a national security investigation, this is probably something that you could at least get information on the search itself via FOIA. Again, a lot of these geofence warrants are coming from state courts. Every state has their own version of FOIA. So I think we think of FOIA as a federal statute, but many of the cases that pertain to geofence warrants are from state courts. So I don't know every state's rule as it relates to the sensitivity of documents available under Freedom of Information requests...
Dave Bittner: Come on, Ben.
Ben Yelin: I know, I know.
Dave Bittner: (Laughter).
Ben Yelin: I got to do my - some more half-assed...
Dave Bittner: Slacker.
Ben Yelin: ...internet research.
Dave Bittner: Yeah (laughter).
Ben Yelin: But, yeah, I do think, at least the general information about the search itself would be discoverable under FOIA, unless it falls under some sort of public safety, national security exception.
Dave Bittner: All right. Well, thank you, Casey, for writing in. We would love to hear from you. You can email us. It's caveat@thecyberwire.com. All right, Ben, let's jump into our stories, here. Why don't you kick things off for us?
Ben Yelin: So my story comes from a Twitter account that covers Maryland appellate court cases. Talk about a niche interest.
Dave Bittner: (Laughter).
Ben Yelin: But, certainly for me, I'm a - I'm an active follower.
Dave Bittner: You're on the edge of your seat (laughter).
Ben Yelin: Exactly. So I was very intrigued by a case that came out this past week called Richardson v. State of Maryland. So Mr. Richardson was caught in a fight - a brawl that broke out in the back of a school in rural Maryland. And this is a Maryland case, decided in Maryland state court. So he got into a brawl. Officers got involved - public school safety officers. He had a backpack, and he threw his backpack to the ground in the course of this fight. He tried to dive for the backpack. Law enforcement dove for the backpack, as well. And Mr. Richardson, unfortunately for him, lost out to law enforcement. So they had the backpack and he didn't. He decided to flee the scene, probably a wise decision. He left without his backpack, and he escaped from police for the time being.
Ben Yelin: So law enforcement, incident to what would have been an arrest - so they didn't get the chance to arrest him, but they would have arrested him - searched his backpack and discovered that there were three cellphones in there, along with a firearm. And they had reason to believe that Mr. Richardson had been involved in a previous armed robbery. So they went to the Maryland Circuit Court and applied for and obtained a warrant for literally everything on his cellphone. And I'll read the language of what was going to be seized here 'cause it gets to the breadth of what law enforcement was looking for and what, at least, the lower court allowed them to look for. So the warrant authorized the officers to search for, quote, "all information, text messages, emails, phone calls, incoming and outgoing, pictures, videos, cell site location data and/or applications, geotagging, metadata, contacts, emails, voicemails, oral and/or written communication and any other data stored or maintained inside the phone." So that's pretty much everything.
Dave Bittner: The whole enchilada (laughter).
Ben Yelin: Yeah, pretty much everything besides your notes application.
Dave Bittner: OK.
Ben Yelin: And I'm sure they could have gotten access to that, too.
Dave Bittner: Right.
Ben Yelin: If they wanted to know your Panera orders, that would have been there for the taking.
Dave Bittner: (Laughter) Right.
Ben Yelin: There was also no temporal limitation on the search. So this was any of those types of communications that I just mentioned, going back, basically, in perpetuity.
Dave Bittner: Right.
Ben Yelin: You could collect any of them. So the Maryland Court of Special Appeals, the intermediate court, upheld the constitutionality of the search, and Mr. Richardson appealed to the Court of Appeals, which is the highest court in Maryland. And they said that this type of search violates the Fourth Amendment and equivalent case law here in Maryland. The Fourth Amendment says that in order for there to be a constitutional search - in order for there to be a reasonable search - you have to allege some type of particular thing that you are going to search or seize. So the reason we have a Fourth Amendment is because - not to get too historical here - in the British colonies, they used to have these things called general warrants...
Dave Bittner: Right.
Ben Yelin: ...Where a king would send his minions to gather any contraband that they could find in somebody's house.
Dave Bittner: So like a fishing expedition.
Ben Yelin: Exactly.
Dave Bittner: Yeah.
Ben Yelin: And that was very offensive to our founding fathers. In the colonies, those were referred to as writs of assistance, and they were very disfavored. It's one of the reasons why our founding fathers came up with the Fourth Amendment. So, as a result, we have this particularity requirement. You have to describe either the things that are going to be seized or the person that's going to be searched. And this warrant, according to the Maryland Court of Appeals, does not allege any sort of particularity. In fact, it is as far from particularity as you could possibly get. At least in Maryland, in order to have a valid constitutional search of somebody's cellphone when obtaining a warrant, you have to actually describe specific information that you're seeking - so, Richardson's whereabouts on the night of X, or his text message communications with so-and-so about this particular event.
Dave Bittner: Right.
Ben Yelin: It can't just be everything on the device.
Dave Bittner: So they're - if they're interested in him for a prior alleged crime, then they would make the case to the judge that we want to - we're looking for information related to this crime.
Ben Yelin: And I think that's basically what they did here. There just was no limiting principle. I'm not sure how they could have crafted an application, frankly, without being this vague 'cause if you're looking for just information about a crime that happened in the past, on a cell phone, that could be anywhere. I mean, it could be in a TikTok or Snapchat.
Dave Bittner: Yeah.
Ben Yelin: We don't know how the kids communicate these days.
Dave Bittner: (Laughter).
Ben Yelin: So I really do think that this might make life a little more difficult for law enforcement 'cause they might not know exactly where to look on a device for information about a specific crime. But at least in Maryland, that's what they're going to have to do. They're going to have to at least have some inkling of where that information might be or just some type of limiting principle, so a limited duration or limited only to text messages or voicemails, etc.
Dave Bittner: Let me ask you this.
Ben Yelin: Sure.
Dave Bittner: Does this principle extend to the real world? In other words, if I got a search warrant to search this person's home, would I have to say, I want - we want to search the safe that's in the upstairs bedroom under the bed, or can we just ransack the place?
Ben Yelin: You can ransack the place, but you have to describe, in order to obtain a search warrant, what you're looking for. So we are looking for drugs.
Dave Bittner: Isn't that what they did here on his phone, though?
Ben Yelin: Yes. But you're also - there are some other limiting principles related to even a home search. You're just searching one home.
Dave Bittner: OK.
Ben Yelin: And the duration just naturally is limited. So it's not like you're going to have officers going in and out for - 24/7 or obtaining surveillance footage of everybody going in and out of the house and everything that happened in that house over an extended period of time. It's limited to one instance.
Dave Bittner: OK.
Ben Yelin: So we don't really see this as much in the physical world, and it gets to the nature of cell phones. They quote the Riley case, Riley v. California, where Chief Justice Roberts says, once again, cell phones are special.
Dave Bittner: Yeah.
Ben Yelin: They contain multitudes of our personal relationships, our religious, political affiliations. They are more than just a device, and I think that's the philosophy the court is taking here. The hilarious kicker to this story is it actually does not provide any relief for Mr. Richardson himself, and that's because of the good faith exception. So the good faith exception basically says if law enforcement are acting under a good faith interpretation of what they assume the law was at the time that they conducted the search, then anything obtained from that search is admissible in court and can be used against a criminal defendant. So basically what happened here is they did find incriminating information on Richardson. Even though, as a principle, the court is saying you can't have a search this broad of a cell phone with no level of particularity, it's OK in this case because law enforcement was acting in good faith. We hadn't issued this decision yet, so they didn't know what the status of the law was in Maryland. So you can imagine this whole case is kind of...
Dave Bittner: It kind of loops back on itself, doesn't it?
Ben Yelin: Yeah, I mean, it's just - poor Mr. Richardson. It's like, this applies to everybody after you, but for your particular search, you're still screwed, and you are going to prison.
Dave Bittner: Right. But thanks (laughter).
Ben Yelin: Yeah, exactly. I'm sure he's very pleased that his case will be foundational in Fourth Amendment jurisprudence in Maryland.
Dave Bittner: Wow.
Ben Yelin: But he himself is going to be locked up.
Dave Bittner: Wow.
Ben Yelin: But I do think it's just a really interesting case. I mean, cell phones are such an enigma in all types of Fourth Amendment constitutional cases because they're very hard to pin down. They're not just a thing. It's not just a piece of paper in your house. And it's almost more just part of your body. I mean, it's so foundational to who we are and how we operate in modern society.
Dave Bittner: Right.
Ben Yelin: And I think that's exactly what the court was getting at in this opinion. And for future cases in Maryland, you're not going to be able to just say, give me the phone and let me search everything and see if I can find some incriminating information, even if I'm searching text message archives from 2015. That's not going to fly in Maryland anymore.
Dave Bittner: What about at the federal level? I mean, is this an area where we're still looking for some clarity, or does the Riley decision provide that?
Ben Yelin: So the Riley decision simply says that the government has to obtain a warrant to search a cell phone. They can't simply seize a cell phone and search it incident to arrest without obtaining a separate warrant. Here, in this case, they actually obtained a warrant. So Riley is controlling in the sense that they did have to obtain a warrant. But this is about what has to be contained in the warrant itself.
Dave Bittner: I see.
Ben Yelin: This is only applicable right now in the state of Maryland, but it could be persuasive to other state court systems and to federal courts about coming up with some sort of limiting principle related to the particularity requirement in these types of searches. So judges are always speaking to one another in their judicial decisions. And what that means is some judge in another state is going - if they get a case like this, they're going to look at the lay of the land and see what other states have done, what other state courts have done. And so now we have what seems to be a pretty foundational case in this area of the law. And it could be something that state courts, and eventually federal courts, come to adopt. If you are a criminal in a state that is not Maryland, I would...
Dave Bittner: So present company excluded (laughter).
Ben Yelin: Yeah, exactly. So me and you are going to have to be a little bit...
Dave Bittner: (Laughter).
Ben Yelin: ...More careful. For everybody else in the other 49 states, it still means that law enforcement theoretically could obtain a warrant and search anything on your phone. It's - this is really limited to the one jurisdiction in which it was decided.
Dave Bittner: So let me ask you this, I mean, as sort of a side question for my own curiosity. Are there U.S. states that are more influential when it comes to these sorts of things than others? Do the feds look at whatever - the population, the professionalism, the history of - you know, does California have more sway than Mississippi, or does it come into play at all?
Ben Yelin: It does. There are certain courts - just by their nature, by their history, are looked at as kind of carrying an extra - we talked about the Delaware Chancery Courts in a previous episode as it relates to disputes among large corporations. That's one example of a court that is a greater among equals in terms of its peer states. In terms of state court systems, I'm not necessarily sure that Maryland state court system is revered above its peer states. I will note - and this might just be a coincidence, it might be our proximity to D.C., it might be that we have a lot of criminals, a lot of...
Dave Bittner: (Laughter).
Ben Yelin: A lot of crime shows are based in Baltimore.
Dave Bittner: Yeah.
Ben Yelin: So that might have something to do with it. A lot of the surveillance law cases - these types of cases have come through the Maryland court system, a disproportionate amount. It's something I've noticed in just doing research. And I don't know if there's a good reason for it. I don't want to disparage any of my Maryland colleagues. I'm sure they're great judges, great attorneys, they're writing great briefs and great opinions. But it could just be a coincidence that Maryland has kind of taken a lead in jurisprudence on surveillance issues.
Dave Bittner: Yeah. Interesting. All right, well, we will have a link to that in the show notes there. My story this week - interesting development. I'm linking to a story from The New York Times by Natasha Singer. And this is about how the FTC, the Federal Trade Commission, has sued an organization over their tracking data that they say - the FTC claims - could expose folks who are doing things like visiting abortion clinics, visiting health care places, homeless shelters, that sort of thing. The FTC has sued a company called Kochava, which is a data broker. And they're saying that the company's sale of geolocation information on tens of millions of smartphones could expose people's private visits to places like abortion clinics and domestic violence shelters. This covers something that we've talked about here plenty of times. I mean, we talk about how, you know, if you're tracking someone's individual mobile device, the chances are, if I know where you live and where you work, I can nail down what device is yours.
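(To make that re-identification point concrete, here is a minimal Python sketch, assuming a toy feed of anonymized ad-ID location pings; every identifier, coordinate and threshold is invented for the example.)

from collections import Counter

def near(ping, place, tol=0.001):
    # Crude bounding-box test; roughly 100 meters at mid-latitudes.
    return abs(ping["lat"] - place[0]) < tol and abs(ping["lon"] - place[1]) < tol

def likely_device(pings, home, work):
    # Score each "anonymous" device by how often it is seen at the known home
    # location overnight and at the known work location during business hours.
    scores = Counter()
    for p in pings:
        if near(p, home) and (p["hour"] >= 22 or p["hour"] <= 6):
            scores[p["device_id"]] += 1
        elif near(p, work) and 9 <= p["hour"] <= 17:
            scores[p["device_id"]] += 1
    return scores.most_common(1)  # best-scoring device is very likely the target's

pings = [
    {"device_id": "ad-id-7f3a", "lat": 39.2904, "lon": -76.6122, "hour": 23},
    {"device_id": "ad-id-7f3a", "lat": 39.2500, "lon": -76.7100, "hour": 11},
    {"device_id": "ad-id-91bc", "lat": 39.3000, "lon": -76.6000, "hour": 12},
]
print(likely_device(pings, home=(39.2904, -76.6122), work=(39.2500, -76.7100)))
# -> [('ad-id-7f3a', 2)]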
Ben Yelin: Right, exactly.
Dave Bittner: And once I've done that, that's the ballgame. I can track - if I have access to that data, your location data, I can pretty much track wherever you go and know that it's you with a high degree of certainty. And so the FTC is going after this company, claiming that they make it too easy for people to do exactly that. Can we unpack this here? I mean, 'cause - start with some basics here. Why the FTC?
Ben Yelin: So the FTC has jurisdiction over consumer protection issues, and this is related to consumer data privacy. And the enforcement arm is the FTC. Additionally, President Biden - I think sometime in July after the Dobbs decision - drafted an executive order calling for a federal crackdown on companies such as this one that have loose privacy protocols related to reproductive rights. So sensitive health data, anything that involves digital surveillance related to reproductive health care services - the executive order is intended to hold these companies accountable that aren't particularly strict with this information and aren't protecting consumer data. And the FTC has jurisdiction because this is a consumer protection issue.
Dave Bittner: And so the FTC filed suit against this company. How does that play out? What does that actually mean? What do they have the power to do?
Ben Yelin: So the FTC can do a number of things. They are empowered to issue civil sanctions. In some circumstances, they could issue criminal penalties. Usually, there is some sort of negotiation between the FTC and the company for some sort of equitable solution where the company would pay a fine for having these types of deceptive consumer practices or practices that made the sensitive consumer data available. So they pay a fine over it, they promise to take certain proactive steps to protect consumer data, stop what they're doing, and that would be part of some sort of settlement. It depends on how much this company, which I believe is based in Idaho...
Dave Bittner: Yeah.
Ben Yelin: ...And is somewhat of a smaller company compared to some of the big tech cases we've seen - I don't know how small it is, but it's not one of the Big Five, for example. It is a digital marketing and analytics firm. But it depends on how much, really, they want to contest this issue if and when it makes it into court. The FTC has a lot of enforcement tools at its disposal, and by initiating this lawsuit, they're showing their seriousness and how much they want to protect this data. And we're not talking about an insignificant number of records here. I think according to the FTC complaint, they obtained something like 61 million unique location - or user location data tags...
Dave Bittner: Right.
Ben Yelin: ...Basically.
Dave Bittner: Right.
Ben Yelin: So we're talking about a real hefty amount of potentially sensitive data. So I think it's the FTC bringing this to court saying if you don't ameliorate these practices of being a broker for the sensitive data, then we will bring the full force of the federal government to enforce our laws, to protect our consumers.
Dave Bittner: Now, this article points out that Kochava filed a preemptive lawsuit against the FTC earlier this month.
Ben Yelin: No, I sue you.
Dave Bittner: (Laughter).
Ben Yelin: You're not suing me. I'm suing you.
Dave Bittner: Yeah, that's right. You can't fire me 'cause I quit.
Ben Yelin: I quit. Yeah.
Dave Bittner: (Laughter) And so they're saying that they've done nothing wrong, that they're - they've complied with all the laws. They say that the location data comes from third-party information brokers who say consumers consented to the data collection. Well, of course they did. I mean, that's - I mean, that's the EULA, right? That's...
Ben Yelin: Right.
Dave Bittner: ...Somebody using a flashlight app and having their...
Ben Yelin: Agree.
Dave Bittner: ...Location tracked.
Ben Yelin: Yeah, exactly.
Dave Bittner: Yeah.
Ben Yelin: Yeah, we see this frequently. There are suits and countersuits. Sometimes you sue somebody in anticipation that they're going to sue you. Not to get too much into civil procedure, which bores everybody except a very...
Dave Bittner: Except you (laughter)?
Ben Yelin: ...Nerdy bunch of law professors...
Dave Bittner: Right?
Ben Yelin: ...But those claims are consolidated into a single case. So if it's...
Dave Bittner: I see.
Ben Yelin: ...The same parties, and the claim relates to the same set of facts or incidents, it'll be one case. But there could potentially be a counterclaim against the FTC from this company involved in the case. So at least theoretically, the company could get some type of relief from the FTC for, I don't know, harassing them or subjecting them to unjust civil or criminal penalties, something like that. But I think more likely, it's the FTC that would win some type of injunction to force this company to improve its practices or to levy some type of fine on them to make them comply.
Dave Bittner: So how does this play out on the larger stage of privacy protection here? Suppose the FTC is successful. Is that just something that informs the legislation that could come along, or does this - is it something that Supreme Court justices would consider in their own, you know, thinking about these sorts of things? Where does it fit in?
Ben Yelin: Good question. I don't think it necessarily would change something like the Supreme Court. It is a little bit of a niche consumer protection issue, and it doesn't really get into constitutional claims. There are going to be constitutional claims on this subject: where somebody is prosecuted for getting an abortion and digital surveillance - data brokerage, whatever - was used to gain evidence for that prosecution, there might be a Fourth Amendment claim.
Dave Bittner: I see.
Ben Yelin: That's not what's going on here. This could inform Congress. If Congress believes that the FTC is setting some type of new standard here, Congress, in its infinite wisdom, might decide to codify that into law so that a future FTC that wasn't so gung-ho about this type of enforcement wouldn't completely drop the ball and say, yeah, we don't care. We don't have to. I know you like to quote Lily Tomlin from that great...
Dave Bittner: (Laughter).
Ben Yelin: ..."Saturday Night Live" sketch.
Dave Bittner: Right, right.
Ben Yelin: I know you're a fan of that, so...
Dave Bittner: (Laughter) Right, right.
Ben Yelin: So I think Congress could use this as a jumping-off point for their own data privacy legislation. Now, the timing is a little tough here 'cause these proceedings between the FTC and this company could last a long time. You and I have talked about this with a number of our guests. I mean, there's a real effort by the end of this calendar year to get some type of federal data privacy law enacted. So I'm not sure if this decision will play out in that legislation, assuming that something gets passed this year. But yeah, if I cared about this issue - particularly about digital privacy related to reproductive rights - and I saw that the FTC was initiating this enforcement, I would try to codify whatever the FTC says is the violation into a statute so that a future FTC run by commissioners of one's political enemies doesn't immediately reverse and say, actually, this type of data collection is acceptable.
Dave Bittner: I see. Wow. All right. Well, we will have a link to that story in the show notes as well. Of course, we would love to hear from you. If you have something you'd like us to discuss on the show, please do email us. It's caveat@thecyberwire.com.
Dave Bittner: Ben, I recently had the pleasure of speaking with Chris McLellan. He is the director of operations at a nonprofit called the Data Collaboration Alliance. And our conversation centers on the recent $150 million settlement between Twitter and the FTC - FTC-heavy show today.
Ben Yelin: Yeah.
Dave Bittner: Resolving...
Ben Yelin: They didn't pay us (ph).
Dave Bittner: ...Charges that Twitter misused personal data over a period of about six years to help sell targeted ads. Here's my conversation with Chris McLellan.
Chris McLellan: The Data Collaboration Alliance is a registered nonprofit organization. It's based in Canada, but it has a global outlook. And its mission is really to advance two things simultaneously. The first is to support the data management technologies, protocols, frameworks and standards that help organizations and people get control of data. And that's our primary objective. Right now, data is far from controlled for reasons we can probably get into. But as far as the mission goes, control is the first. But control is not the endgame of our mission. Control is just the starting point. What control leads to, in our worldview, is collaboration. Once people have agency and ownership over their information and organizations have agency and ownership of their information, it will lead to more and better and more frequent collaboration on data - on data that is largely uncopied.
Chris McLellan: In principle, the way we approach data management, and what needs to be done in that realm, is similar to how societies already protect things like money and intellectual property: we make those things difficult to copy in order to preserve their value. So again, our worldview is that if we all agree that data has value, both personal and organizational data, then maybe we ought to stop copying it. And so that leads to, like I said, our support and our advocacy for new standards, protocols and technologies that either very much reduce and minimize the copying of data or even eliminate copies altogether.
Dave Bittner: You know, I think it's fair to say that a lot of people have a certain sense of resignation when it comes to how their data is being distributed and shared and, you know, the various EULAs that we have to click through to get to the services that we want. What sort of things are you advocating here to help us get better control of this?
Chris McLellan: Yeah, well, you're absolutely right in saying that. The situation is deeply asymmetrical right now. As an end user of an application, you log in, and once you do that, you'll be asked to sign a consent form or accept a privacy policy, and then off you go. But once you do that, the organization or the developer or whoever owns the app is really in the driver's seat with regard to the data you either contribute or passively contribute through your activities, such as clicking on buttons and on images and videos and stuff like that. At that point, you have very little control over your information, and the owner of the app has all the control, including the ability, oftentimes, to sell it on to third parties or to data aggregators. And even on the surface, I think, that seems like a very unfair equation. I often describe that as a form of data cooperation but not collaboration. Cooperation means you are working together but for different goals. Collaboration means you're working together on the same goal.
Chris McLellan: So when we think about something like Facebook or Twitter, yes, we're cooperating on data. I'm giving them my profile information and my clicks and my likes and my content that I post. But their objective as a business is very different. It's to sell advertising. And so, sure, from a data point of view, we're cooperating, but we're not collaborating.
Chris McLellan: So imagine if that situation was turned on its head, where I had full control of the information I contributed to the app, and I was granting the app provider - the Twitter, the Facebook, for example - access to it. But I would be able to choose when, where and how I would grant that access. And that access could be for things like, oh, I don't mind - I'll grant access to my information to share information with my friends or discover new friends or something like that. But I don't want to grant you access to package my data to sell to advertisers, or I don't want you to give my nonanonymized data to advertisers. So that's the sort of situation where, when you move from cooperation to collaboration, the power dynamic readjusts itself to what I think most would agree is closer to a just situation with regard to who's putting what into the relationship.
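(One way to picture the access-granting model Chris describes is a purpose-scoped grant: the user lists the purposes each field may be read for, and every read must declare its purpose. This is a hypothetical Python sketch, not any real platform's API.)

class PermissionDenied(Exception):
    pass

class UserRecord:
    def __init__(self, fields):
        self._fields = fields  # e.g. {"email": "..."}
        self._grants = {}      # field name -> set of purposes the user allows

    def grant(self, field, purpose):
        self._grants.setdefault(field, set()).add(purpose)

    def revoke(self, field, purpose):
        self._grants.get(field, set()).discard(purpose)

    def read(self, field, purpose):
        # Every read must name a purpose; unapproved purposes are refused.
        if purpose not in self._grants.get(field, set()):
            raise PermissionDenied(f"{field!r} not granted for {purpose!r}")
        return self._fields[field]

me = UserRecord({"email": "user@example.com"})
me.grant("email", "find_friends")        # sharing with friends: allowed
print(me.read("email", "find_friends"))
try:
    me.read("email", "ad_targeting")     # packaging for advertisers: refused
except PermissionDenied as err:
    print("blocked:", err)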
Dave Bittner: And what sort of things would have to be put in place to have something like that come to pass?
Chris McLellan: Yeah. Well, I can tell you quite readily what needs to go away.
Dave Bittner: (Laughter).
Chris McLellan: And what needs to go away - what's the enemy? So what we're talking about here, really, like I said at the outset - the first part of the Alliance's mission is control. And that is a precondition of meaningful collaboration. You can't collaborate on fragmented things, right? So what fragments data? Well, there's two things that fragment data, and this is within any and every application in the world. The first is data silos. Like, there's an app for everything and a database for every app. And so that creates a situation where, say, a big bank might have thousands of individual databases containing information.
Chris McLellan: So data, firstly, is fragmented, and it's fragmented because every time we want to build a new application, we create a new database that contains its own data. And that's sometimes referred to as a data silo. And so that's the first problem. The second problem is apps depend on data from other apps. And so what happens is the data from any given app is copied, often thousands of times a day, routinely and very much under the radar of most human eyes, because it's done through a process known as data integration.
Chris McLellan: So data integration is really a way of making copies of data between those silos in order to, you know, achieve outcomes. Like I said, the truth is that new apps require data from old apps to function. Other times, you want to make copies to aggregate data for purposes of, like, analytics and stuff like that. So the two problems here with regard to control are silos and copies, and silos and copies are kind of how you would define IT landscapes today.
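(A toy Python illustration of why copies defeat control, assuming two invented app databases and a nightly sync job: once a row is duplicated into another silo, the source system's deletions and permission changes no longer reach it.)

crm_db = {"user-42": {"email": "user@example.com", "consented": True}}
analytics_db = {}

def nightly_etl():
    # Classic copy-based integration: replicate rows from one silo into another.
    for key, row in crm_db.items():
        analytics_db[key] = dict(row)  # a full, independent copy

nightly_etl()
del crm_db["user-42"]           # the user deletes their account in the source app...
print(analytics_db["user-42"])  # ...but the copy lives on in the other silo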
Chris McLellan: And I think it's important to point out this isn't necessarily, you know, done because of evil. This is just how technology has evolved. Oracle made available the first relational database in around 1979. A few years later, we started to build applications in this way. And, you know, there's no question that we all love apps. Apps solve life's little problems. They solve business challenges. So apps aren't going away. But how we build apps needs to fundamentally change if we're to introduce that control I'm talking about.
Dave Bittner: Well, let's talk about a specific case. There was the recent case where Twitter was fined by the FTC or they agreed to a $150 million settlement. They were misusing some personal data from users. What was going on here?
Chris McLellan: Well, there's a couple of things to unpack here. I guess the very simple answer is that users of Twitter were misled in terms of the terms and conditions and policies that stated how their data - their phone numbers and email addresses - was going to be used. On the surface, or at least as far as anyone was able to determine, it was for operational reasons, but in fact, that data was also going to be used to supply advertisers with their information so they could be targeted with ads on the platform and possibly elsewhere. So it was sort of - I don't want to call it a bait-and-switch situation, but there was a serious omission of intent in the use of the data that they were collecting.
Chris McLellan: And to some extent, that's unforgivable because it's people's phone numbers and email addresses. But I think it takes the focus off the real issue, which we at the Data Collaboration Alliance focus upon, which is this - imagine the same situation, but Twitter had got its act together and the policy clearly stated to people that their information was going to be packaged and sold to data aggregators for advertising purposes. OK. And let's say some people opted out of that and said, no, I don't want you to do that. Even in that situation, we come back to control. Does that mean that the data of the people who opted out would never surface in places they don't want it to? And the answer to that is really no, because of the way complex technology ecosystems in companies as big as Twitter work. As I mentioned earlier, the fact is that data is all copied routinely between databases within Twitter, but Twitter is also part of a bigger data ecosystem of advertisers and partners and suppliers.
Chris McLellan: And data gets integrated - particularly sensitive and customer data, because in a consumer environment or app, customers are kind of at the center of the universe. That data gets copied all the time. And so there's an illusion of control - people might think that companies like Twitter have control when, in fact, they don't. And so I often describe consent forms and privacy policies as largely meaningless for that reason.
Dave Bittner: Do you suppose a settlement like this, $150 million, is that something that gets the attention of some of these big platform providers or other folks who are dealing with people's data?
Chris McLellan: I don't think so. I stopped tracking fines out of Europe under the GDPR regulations about a year ago. I mean, what's the number we're waiting for that's going to change things - half a billion dollars to Facebook in Ireland? You know, it hasn't changed anything. And frankly, these companies can afford to pay these fines and absorb them as a cost of doing business. And I think the real issue with fines isn't just that they're pretty ineffective against the biggest, you know, organizations in the world, who can afford to pay them, but more so that they send a very problematic signal to the innovators of the world - the startups, the companies that are trying to solve problems in health care and in civil society and, you name it, sustainability. Do we really want an innovation economy that's, you know, based on a playing field of fear, of fines and breaches, rather than one that is supportive and really encourages creative problem solving?
Chris McLellan: And so I don't think fines are the answer. I think the answer is encouraging the use of new standards, protocols and technologies that help address the root causes of data insecurity in the first place - which, again, we at the Alliance have identified as silos and copies. And I'm sure we'll be able to talk a little bit about some of the technologies and frameworks that are addressing that. But in my opinion, fines won't change the biggest of companies, and I stopped paying attention to them some time ago.
Dave Bittner: So do we suppose that a regulatory regime is in order here? I mean, are we looking for, for example, something in the U.S. at the federal level, you know, a GDPR for us?
Chris McLellan: Yeah. Well, I think, absolutely, some of the ongoing work at the federal level for a national privacy law seems to have momentum again. But we have been here before, so it's a little bit wait and see. There does seem to be a bit more wind in the sails this time. Obviously, a lot of states in the U.S. aren't waiting for that - California hasn't, and increasingly others: Utah was recent, Virginia, Maine, you know. I wouldn't call them all GDPR-equivalent - I would call California's pretty darn close. But the states are starting to take their own initiative on these things.
Chris McLellan: And so is regulation the answer? It's part of it. But I guess from my side, I'm more excited about the potential for new technologies, new frameworks, new protocols to create a win-win situation. So I'll give you an example. One of the frameworks that we support is called zero-copy integration. And the words zero copy kind of give you a hint at what it's all about, which is the ability to build new applications from a platform, instead of a database - a platform that can support hundreds of applications from the same data and not make copies of it or fragment it.
Chris McLellan: What that means is that the controls you set on who can see the data and do what with it are very easy to enforce universally. You're not having to change those controls app by app by app. And that gives you a sort of an idea of what the future holds for data ownership. It's building apps differently. And zero-copy integration, which is now in a 60-day public consultation period in Canada - meaning that in about 60 days, it's poised to become a national standard - provides six rules in a framework that developers can observe to not only stop making copies of data and stop fragmenting it into silos, but to give all data stakeholders - end users, partners, you name it - real agency and control of the information by not making copies of it.
Chris McLellan: And the carrot here is that it's a way faster and more efficient way to build new technologies than the old, fragmented, siloed way. So it's just better for business. And that's the sort of standard that should get people excited because it's not all stick. It's not a fine. It's like, here's a business reason to do the right thing. And you get this - you get the ownership and control of data along with it.
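(Here is a loose Python sketch of that idea - many apps reading one uncopied store through a single, universally enforced access policy. It illustrates the general zero-copy principle only; it is not the actual six rules of the proposed Canadian standard.)

class DataLayer:
    def __init__(self):
        self._records = {}  # the single, uncopied store shared by all apps
        self._policy = {}   # (record_id, app) -> set of fields that app may see

    def put(self, record_id, record):
        self._records[record_id] = record

    def allow(self, record_id, app, fields):
        self._policy[(record_id, app)] = set(fields)

    def read(self, record_id, app):
        # Apps get a filtered view; nothing is duplicated into an app-owned silo,
        # so changing the policy here changes it for every app at once.
        visible = self._policy.get((record_id, app), set())
        return {k: v for k, v in self._records[record_id].items() if k in visible}

layer = DataLayer()
layer.put("user-42", {"email": "user@example.com", "phone": "555-0100"})
layer.allow("user-42", "support_app", {"email", "phone"})
layer.allow("user-42", "ads_app", set())     # revoked in one place, for all apps
print(layer.read("user-42", "support_app"))  # {'email': ..., 'phone': ...}
print(layer.read("user-42", "ads_app"))      # {}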
Dave Bittner: You know, it kind of reminds me of this - I don't know - ongoing fantasy that I've had when it comes to my medical data, which is, wouldn't it be great if I could fill it out once, and every time I go to a different doctor or a different specialist just, you know, point to that database. Here's me. Here's my medical history. And, you know, I can authorize you to take what you need and maybe restrict you. If, you know, if I'm going to the urologist, that's a different thing than going to the allergist or - you know. Is that along the lines of what we're talking about here?
Chris McLellan: Definitely. Now, there's different scenarios on how that can play out. But I always say to people, it's about control. So long as you have control of something, do you care if you own it? I mean, do you care if your personal information is kept in, say, a hospital or on your smartphone, so long as you have control of it? And so I think what we'll see in the future, as people are encouraged through regulations and laws, but also through just some efficiency outcomes, to adopt frameworks like zero-copy integration, is that you'll see startups form, you know, and companies form, and data be managed in different ways.
Chris McLellan: So one could be a data wallet. You know, what you're describing could reside on your phone, and that's got all your core medical information, and you could grant access to that to a medical service provider or medical insurer. That's one way it could work. Another way is that, you know, that medical insurer adopts the framework I was describing - zero-copy integration - to build apps where control is possible. Data is not copied. Data is not fragmented. So they could pass those controls on to you.
Chris McLellan: And so your data is physically in their environment, but you're in control of it. Someone mentioned to me the other day - you know, if you loaned your car to your child and a fender bender occurred, do you care, at that point, that you own the car, or that you've lost control of it? You'd rather have control than ownership. So this is really about control.
Chris McLellan: Where the data actually physically resides, I think, can be misleading. But sure, I can imagine decentralized approaches, such as those advocated by blockchain, as well as others that are more centralized, enabled by technologies like dataware and data fabric. But at the end of the day, if it delivers control and it supports developers in their ability to solve problems, then it should be OK. And as I mentioned, there are a couple of technologies that are helping move towards that end for sure.
Dave Bittner: Is there a particular timeline that you and your colleagues there at the Data Collaboration Alliance feel like we should be on here?
Chris McLellan: That's a really good question. There's different answers - it depends who you are, in some respects. I mean, I'm of a certain age, and my data's out there, and I was an early adopter of digital technology. So...
Dave Bittner: Right.
Chris McLellan: ...When it comes to control and ownership of my information, I think the horse has left the barn. It's out there, and I can find it. I know it's out there, and I've let it be out there to a large extent. I think it's much more poignant when you start to talk about children and children's data, and I think we're seeing a lot of negative outcomes of people misusing children's data on social media in particular. So as apps become embedded in the lives of our children, sometimes as young as 3 or 4 years old, I think the urgency has been with us for some years now - we need to figure this stuff out. Otherwise, we're really enabling a whole generation to be targeted with weaponized information for their entire lives. And I don't think that's a situation any of us want to see or support.
Chris McLellan: So it's an interconnected world, and it doesn't take everyone to jump. It just takes important economic entities - like the European Union, the United States or even just California, which, let's remind everyone, has an economy that's, I think, the 10th biggest in the world or something. In an interconnected world where data knows no boundaries, it just takes a few of these jurisdictions to put in truly meaningful data protection legislation to change a lot of the world, 'cause people aren't going to want to not do business in California or Europe. So that's all very interesting and happening. But the timeline, to me, is yesterday, I suppose.
Dave Bittner: Ben, what do you think?
Ben Yelin: Twitter did something wrong? I'm shocked.
Dave Bittner: (Laughter).
Ben Yelin: I thought it was a really interesting conversation. One thing that stuck out to me is how he's changing the way we look at these types of cases. We've looked at them as the consumer accepting the terms and conditions in the EULA, where, really, all the power is in the hands of the companies collecting the data 'cause they're the ones who write the rules, and we're just the ones clicking accept.
Ben Yelin: I think what the guest is talking about is reorienting that relationship so it's the consumer who has more active control over the data that he or she wants released. It's kind of a nudge approach: if you just switch incentive structures, that might be the simplest way to put more power in consumers' hands, just by making it consumer-centric and not about what the company wants to protect and what they're trying to sneak by you in some of these EULAs. So I thought that was a really interesting element of the conversation.
Dave Bittner: Yeah. All right. Well, again, our thanks to Chris McLellan. He is from the Data Collaboration Alliance. We do appreciate him taking the time.
Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.