Caveat 2.10.22
Ep 112 | 2.10.22

Developer challenges in preserving privacy.

Transcript

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben has the story of the IRS backing off on their plans to require facial scans for ID. I have the story of the national security implications of the open data market. And later in the show, my conversation with Patricia Thaine. She's the CEO of Private AI. We're going to be discussing protecting consumer privacy in applications. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump in with some stories here. Why don't you kick things off for us? 

Ben Yelin: So I'm going to do something a little rare today and start with a personal anecdote. 

Dave Bittner: Ooh (laughter). 

Ben Yelin: But I swear it's going to relate to an actual news story. 

Dave Bittner: All right. 

Ben Yelin: So I got a little slip of paper in the mail probably a month ago telling me how much I got in child tax credits in the tax year of 2021. 

Dave Bittner: Yep. 

Ben Yelin: I have two kids, you know, so it was a nice little amount. Of course, nobody keeps those pieces of paper. And when it was time to actually start doing my taxes, I realized I probably recycled it, or it's in a giant pile. 

Dave Bittner: Right. 

Ben Yelin: So I decided to go to the IRS website, assuming, you know, if I just entered a little bit of personal information - it was probably secure - you know, they could give me that number, and I could put it in the right box in TurboTax or whatever. 

Dave Bittner: Yeah. 

Ben Yelin: It turns out, to get that information, you have to do a number of, I would say, rather insane things... 

Dave Bittner: (Laughter). 

Ben Yelin: ...To protect your own personal security. And one of those is to upload your face through a program called ID.me. That's actually the name of the company. So the way it works is you upload a photo of your driver's license, and then you take a short two- to three-second video on your smartphone and upload that. And using biometric data, they match your face in the recorded video with your driver's license photo. And that is the only way that you can access critical pieces of information that you might need to do your taxes. 
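
To make the verification flow Ben just described concrete, here is a minimal sketch of that kind of selfie-to-license biometric match, using the open-source face_recognition library. This is not ID.me's actual implementation - the file names and the tolerance value are illustrative assumptions.

```python
# A minimal sketch of a selfie-to-license face match, assuming the
# open-source face_recognition library (pip install face_recognition).
# This is NOT ID.me's implementation; file names and tolerance are
# illustrative.
import face_recognition

# Load the driver's license photo and one frame pulled from the video selfie
license_image = face_recognition.load_image_file("drivers_license.jpg")
selfie_image = face_recognition.load_image_file("selfie_frame.jpg")

# Compute a 128-dimensional embedding for the first face found in each image
license_encoding = face_recognition.face_encodings(license_image)[0]
selfie_encoding = face_recognition.face_encodings(selfie_image)[0]

# compare_faces returns [True] when the two embeddings fall within tolerance
match = face_recognition.compare_faces(
    [license_encoding], selfie_encoding, tolerance=0.6
)
print("identity verified" if match[0] else "verification failed")
```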

Ben Yelin: Seeing that this was the option, I decided to absolutely not do that because of the security concerns. 

Dave Bittner: Yeah. 

Ben Yelin: And I apparently was not the only one. So I saw an article in The Washington Post entitled "IRS Abandons Facial Recognition Plan After Firestorm of Criticism." So the IRS had bought millions of dollars' worth of technology from this company ID.me to do this personal verification through biometric data where somebody submits that video selfie and the driver's license to get access to their tax records. 

Ben Yelin: There was a huge backlash, including a letter from probably the most mentioned United States senator on our podcast, Ron Wyden, along with two dozen other members of Congress of both parties saying this is absurd. People should not need to submit biometric data in order to get basic information on their tax returns. You can balance trying to secure that data while also not forcing people to do something like upload a video selfie that potentially could present privacy concerns. 

Dave Bittner: Yeah. 

Ben Yelin: So the IRS has abandoned that policy. They're going to use different types of authentication going forward. We don't have full details on exactly what that's going to look like. They're going to phase out ID.me. And, you know, we don't really know what's going to happen with the millions of dollars we spent on the software, nor what's going to happen with the data of people who have already submitted their beautiful driver's license photos and video selfies. 

Ben Yelin: So I think this is kind of a cautionary tale. The technology seemed too good to resist. The IRS had proper motivation. They wanted to protect user data, wanted to make sure that the only person accessing your IRS data, your tax returns, is you yourself. But I think it seems like they're admitting that they went a step too far here. 

Dave Bittner: Yeah. I mean, a couple of things come to mind here. First of all, obviously, the IRS online website is a huge target for scammers. 

Ben Yelin: For good reason, yeah. There's a lot of very sensitive information in there. 

Dave Bittner: Right. And, of course, people want to try to access - you know, to get your refund. And it's funny. Joe Carrigan and I were talking about this recently on "Hacking Humans." We actually had a listener write in and send us a copy of a scam that they'd received - a scam attempt. And part of the request of the scammers was both a photo and a video of the person's face, just for this type of thing. 

Ben Yelin: Right. When you - when the scammers are doing it, you know that that information is valuable. And that's, I think, reflected in people who actually know what they're talking about on this issue. In this article from The Washington Post, somebody from the Surveillance Technology Oversight Project was quoted as saying, "when government agencies use this technology, it's a question of when, not if, this biometric data is hacked, leaked or misused." The data's so valuable and there are enough bad actors out there that there is sort of an inevitability behind it. 

Dave Bittner: Yeah. I mean, I guess I'll be interested to see what they come up with as their alternative, because I think it's reasonable, with something as important as your tax information, to require some kind of second factor here. But, yeah, it's interesting that people found that actual facial scans were a bridge too far. 

Ben Yelin: Yeah, it is always kind of interesting to see what that dividing line is going to be. I mean, we are willing to sacrifice a lot of personal privacy to get access to data. I'm willing to share my location if it tells me the - you know, where my closest Potbelly Sandwich Shop is... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Even though I know, you know, law enforcement, in many circumstances, can access that data. 

Dave Bittner: Yeah. 

Ben Yelin: Not to mention, you know, data brokers can buy it. But this seems like they - there's a line that the public and our representatives are just not willing to cross. 

Ben Yelin: Now, the interesting question becomes, what happens to all of the other clients of ID.me? At least ten federal agencies - including the Social Security Administration, the Department of Labor and Veterans Affairs - have purchased contracts with ID.me. And then many states, including the four biggest in the country - California, Florida, New York and Texas - have contracts with ID.me as well - not to mention, you know, hundreds of private businesses. 

Ben Yelin: So, you know, I think the - it's particularly egregious for the IRS because of the nature of the information the IRS has in its databases. It's very valuable financial information. So the risk is increased. And the IRS has been probably the top government target for hackers since hacking became a thing... 

Dave Bittner: Yeah. 

Ben Yelin: ...Just because of the valuable information there. So it'll be interesting to see whether this sort of starts a trend where ID.me's entire business model collapses 'cause this just seems like a step too far for people. 

Dave Bittner: Yeah, yeah. And, I mean, I don't know a whole lot about ID.me, so perhaps they have alternatives to just facial recognition that they can provide for people - providing this sort of identity as a service, which seems to be what they're doing - making it so that, you know, someone like a government agency doesn't have to build that in-house, that they can contract it out. I mean, that seems reasonable to me. 

Dave Bittner: But, you know, like you say, there's a lot of pushback here. I was actually surprised at how quickly the IRS kind of changed their tune on this. 

Ben Yelin: Yeah. Usually, they're not that responsive, you know... 

Dave Bittner: (Laughter). 

Ben Yelin: When we complain about other things to the IRS, they're not always... 

Dave Bittner: That's true. That's true (laughter). That is true. 

Ben Yelin: I will say, even beyond the security concerns here, there are some accessibility concerns. So they quoted somebody who lives in our neck of the woods, suburban Maryland, who was told that he had to use ID.me to access his tax records. And he has a disability in his arm, so he needed his daughter's help to run through the process. You know, there are people who don't have cameras on their computers or who don't have smartphones... 

Dave Bittner: Right. 

Ben Yelin: ...Or who don't have driver's licenses or other forms of government-issued IDs, but they still have to produce tax returns, meaning they still need access to IRS data. So even if you stripped all of the security concerns aside - which we are not doing - you still have those accessibility problems. 

Ben Yelin: So I just think there have to be better ways, whether that's, you know, standard multifactor authentication - obviously, you're going to need a smartphone for that - or, you know, other ways - last four digits of your Social Security number, you know. I think there have to be other methods that are just not as intrusive and not as valuable for potential cybercriminals. 

Dave Bittner: Yeah, yeah, absolutely. All right. Well, we will have a link to that story. Again, it's from The Washington Post. We'll link to that in the show notes. 

Dave Bittner: My story this week comes from the Lawfare blog. This is written by Justin Sherman. He is a fellow at the Atlantic Council's Cyber Statecraft Initiative. It's titled "The Open Data Market and Risks to National Security." And in this article, Justin Sherman really makes the case that the data broker market is wide open, basically unregulated. 

Dave Bittner: He points out there are a couple of things that are regulated. There's HIPAA. There's FERPA, which is the Family Educational Rights and Privacy Act. But other than those two things, it's really the Wild West out there when it comes to data gathering, selling data, aggregating data. One of the things that he points out in this article that I think is also noteworthy is this whole notion of inferred data... 

Ben Yelin: Right. 

Dave Bittner: ...Which you and I have talked about before, where, you know, if I have location pings for someone's phone, and at night that phone is at my home address and during the day it's at my work address, it's pretty easy to figure out who that is, right? 

Ben Yelin: Yeah, I think I found Dave Bittner if I have those two pieces of information. 

Dave Bittner: Right. Right. 

Ben Yelin: Yeah. 
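
To make the inference Dave is describing concrete, here is a hypothetical sketch - the pings and coordinates below are invented for illustration - showing how "anonymous" location data resolves to a likely home and workplace:

```python
# Hypothetical sketch: infer home and work from "anonymous" location pings
# by looking at where a device sits overnight versus during business hours.
# All timestamps and coordinates are made up for illustration.
from collections import Counter
from datetime import datetime

pings = [
    # (ISO timestamp, rounded lat/lon of the ping)
    ("2022-02-01T02:14:00", (39.2904, -76.6122)),  # overnight
    ("2022-02-01T03:40:00", (39.2904, -76.6122)),
    ("2022-02-01T10:05:00", (39.2880, -76.5901)),  # business hours
    ("2022-02-01T14:30:00", (39.2880, -76.5901)),
    ("2022-02-02T01:55:00", (39.2904, -76.6122)),
    ("2022-02-02T11:20:00", (39.2880, -76.5901)),
]

def most_common_location(records, hours):
    """Return the spot the device occupies most often during the given hours."""
    counts = Counter(loc for ts, loc in records
                     if datetime.fromisoformat(ts).hour in hours)
    return counts.most_common(1)[0][0] if counts else None

home = most_common_location(pings, hours=range(0, 6))   # midnight to 6 a.m.
work = most_common_location(pings, hours=range(9, 17))  # 9 a.m. to 5 p.m.

# Join either coordinate against a public address directory, and the
# "anonymous" device ID resolves to a named individual.
print("likely home:", home)
print("likely work:", work)
```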

Dave Bittner: But the case that is being made here is that there are actually national security issues here. They point out the cases that got some press about folks in the military and in the intelligence community who, through things like their Fitbits or their smartphones, were tracked to locations that were supposed to be secret. It's an interesting argument here. What do you make of it, Ben? 

Ben Yelin: Yeah, I think there are national security implications. And we just don't have proper enforcement mechanisms without a federal statute that deals with data privacy. And I think a point he makes in this article is that this puts us at a competitive disadvantage with some of our closest competitors. 

Ben Yelin: When you have countries - they mention Israel and Brazil here - that actually have data privacy regulations, you know, it makes us particularly vulnerable to the Chinas and the Russias of the world if we don't have this system to protect our data. And that certainly makes us vulnerable from a national security perspective. 

Ben Yelin: We do have some agencies that have a role in, you know, trying to curtail the worst national security risks inherent in data brokerage. So they talk about the Committee on Foreign Investment in the United States, which screens foreign investments in U.S. companies for national security risks. And the dating app Grindr had been acquired by a Chinese firm, and the committee ended up forcing the firm to sell the dating app back to a U.S. company because, as you might expect, there's some very sensitive data being held. 

Dave Bittner: Right. 

Ben Yelin: It's great that this governing body exists. But they're limited in scope, and they're just limited in capabilities. You know, the amount of data that's collected and aggregated by our foreign adversaries that could be used against us in any number of ways that could jeopardize our national security greatly exceeds our organizational capability, or our national capability, to protect that information. So I think it really does present a significant risk. 

Ben Yelin: I think we generally talk about these issues mostly in the domestic context. So when data is being sold to a private firm, you know, how is that firm going to exploit that data in a way that might inhibit your privacy protections? Or is that data going to be sold to a government or law enforcement agency that might be used to arrest you and prosecute you? I think this is a really interesting angle here because a foreign adversary having sensitive information on people in the United States certainly presents major national security risks. 

Dave Bittner: I wonder what legislation like this would look like. I mean, I suppose we could look to some of the other nations who've done this - obviously, GDPR is the global big stick that's out there. But I guess what I wonder is, are we at the point where these data brokers are such big business that they have been successful at lobbying, at capturing this market, where it's going to be hard to dial them back without shutting them down? 

Ben Yelin: Yeah. I mean, you know, you could do things like ban law enforcement agencies from collecting location data without a warrant. That's been done to a certain extent in the Carpenter decision, but not in all circumstances, particularly when we're talking about shorter-term surveillance. You can institute all different types of federal privacy regulations. But, you know, until we actually enact those, I think we're really in a relatively nebulous area. 

Ben Yelin: There are certain state data privacy regulations, as we've talked about. We have CCPA. We now have this new law in Virginia, so there's sort of a patchwork. And then there are certain federal statutes where covered entities, like in HIPAA, have to protect that data. But the practice is just not very widespread at this point. And until we get there through data privacy legislation, I just don't think we're going to move forward. 

Ben Yelin: And you're right that there is a strong lobby against this. And it's not just big moneyed interests. I mean, we've talked about rank-and-file individuals, people who don't have a financial stake in this, really enjoy the benefits of sharing their data, having their data sold to data brokers. We get personalized ads on Facebook. You know, we get restaurant recommendations based on our location. These are benefits that we receive. 

Ben Yelin: You know, most people aren't aware that the cost of obtaining these benefits is sharing your sensitive data with a private company. But I do think there would be some pushback not just from Big Tech companies, but from consumers if you had the type of strong federal data privacy regulation that's at issue here, that's recommended in this article. 

Dave Bittner: Yeah. Another thing that caught my eye in this article is that they make the case that the executive branch could use the Federal Trade Commission as the enforcer of this sort of thing - although I guess that's part of the issue here: people aren't exactly clear who would be the best lead agency. 

Dave Bittner: You've got representatives Jamie Raskin, Katie Porter and evidently 42 other - of their colleagues have signed on to a letter to the FTC, and they outlined a few things that they want to see happen. The three they list here are they want to define the sale, transfer, use or purchase of precise location data collected by an app for purposes other than the essential function of the app - they want to label that as an unfair act or practice. That seems right to me that if you don't need to collect a piece of data for the actual purpose of the app, then just vacuuming up everybody's data because you convinced someone to click on a EULA, that's out of - or should be out of bounds. 

Ben Yelin: Right. 

Dave Bittner: The second one is, define app developers' mislabeling of users' location as anonymous as a deceptive practice. You and I have talked many times earlier in this segment about how easy it is to deanonymize location data, right? 

Ben Yelin: Right - anonymous definitely in scare quotes. 

Dave Bittner: Yeah. And then the third one is to enforce its regulations against companies abusing consumers' location data through its penalty authority - so the FTC has the ability to penalize offenders. You know, it'd be interesting to see if this gains any traction, but I find it noteworthy that, as the case they're making in this article goes, the president doesn't necessarily have to wait to get something through Congress to move the needle on this. 

Ben Yelin: Yeah. Now, that's good. One thing that's mentioned in this article that I think is a really important point is it's not clear that these types of regulations would fall within FTC's jurisdiction. So when we've seen previous enforcement actions against data brokers, these were usually against individuals or companies who were engaging in some type of scam. So, you know, using false pretenses, they were able to collect information, whether that was a phishing email or anything else. That's very clearly an unfair or deceptive business act or practice, which would fall under the Federal Trade Commission's authority. 

Ben Yelin: It's less clear whether that would be the case if we're talking about a company, without, you know, trying to scam you, selling your data to a data broker. So I think Congress could step in and clarify that such regulations, like the ones mentioned in this Raskin and Porter letter, would fall under the FTC's authority. So even though their writing a letter is helpful and, you know, might spur the FTC to action, I think to be on really solid statutory ground, it would be wise for our members of Congress to pass a bill clarifying that the FTC has the power to institute these regulations. 

Dave Bittner: Well, I'm sure it'll happen quickly, Ben, because of how highly functional the U.S. Congress is at this moment. 

Ben Yelin: Yeah, they'll probably pass it in the next 24 hours. 

Dave Bittner: (Laughter). 

Ben Yelin: Once you have a good idea, you know, bam - committee hearing - done. 

Dave Bittner: You know what? If you and I call our congresspeople and just make the request, it'll make its way through just quick as can be, right? 

Ben Yelin: I will say, they do listen to phone calls. So if this is something you really care about - now, will they listen to one individual phone call? Maybe not. But if, you know, hundreds of people call about the same issue, that's the one thing that might actually move members of Congress. The problem is, besides you and me, you know, and our listeners, there aren't as many people who are passionate about this as they are about, you know, pick your sexy issue of the day. So... 

Dave Bittner: Right, right (laughter). 

Ben Yelin: But that's the one way to get in your member of Congress' ear. 

Dave Bittner: All right. Well, it is an interesting read, for sure. Again, that's over on the Lawfare blog. We will have a link to that in the show notes. 

Dave Bittner: We would love to hear from you. If you have a story you'd like for us to cover or a question for me or for Ben, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Patricia Thaine. She is the CEO of a company called Private AI, and they work on using artificial intelligence to protect consumer privacy and help developers of apps do just that - so interesting conversation about consumer privacy. Here's Patricia Thaine. 

Patricia Thaine: No. 1 is not understanding what it means to protect consumer privacy in the very first place, then what it means from a legal perspective, not just from a conceptual understanding, and then trying to figure out, given this, how do we still do what we need to do with the data? How do we still capture what we need to capture? And what kind of tools are available to help us out? So it's a very complex place for people to be playing in and for developers who have had no background in privacy in their education to start diving into it fresh. 

Dave Bittner: Yeah. I mean, I'm thinking about all of the regulations we have around the world and in what is truly a global economy online these days. I mean, you think about California's regulations and GDPR. I suspect this is something that is challenging, particularly, as you say, for a newcomer to navigate. 

Patricia Thaine: Exactly. So we are seeing that mainly managers are the ones who are in charge of deciding whether or not something needs to be privacy-preserving, and how to go about it, what kind of tools to use. But more and more, developers are becoming aware of what it takes to be privacy-preserving. And more and more, it's going into their hands. So it's a pretty interesting future that lies ahead for us. 

Dave Bittner: And when you say privacy-preserving, how do you define that? What goes into achieving that? 

Patricia Thaine: Great question. So it's about a few things. Really core to it is consent. So privacy is really about consent - giving the user the ability to choose what happens with their data, whether or not they can delete it, whether or not they can request for it to be de-identified or anonymized or redacted. It's also about telling the users where the data's going, are you reselling the data, what kind of processes you're using their data for. Is it actually central to the service that they signed up for? 

Patricia Thaine: And then another aspect of it that comes up very often in regulations is data minimization. So are you really - do you really need the data that you're collecting, or are you just collecting a whole bunch of data that is then going to be a minefield of toxic waste if you get hacked? And those two are pretty prevalent across the world in the data regulations that are popping up. 

Dave Bittner: You know, it strikes me as you describe those things that it seems like kind of an ideal list of something to aspire to for companies. But I think there's a lot of folks out there, particularly here in the United States, who feel a little cynical about that. You know, we see so many stories about our data being shared and collected and, you know, EULAs that are too long to read or make sense of and apps vacuuming up as much as they can. Is there a sense that we're in the midst of a transition there, that there's a recognition that we need to make some changes and that more regulation is coming? 

Patricia Thaine: More regulation is definitely coming year over year - adaptations to current regulations as well. And, yeah, there is cynicism. There was cynicism about whether or not we can get technology to catch up to the law in this case. Usually we're criticizing the law for not being able to catch up to technology. Now it's the other way around. And in the span of only a few years, we saw some really exciting things happen. 

Patricia Thaine: For example, companies that had data that were complete messes had to figure out how to organize them. So new innovations started popping up around how to categorize data. No. 1 issue - what do you have in the troves of data that you collected? And then new companies started popping up around privacy-preserving machine learning, including for federated learning - some really cool open-source projects around there as well - and new ones about identifying what kind of personal information is associated with a specific individual, because that's a core requirement of data protection regulations like the GDPR. 

Patricia Thaine: So it is complicated. It's getting less and less complicated as the years go on. And it'll become easier for companies to just be able to dip their toe in the water - or their whole leg, if that's what they choose to do - right off the bat with all these innovations that are coming forward. 

Dave Bittner: And so the folks that you work with in terms of, you know, being able to engage with a company like yours to help them along this journey, how does that interaction work? What are the types of things that they're looking to outsource? 

Patricia Thaine: Yeah, it's a great question. So we started Private AI because we noticed there was this pretty big gap where it was difficult for developers to integrate privacy into their pipeline. We were specifically thinking of it in terms of NLP teams or teams using AI for unstructured data because the data is really messy. 

Patricia Thaine: And we were talking to a bunch of folks. We noticed that they were either building tools like ours internally - which is about data minimization, redacting personally identifiable information from text, images, video and audio - or using third-party services that would not get the accuracy that they wanted, or that would also claim rights to their data and then use it for training their own models, which, you know, if you're sharing personal information in the first place, you don't want happening. 

Patricia Thaine: So we really let users plug in this system that allows them to identify personal information, including direct identifiers and quasi-identifiers, for chatbots, for customer service tech, for being able to discover what is in your database in the really messy, unstructured data because a lot of the tools out there are really for structured data. 
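
As a rough illustration of the redaction step Patricia is describing - this is not Private AI's API; the patterns and placeholder labels are simplified assumptions, and a production system would use trained models rather than regular expressions:

```python
# A toy sketch of PII redaction over unstructured text. The patterns and
# labels are illustrative; real systems use trained models, not regexes.
import re

PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each detected identifier with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

chat = "Hi, I'm locked out. Call me at 410-555-0123 or mail jo@example.com."
print(redact(chat))
# -> Hi, I'm locked out. Call me at [PHONE] or mail [EMAIL].
```

Even this toy version shows the design choice: swapping identifiers for typed placeholders keeps the text usable for training while minimizing the personal data retained.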

Patricia Thaine: If you look at what happened, for example, at Scatter Lab, which is a Korean chatbot company - they had been collecting billions of chat records that contained sensitive information between lovers. And then they weren't doing anything to protect the privacy of this data as it was being used to train their machine learning models, and the chatbots were spewing out names, usernames, exact addresses of their users. So a solution like ours really helps AI companies avoid that kind of nightmare scenario. 

Dave Bittner: You know, it's something I hadn't really thought about. And, boy, it really brings it into focus when you talk about, you know, the chat conversations between lovers. I was thinking even, for example, I can imagine somebody in a chatbot session with a service provider saying, oh, goodness, you know, I sure hope that I get this up and running. My mother's health has been terrible. She's 78 years old and, you know, she doesn't have much longer, so I hope we can get this going. I mean, that is full of personal information that I suspect the companies don't want to be hanging on to. 

Patricia Thaine: Exactly. And a lot of the time, companies will say, hey, don't include your personal information in this chat or in this phone call, but people still will, right? They're talking to another human. One thing leads to another, and you're talking about your stay at the hospital the other day. And that's something that is toxic for, you know, financial institutions or any organization that's not dealing with health care information and doesn't want to know. 

Dave Bittner: You know, I think organizations are more and more used to engaging with other companies who are providing things as a service. And it seems to me like, you know, this is falling right into that - the types of things that you all are providing here. If I'm a developer, I don't have to get bogged down in the privacy elements of the things that I'm developing. Do you suppose that's the shape of things to come, that more and more we're going to see a kind of a granularity here of people relying on third parties to make sure that things like this are properly taken care of? 

Patricia Thaine: I sure hope so, because privacy isn't easy. And that has been the case for security for a while. People say, don't build your own crypto - and it's perfectly valid. Unless you're a team of expert cryptographers, you should not be building your own cryptography. These are very complicated problems, similar to cryptography, where it takes a lot of knowledge, a lot of specialization and a lot of understanding of where the corner cases are in order to build something that's robust enough to be compliant with the regulations. 

Patricia Thaine: For now, teams that are building it themselves might get a pass, but if there is a hack, people are going to know what kind of information they were actually collecting and what they were missing with their systems. And there's also the fact that making this happen through a third party, rather than something that you build internally, tends to be a lot more trustworthy as well. You've got that third-party stamp of approval. 

Dave Bittner: Can it go both ways, though? I mean, I could imagine that, you know, it's great to have a third party to be able to say that, you know, our third party is meeting this standard. But on the other side, I could imagine you probably have customers who come to you and say, hey, this is great. Now prove it. 

Patricia Thaine: So it's happening less and less often. But it's still the case that people will want to compare our system to their internal solution or to AWS Comprehend or, you know, to a number of systems. And the way that we deal with that is, one, we have a toolkit that they can just use to compare our system to whatever else is on the market. 

Patricia Thaine: But it's really about keeping track of the regulations. Make sure that you're keeping track of, you know, what counts as personal information across the world in these various regions. What does this particular team want? And this requires quite a bit of privacy expertise that a lot of companies are not equipped to handle themselves. 

Patricia Thaine: So what we find is you'll have internal systems that are built to pick up the basics in pretty basic scenarios, like names or, you know, in some cases, regular expressions for credit card numbers. But as soon as you get into something more complicated, like my credit card number is 532 - no, wait a second; that's 23 - anything related to natural language gets pretty dicey as soon as you don't have something more complex and robust in place. 
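
Patricia's dictation example is easy to demonstrate. In this toy sketch - the pattern is a deliberately simplified assumption, not any vendor's actual rule - a regular expression catches a cleanly formatted card number but misses the same disclosure once a spoken correction breaks the pattern:

```python
# Toy demonstration: a regex finds a well-formatted card number but misses
# the same number dictated with a mid-sentence correction. The pattern is
# deliberately simplified for illustration.
import re

CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

clean = "My credit card number is 4111 1111 1111 1111."
spoken = "My credit card number is 411 - no, wait, that's 41 - 1111 1111 1111."

print(bool(CARD.search(clean)))   # True: the formatted number is caught
print(bool(CARD.search(spoken)))  # False: the correction breaks the pattern
```

That gap between neatly formatted fields and conversational language is exactly where model-based detection earns its keep.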

Dave Bittner: I see. Do you suppose we're headed towards a time when this is considered a competitive advantage, that companies are going to be able to raise that flag and say, hey, we're a privacy-first organization, and these are the things we're putting in place to protect you and your information? 

Patricia Thaine: Yeah, that's a great question, Dave. I mean, a lot of privacy-first organizations are showing quite a bit of growth - for example, Brave, DuckDuckGo and so on - and it's really the privacy that's putting them forward. 

Patricia Thaine: Is this going to be the case across the board? So what we're seeing in a lot of businesses is that their customers will actually not give them access to data unless they do something like this that's robust to protect the data. And so the competitive advantage might come from that stamp of approval, that more public stamp of approval, but it might also come just from the access to more data to train your models, for example. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Really interesting interview. I think, you know, it kind of goes into what we were talking about with the last segment. Some of the coolest features of applications are the very things that might jeopardize consumer privacy. So I think it's becoming part of the culture of these organizations to really weigh those values - you know, whether you want to have incredibly cool capabilities that allow you to have a killer app that's going to get lots of downloads on the app store, or you want to focus on protecting consumer data. Those interests aren't always in conflict, but sometimes they are. So that's, you know, something that stood out to me. 

Dave Bittner: Yeah. I can also see the real attraction from a developer's point of view of being able to outsource this particular part of your app. You know, and we see this with people using open-source software. And, I mean, it's my impression that no apps are written from whole cloth anymore. You take a little from here, a little from there, you know, because why reinvent the wheel? 

Dave Bittner: But in this case, it seems like, you know, from a regulatory, privacy, convenience and safety point of view, if you've got someone else who's worrying about that, that's a pretty good value proposition if I'm an app developer. 

Ben Yelin: Absolutely, especially someone who has experience working with a bunch of different app developers, institutions, et cetera. 

Dave Bittner: Right, right. All right. Well, again, our thanks to Patricia Thaine from Private AI for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.