Caveat 6.9.21
Ep 81 | 6.9.21

Data privacy laws are failing.

Transcript

Rita Garry: The key to federal legislation has to be to remove the burdens from the consumer in terms of enforcement and to set parameters around businesses to respect people's privacy.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at new legislation restricting police use of DNA. I've got the story of a proposed algorithm transparency bill. And later in the show, my conversation with Rita Garry from Robbins, Salomon & Patt on how data privacy laws are failing U.S. consumers and businesses. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Let's jump into some stories this week. Why don't you start things off for us?

Ben Yelin: So this is an issue we've discussed a few times in the past. It's the law enforcement use of DNA genealogy. And there are a couple of new state laws that have been enacted in the past few weeks relating to this issue, one of them here in our home state of Maryland. So my article comes from The New York Times - "Two New Laws Restrict Police Use of DNA Search Method." The other state is Montana, which I think is notable because we kind of have two ideologically opposite states... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...Enacting pretty similar laws here. 

Dave Bittner: Yeah, interesting. 

Ben Yelin: Yeah. So what will these laws do? In the case of both Maryland and Montana, a court will have to sign off on law enforcement using this DNA database, using - you know, putting these thousands of DNA markers from a crime scene into one of these genealogy websites to try and find a hit. That is going to require court approval starting on October 1 here in Maryland and then in the near future in Montana. 

Ben Yelin: The Maryland bill goes a little bit further and has a couple of other interesting provisions. One of them is that the state and its law enforcement agencies can only use genealogy companies that properly warn their consumers that law enforcement may use this data as part of criminal investigations. 

Ben Yelin: So a couple of these companies - really, like the high-profile ones, 23andMe and Ancestry - they have the largest databases. And their company policy is to require a search warrant, some sort of judicial authorization, before granting access to their databases. But some of the smaller companies - they mention GEDmatch and Family Tree DNA here - they are happy to cooperate with law enforcement. They routinely cooperate with law enforcement.

Ben Yelin: And, well, the customer does have the choice as to whether - you know, there's a little box to check as to whether you're willing to allow law enforcement access to the data that you share. The default is that they are able to share that data. And the fine print that these companies use isn't really clear enough to your average consumer that that is their intention. So it's possible that these two companies could just stop doing business in the state of Maryland because of the burdens of this regulation. 

Dave Bittner: Yeah, one of the things that strikes me in this article is that it addresses this notion of deceptive stories that law enforcement can use to get someone to submit DNA evidence. 

Ben Yelin: Yes. So they talked about an incident in Florida in 2018 where a woman was asked to provide a DNA test. They told her that they thought she was related to a dead person they were trying to identify, but it turns out they were investigating her son for a crime, and he was arrested and charged with murder. Sometimes they're testing little discarded items that go into the trash to pick up DNA. So they're not exactly being transparent in their use of DNA evidence and the use of these DNA databases. 

Ben Yelin: I will try and be a little devil's advocate here from the perspective of law enforcement. This is an incredibly effective investigative technique. There is the serial killer in my native state of California who was unidentified for something like 40 years, committed a series of gruesome murders in the 1970s, was dubbed the Golden State Killer. They uploaded some DNA from the crime scene into some of these databases, found some distant cousins and tracked down an individual who was subsequently arrested and charged. 

Ben Yelin: So this is, you know, something that, potentially, in many circumstances is going to be very helpful to law enforcement in catching, you know, some of the worst bad guys out there. But it is intrusive. And I think one of the reasons it's particularly intrusive is the people who submit the data to these companies, who innocently want to learn more about their ancestry, have no idea that they're aiding in potential criminal investigations of their own relatives. And that just kind of feels wrong. 

Dave Bittner: Yeah. It is a funny thing. I mean, imagine if - you know, if I could take your fingerprints, and somehow your fingerprints indicated your relationship to other family members (laughter) 'cause that's basically what happens with DNA. You can connect the dots, connect that web of genetic information. 

Ben Yelin: Yeah. And, you know, in the Golden State Killer's case, you have a very limited strand of DNA from a crime scene 40 years ago. You upload it to the database. And a lot of people have used these databases. There's a lot of data in there, largely because many people are interested in their own genealogy, family history. I don't begrudge them; I'm interested in mine, although I have to say I'd probably use one of the companies that would require a warrant (laughter) from law enforcement. 

Dave Bittner: Yeah. 

Ben Yelin: But there's a lot of data in there, and you can draw a lot of different connections. Even if a close relative hasn't submitted their data, the odds are that second and third cousins of any alleged killer or criminal in the United States have data in these databases. So, you know, it's much more expansive than the national database they talk about here, CODIS, which is just much more limited. That is a national database of DNA, but it only collects DNA from known arrested criminals or from crime scenes. So it doesn't have kind of the robust, broad data that you get from these genealogy sites. You know, this law enforcement technique certainly comes with privacy concerns.

Ben Yelin: The Maryland effort was actually spearheaded by a professor at the University of Maryland School of Law, my institution. And this is an issue that this professor, Natalie Ram, has focused on for a long time. This legislation had been introduced in the past. Finally, it received a vote this year, and the governor allowed it to go into law without his signature. So it might create some complications for law enforcement agencies in Maryland. It does remove a potential investigative tool. But I think it's all about balancing what you get from the use of those databases versus those alleged privacy invasions.

Dave Bittner: Yeah, and that was my next question for you is, from a practical point of view, what do you think this does? Is this a speed bump in law enforcement's journey to try to get this DNA evidence? Is it a little bit of oversight? What does it really mean? 

Ben Yelin: So it's kind of all of the above. There is this layer of oversight when you're requiring a court order. It means that law enforcement itself, without, you know, obtaining any sort of court approval, can't just go into these databases and start drawing connections to alleged criminals. And so that's a major inhibition on law enforcement and might slow down some investigative work. That said, it's not that hard to obtain judicial authorization if you already have enough evidence.

Ben Yelin: Where it's useful for law enforcement is when they're working on a hunch, maybe not enough evidence to even get a court order, but if you plug it into one of these large DNA databases, you might get lucky and get a hit. 

Ben Yelin: The other provision that I think is going to be an inhibition on law enforcement is they are requiring that the only people who conduct these searches be trained genealogists. Genealogists working on these cases must be professionally certified by 2024. That is not a credential that currently exists. 

Dave Bittner: (Laughter). 

Ben Yelin: So we're going to have to develop a way for law enforcement to get that credential before that requirement goes into effect in 2024. So that's going to be pretty limiting.

Dave Bittner: Yeah. 

Ben Yelin: My guess is that there are a lot of law enforcement agencies, law enforcement officials, who think this is an unnecessarily burdensome law - we saw a couple of them quoted in this article - and the same can be said for the Montana law. But I'll note that the governor of Maryland, Larry Hogan, certainly had the opportunity to veto this bill. He didn't. And... 

Dave Bittner: And he does not hold back on his veto power (laughter). 

Ben Yelin: Absolutely not. He loves to use that veto pen. 

Dave Bittner: Right. 

Ben Yelin: He's used it in a number of circumstances. 

Dave Bittner: Right. 

Ben Yelin: So he certainly is aware that it exists. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: He is not a rubber stamp on pieces of legislation in Maryland. 

Dave Bittner: Yeah. 

Ben Yelin: And I'm guessing that this was a comprehensive effort going back a couple of years. The Legislature has held hearings with privacy experts, law enforcement agents, and they came up with what I think is a pretty equitable agreement, one that really balances the need to protect the privacy of the people who use these services versus keeping this tool available for - to law enforcement when they need it. And it may not have as big of a practical change as one might expect, considering that the two biggest companies, the ones that you've probably heard of, already require court orders before releasing that information. 

Dave Bittner: All right, well, that is from The New York Times. We'll have a link to that in the show notes - again, the article written by Virginia Hughes. 

Dave Bittner: My story this week is about some legislation that has been introduced. It is described as an algorithm transparency bill. This particular story comes from a website called nexttv.com, which is a broadcasting and cable website, but they happen to be covering this. And this is about Senator Ed Markey, a Democrat from Massachusetts, and Representative Doris Matsui, a Democrat from California. They have introduced a bill to boost transparency and to try to unveil what's going on with some of these algorithmic systems that these social media companies are working on. They're calling this the Algorithmic Justice and Online Platform Transparency Act of 2021. And...

Ben Yelin: AJOPT, if you will. 

Dave Bittner: (Laughter) Not a good one. Not a good one. No, no. 

Ben Yelin: Not good in the acronym game. 

Dave Bittner: And they say that it's going to help prevent algorithms from excluding certain people from seeing online advertising - racial minorities from seeing housing ads, for example, or people of a particular gender from seeing job ads - something that we've covered here, where folks have found that, for example, Facebook was excluding certain groups from seeing certain ads, which you're not supposed to do.

Ben Yelin: No, that would be discrimination. Yeah. 

Dave Bittner: (Laughter) So we'll go through some of the details here from Senator Markey's office. They want to prohibit algorithmic processes on online platforms that discriminate on the basis of race, age, gender, ability and other protected characteristics. That seems pretty straightforward. They want to establish a safety and effectiveness standard for algorithms. They want to require online platforms to describe to users - in plain language (laughter) - the type of algorithms they're using. 

Ben Yelin: I'm sure the users will read that extremely carefully and understand it entirely.

Dave Bittner: Right. They'll just - yeah, they'll just tack it onto the end of the EULA. That would be really effective. They want to require online platforms to maintain detailed records describing their process. This part I thought was interesting - they need to make it available for review by the Federal Trade Commission. That's actually interesting. 

Ben Yelin: This is like the Dave Bittner proposal... 

Dave Bittner: (Laughter). 

Ben Yelin: ...To have an FDA for algorithms... 

Dave Bittner: Right? 

Ben Yelin: ...Where in order for an algorithm to be approved for use in the United States, it has to go in front of some sort of federal agency. So I don't think it would be quite as strong as the ideal Bittner proposal. Their... 

Dave Bittner: How could it be? (Laughter). 

Ben Yelin: How could it be? Yeah, I mean, we can't live up to your namesake proposal here. But it would task the FTC with ensuring that these algorithms comply with privacy and de-identification standards. And that's a big deal. So it would require a layer of oversight on these companies. 

Dave Bittner: Yeah. It'll require them to publish an annual public report detailing their content moderation practices. And it'll create an interagency task force with folks from the FTC, from the Department of Ed, from Housing and Urban Development, Commerce, Justice. And they will all investigate the discriminatory algorithmic processes employed in sectors across the economy. So there's something for everyone here, Ben. 

Ben Yelin: Yeah. One thing that stuck out to me is this fifth provision requiring online platforms to publish annual public reports detailing their content moderation practices. That might be a gift to potential Republicans if they want to sign on to this bill because Republicans in Congress have complained about the content moderation practices of these platforms, alleging that they're biased against conservatives. So that might be a little carrot that Senator Markey and Representative Matsui are offering Republicans, saying if you can join us in reviewing these algorithms, making sure that they are not racially discriminatory, they're not discriminatory in other ways, that they're using best practices for keeping user data anonymous, then you'll get your annual report on how these companies moderate content on their platforms. So I think it's kind of an olive branch here for potential bipartisanship. 

Dave Bittner: Yeah, interesting. I mean, I guess to me, the main thing that this hopes to achieve is trying to shine some light on what's going on in these black box algorithms that the tech companies, up till now, have really not provided any transparency on. 

Ben Yelin: They haven't. And what we found out about algorithms has come through investigative journalism, leaks, et cetera. There isn't really a federal agency right now that does a comprehensive review of these algorithms. And yet these algorithms are extremely powerful. I mean, it's certainly an area of public concern, not just, you know, because they might be leading a generation of our youth into harmful content when they, you know, go to YouTube and end up in a rabbit hole of watching, like, white supremacy propaganda. That's bad, obviously. 

Dave Bittner: Right, right. 

Ben Yelin: But also, you know, it could potentially violate federal law if it is discriminatory, violating civil rights statutes. If Facebook's algorithm prevents housing listings from being shown to certain racial minorities, that is a form of illicit discrimination. And that's something that Congress has given itself the power to ameliorate. So I think that level of transparency is extremely important. We don't have it now.

Ben Yelin: You know, I think every reasonable person would agree that these algorithms have become enormously important, not just, you know, in the online world, but in terms of, like, the information that people are getting. Most people get their news from Facebook. That shapes our worldviews. You know, so these algorithms are extremely important as a public policy matter. So I think this effort at transparency is overdue.

Dave Bittner: So this is being proposed by two Democratic lawmakers. But my sense is that this sort of thing has a lot of bipartisan support. You mentioned earlier that they might be offering up one of these elements to seduce their Republican friends across the aisle to perhaps support this. What do you think going forward? I mean, is this something - this is the kind of thing that you think could have broad support?

Ben Yelin: The devil is always in the details, and with this type of bill, everybody can find one thing that they find objectionable, and that can kill the whole thing. You know about my skepticism on whether Congress can pass anything.

Dave Bittner: (Laughter) Right. Not unfounded skepticism - it's evidence-based skepticism, right? 

Ben Yelin: It is evidence-based skepticism. Yeah, I'm not pulling it out of thin air. 

Dave Bittner: Yeah. 

Ben Yelin: And with a packed legislative agenda, are they going to have time in this legislative session to consider a bill like this? It might get a committee hearing or two. I wouldn't foresee it passing in the immediate future. This, like a couple of other pieces of legislation we've discussed, is sort of laying out policy principles as a basis for discussion that might... 

Dave Bittner: Right. 

Ben Yelin: ...Lead to some sort of future regulation. But I think this is the opening bid from Senator Markey and Representative Matsui on, we have this problem; we have to address it somehow. Here's our proposal as to how we would address it, and let's use this as a jumping-off point to hold some committee hearings, get some ideas. Maybe even without passing this piece of legislation, just by holding hearings, you can get some information on the discriminatory practices of some of these tech companies as it relates to their algorithms. And that might have a public policy benefit in and of itself. 

Ben Yelin: So would I go on a political betting website and bet for this legislation passing? No, I would not. But I don't want to diminish its - just the fact that it's been introduced. I don't want to diminish its importance. 

Dave Bittner: And I suppose, I mean, it's signaling to those tech companies, as well, saying, you know, hey, we can do this the easy way or the hard way. Right? 

Ben Yelin: Yeah, exactly. You should be transparent; otherwise, we will make you be transparent. And that kind of tactic works. I mean, Facebook has been open and willing to offer more transparency into its own use of algorithms, probably because they understand this pressure coming from public policy makers at both the state and federal level. And, you know, I think we could see that with other companies as well, that they might be willing to play ball if this sword is hanging over their head - legislation that would not only shine a light on their algorithms but subject those algorithms to federal regulation.

Dave Bittner: Yeah. All right. Well, again, that article is over on the Next TV website written by John Eggerton. We'll have a link to that in the show notes. 

Dave Bittner: We would love to hear from you. If you have a question for us, you can call in. Our number is 410-618-3720. Or you can send us email. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Rita Garry. She's from the law firm Robbins, Salomon & Patt. And our discussion centered on the notion of data privacy laws and how they might be failing U.S. consumers and businesses. Here's my conversation with Rita Garry. 

Rita Garry: Currently, I think that we're on the precipice of finally getting congressional consolidation around the idea that consumers alone can't be burdened with all the enforcement of their privacy rights on the legislative side. And on the consumer side, I think we're seeing an awakening with an understanding of the importance of their personal information protection. And the difficulty is matching the two in a way that makes sense for a digital society and economy. 

Dave Bittner: Yeah. I mean, you know, right now, we have sort of this patchwork of legislation on a state-by-state level, and I suppose there's been the notion that, you know, larger states like California could lead the way and sort of, you know, set the standard. How has that played out so far? Are we seeing that happen? And is that an effective way to go at things? 

Rita Garry: We're seeing it happen, that's for sure. But it's taken its time. Virginia is the only other state that has actually enacted data protection around personal information privacy. However, they diverged significantly from California on the crucial piece, which is the private right of action. So what we're seeing is a whole slew of states attempting to introduce bills or introducing bills but not necessarily passing them because they can't seem to get over the push-pull between wanting to protect consumer privacy but at the same time not wanting to unduly burden businesses and expose them to risk for litigation.

Rita Garry: So what I think we'll see in the end here is a compromise in Congress - which I predicted would happen several years ago but is now just beginning to take shape. And we've already got five pieces of federal legislation that have been pushed out, but none enacted yet. But there's a growing consensus that the key to federal legislation has to be to remove the burdens from the consumer in terms of enforcement and to set parameters around businesses to respect people's privacy and use it within the context in which it's given.

Dave Bittner: Can you unpack that notion of the private right of action for us? I mean, why does that matter? How does it affect consumers directly? And what are the alternatives they're faced with if they don't have that?

Rita Garry: The private right of action is associated with individuals having the right to seek redress in the courts of law for violations, be they statutory or be they related to actual harm that's caused by a company misusing their personal information. And it is part and parcel because we're talking about a privacy right. And there's lots of historical precedent for, you know, it's my right, so I should be able to enforce it.

Rita Garry: The flip side of this - avoiding the private right of action, which is a big piece in the bucket here - is that you've pushed enforcement down onto state attorneys general's offices, which may or may not have the appropriate resources to pursue these things. So the burden is on me and you as consumers sharing our personal information to protect our personal information. But the complexity of that shouldn't depend on what state you live in, because the digital economy doesn't respect state boundaries. And why should the rights of a California citizen be different than the rights of somebody else in a different state?

Rita Garry: So the alternative is you have class-action lawsuits, which don't necessarily provide the consumer with the type of protections that we're looking for but rather fines against companies that are sort of ganged up on, if you will, to, you know, enforce groups of rights. So I'm moving in the direction of hoping that Congress will actually tackle the hard pieces of this, which is the private right of action - and I think there should be a private right of action because it's logical. You know, it's our right. But I think that our right to enforce those things has to be balanced against the burdens that are placed on businesses. So hopefully, we move that big piece, that big rock, into the bucket.

Rita Garry: The other big rock that they have to tackle is preemption. And the concept there is that should the federal law preempt state laws? And this is controversial because we consider state legislatures as sort of the innovators of democracy. And yet it's so difficult to try to advise a company, which I'm trying to do every day. How do you comply with potentially what could be 50 states' different legislation around data protection for consumer personal information? And it's very, very difficult. I hope that Congress will tackle those, and I think they're beginning to coalesce around some of these ideas. 

Rita Garry: Personally, I support the idea of preemption for state laws that are inconsistent with the federal legislation once we get it passed. So preempt to the extent we have a baseline and then also put some guardrails around our private right of action. So require consumers to seek redress directly with the company that has misused the personal information and give the company an opportunity to cure and then basically heighten the standard of liability risk. So don't impose huge fines on companies for missing a statutory technicality. That sort of gotcha piece is what scares business and pushes them off in terms of trying to comply because it's so overwhelming.

Dave Bittner: My sense is there's a lot of cynicism, particularly from consumers, you know, where we sign up for a service and we're presented with a EULA that's impossible to read, and we agree to turn over all of our information. And the alternative is, well, fine, don't use our service then. There's no negotiation there. There's no ability for us to dial in what we will and will not do. Do you think there's any hope of having that side of things addressed as well? 

Rita Garry: I do. I agree with you. And there are some who have gone so far as to say privacy is dead, because it's sort of necessary that we share our personal information to accomplish our goals in consuming products and services. So in the consumption of that, what we see now is a growing idea that you should be able to intelligently opt in or opt out and not have it obfuscated by those agreements and privacy statements that have to be waded through and so on and so forth.

Rita Garry: I think we will find a balance there, but I think it's going to come back to the business. They're going to have to take a hard look at their data and information management practices and procedures. And I go back to the Brookings Institution report where, you know, basically what - they're looking for businesses to respect the golden rule of privacy. Really, I mean...

Dave Bittner: (Laughter) How quaint, Rita, how quaint. 

Rita Garry: I know, very quaint. I agree. 

Dave Bittner: (Laughter). 

Rita Garry: I agree. But, you know, a data diet doesn't hurt anybody. 

Dave Bittner: How interesting, though. I mean, we've seen in the past few weeks, you know, Apple has rolled out this ability for consumers to turn off some of the tracking. And they found overwhelmingly that the consumers are selecting to not have their information tracked. And, of course, the folks who rely on that information are screaming bloody murder that this is going to ruin their businesses. The listeners of this show will have heard me use this metaphor before. I think it's kind of like, you know, old factories complaining that they can't pollute the river anymore. You know, it's going to cost us so much money if we can't pollute this river anymore. Well, you shouldn't be polluting the river. You shouldn't be using - sucking up all of our data. Maybe that's where we need to begin. 

Rita Garry: Well, two things. It's ironic that we have all this legislation protecting consumers and we have consumers who don't seem to be interested in being protected. And I wonder what is motivating the legislative explosion when it does seem there's this cynicism about it. And I think that it's got to be an awakening on our part as consumers to understand how important it is that we try to exercise some control over our own personal information. 

Rita Garry: In Europe, of course, it seems as though that was more ingrained in culture and society than it is here. But by the same token, we've been talking about privacy in the United States since the 1890s. So we respect the fact that there exists a concept of privacy. But how do we make it work in a 21st-century digital economy? So I think we're seeing that grow.

Rita Garry: I don't know about you, but I've become way more conscious of privacy statements on websites. And when companies don't have them, I'm disinclined to engage with them digitally. And I think that's because if they don't bother to take the time to set forth what they're going to be doing with my information, then I'm really not interested necessarily in doing business with them.

Dave Bittner: Mmm hmm. So it may even be a competitive advantage for companies to be able to promote the fact that they're respecting privacy. 

Rita Garry: I think so. I think it's critical that we trust the companies that we do business with. And trust is built on, in this case, the concept of using my personal information within the context of the transaction we're working on. The problem that I think we as consumers feel most is that sort of behavioral advertising and the tracking and the sort of behind-the-scenes use of our personal information that we are not aware of. And I was thinking the other day that, you know, wouldn't it be nice if, when you go to confirm a purchase, a screen popped up to say, by the way, once you click on this button, your personal information will be shared with A, B, C, X, Y, Z companies for the purpose of - you know?

Dave Bittner: A privacy nutritional label, if you will, you know? 

Rita Garry: Exactly, yes. 

Dave Bittner: Yeah, yeah. 

Rita Garry: Yeah. It shows the company has good data hygiene practices and that they are making an attempt to respect that sort of expectation that I have. 

Dave Bittner: Could that be a regulatory possibility that strikes a balance, that doesn't - isn't too burdensome on the folks who want the data, but also gives the consumers more insight as to what's being gathered and how it's being used? 

Rita Garry: Well, if you mean, can it all be accomplished through regulatory rulemaking? I don't know. There clearly is an impatience on the part of the FTC to exercise what rulemaking authority they have or to have that expanded. Once again, I think a purely regulatory environment is probably not the end solution. It doesn't work in every industry and just kind of puts more burden on companies to do a checklist, if you will, which isn't sufficient in my opinion. I think it has to be a philosophical as well as a strategic business decision: yes, we are going to show the consumers that they can trust us with their information.

Dave Bittner: What sort of timeline do you suppose we're on? The noises that are coming out of Congress, do you think we're going to see anything any time soon? 

Rita Garry: I think that their plan is to have something in place by 2022. Since we've got now at least six different pieces of federal legislation that have been proposed, there's growing consensus around what we need to accomplish here. And I again refer to the Brookings Institution report from June 2020, where they offer some really practical solutions to the private right of action issue, the preemption issue and also just sort of putting the burden on businesses to realize that it is not the wild, wild West when it comes to consumer personal information, that they have to make some sort of businesswise or strategic decision about how they're going to handle it. So I do think we will - finally - I mean, I was saying that in 2017. I got fooled in '18 and '19. And 2020's been a busy year for other things. So...

Dave Bittner: Here's hoping, right? 

Rita Garry: Here's hoping. And I know that... 

Dave Bittner: (Laughter). 

Rita Garry: I know that Big Tech and the big industry players are pushing hard for a federal non-patchwork arrangement for compliance purposes, which makes sense to me. But it also has to not be purely driven by industry interest. This kind of regulatory model is not, in my opinion, well left to the market to decide because we've seen it already over the past, you know, 10 years or so - the growing volume of personal information and the essentially unfettered use of it. And I don't see any real self-regulatory initiatives being put forth that are respectful of your and my personal information use.

Rita Garry: Regardless of what the states do or don't do and regardless of what Congress does or doesn't do, I think businesses all need to be awakening themselves to data management and information security as a business practice. And it doesn't have to be overwhelming, in my opinion. I think that you can take it apart and tackle it piece by piece and begin to get a whole picture - where does my data come from? How does it get to me? Who do I share it with, and for what reasons? - and begin to really think about that and then also think about, how are we using it, and when can I get rid of it, you know, to actually put myself on a data diet and look at data hygiene and try to tackle it, because it's going to come around the corner one way or the other.

Dave Bittner: Ben, what do you think? 

Ben Yelin: She is much more optimistic. I hate to make this a theme of our show today, but she's much more optimistic about the near-term prospect of federal data privacy legislation than I am. 

Dave Bittner: Yeah (laughter). OK. 

Ben Yelin: I appreciate her optimism, and I hope she's right. I think it is really important for consumers, who, as she said, have started to take notice of their own privacy rights. And it's important for businesses who aren't going to want to have to comply with 50 separate state regimes as it relates to data privacy. 

Dave Bittner: Right. 

Ben Yelin: As she mentioned, so far, we have two states, California and Virginia, with pretty divergent pieces of legislation in this area. And I think the amount of state legislative action is just going to grow in the coming years. 

Ben Yelin: But I have to say, I have my strong, strong doubts that the federal government would pass a bill giving consumers the private right of action against these tech companies. Everyone will say they dislike these tech companies, people of all ideological stripes. But let's just say they have pretty powerful lobbyists, and the last thing they want is to grant a federal right of action on the part of users for data privacy violations. So I'm glad that she's hopeful, and I hope that she's right. 

Dave Bittner: Yeah. I really enjoyed the conversation with her. I say it often - you know, one of my favorite things about being able to do shows like this one is that I get to talk to really interesting, smart people about interesting topics. And she was surely one of those.

Ben Yelin: Oh, for sure. Yeah. 

Dave Bittner: We appreciate Rita Garry for taking the time for us and sharing her thoughts with all of you. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.