Caveat 4.14.21
Ep 73 | 4.14.21

Breaches are a near certainty.

Transcript

Dean Gonsowski: Now you're seeing the statistics come out that show that breaches are a near certainty. The likelihood, I think, is close to 70% of all companies will be breached within the next two years.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben explains a Supreme Court justice's take on Section 230. I've got a story from WIRED that wonders about the online curation of our memories. And later in the show, Dean Gonsowski from ActiveNav on data privacy legislation and data governance. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's dig into some stories this week. We've got some good ones. Why don't you start things up for us? 

Ben Yelin: So mine comes from Ars Technica, but really, it comes from the writings of Supreme Court Justice Clarence Thomas. So to give a little bit of background, there was a lawsuit that's been percolating over the past couple of years against President Trump in his official capacity while he was president, saying that he restricted Twitter users' First Amendment rights when he blocked them or muted them on Twitter. Basically, the idea is the First Amendment gives us a right to petition our government for redress of grievances, to tell the government how we feel. And by blocking those individual users, the president was curtailing people's individual rights. 

Ben Yelin: This case was petitioned up to the United States Supreme Court and made its way through the court system. But the wheels of justice sometimes move rather slowly. And by the time this case made it up to the Supreme Court, it was now the plaintiffs v. Biden because Biden is the president. Of course, Biden's use of Twitter is not as prominent or noteworthy as the former president's. 

Dave Bittner: To say the least. It would be hard to exceed that, I submit. 

Ben Yelin: Exactly. So the Supreme Court unanimously dismissed this case as moot. Basically, the Supreme Court will only hear cases and controversies - things that are live, active controversies. And the whole nature of the dispute - which is that President Trump in particular, and it was his personal account, was blocking people - is no longer relevant because he's no longer president. So 8 of the 9 Supreme Court justices who made this decision left it at that, offering no commentary on the decision to dismiss the case. 

Ben Yelin: One of the justices, Justice Clarence Thomas, wrote a 15-page - I won't call it a screed, because I think it's well-written and well-argued, but sort of an unnecessary lecture on First Amendment rights related to tech platforms and the pitfalls of Section 230. So in his telling of the case, he says that the original case centered around President Trump curtailing the First Amendment rights of other users, and he thought that was the wrong frame for looking at the potential First Amendment issues on digital platforms because ultimately it was not President Trump who was really in control of his own account. It was Twitter. And we know that because - I don't know if you've heard, Dave, but the president was subsequently banned from Twitter. His entire account was blocked and has been deactivated. 

Dave Bittner: Yeah, I think I heard something about that. Yeah. 

Ben Yelin: Yeah, it kind of made its way through the news. I've noticed Twitter is a much quieter place these days. The issue to him - to Justice Thomas - is the power that these private companies have over these platforms. 

Ben Yelin: Now, normally, it's not the government's role to get involved in these types of private platforms. The First Amendment generally does not apply against private organizations. You know, if I'm part of a private club and they prohibit me from speaking lest I be kicked out of that club, I'm not going to have a valid First Amendment claim. What Justice Thomas is arguing here, however - and this is really the most interesting and groundbreaking portion of his argument - is that perhaps these platforms should be considered either what are called common carriers or places of public accommodation. 

Ben Yelin: So common carriers are generally, like, utility companies. They're private organizations, but they serve some sort of public purpose. And therefore, the government and the courts see fit that those entities are subject to federal regulation and must abide by the federal constitution. So it used to be things like railroads, back when you had privately owned railroads. And things like telecommunications networks - those themselves are common carriers. 

Ben Yelin: And I think Justice Thomas is making the argument here that perhaps these tech platforms, Twitter included, could be considered common carriers, partially because Twitter has such a large market share - while there's nominal competition, there isn't any real competition - and because of the prominent place it holds in our political discourse. 

Ben Yelin: The other potential source of regulation would be to label these companies as places of public accommodation. So we see this most notably in the area of civil rights laws. So, you know, normally the government would not be regulating private organizations, but the Civil Rights Act says that you can't discriminate on the basis of race at places of public accommodation. Basically, anywhere that's open to the general public - restaurants, hotels, et cetera - must abide by these federal regulations. 

Ben Yelin: And so what Justice Thomas is saying here is because of the prominence of these tech platforms, because of the power that they hold in terms of shaping our political discourse, perhaps Congress needs to recognize that these are common carriers or places of public accommodation and potentially subject them to the type of regulations that you see in the Civil Rights Act, i.e. prohibiting them from banning certain users. 

Ben Yelin: So this is a pretty radical notion that the federal government would be regulating tech companies to this degree. And I thought it was just a really interesting and eye-opening 15 pages of reading from the justice. 

Dave Bittner: Is it unusual for a justice to do something like this? 

Ben Yelin: It is. In some cases that the court doesn't hear - you know, cases where they're dismissing it or declining to grant certiorari - you might get a dissent from a justice arguing why the case could have been taken up or should have been taken up. But in a case like this, Justice Thomas' writing didn't have much to do with the original lawsuit itself, which was about President Trump blocking people on Twitter. So I think he saw this as a rare opportunity to opine on the subject, on a case that's at least remotely relevant to the subject at hand. He could have written it in a private capacity, but I think it carries more weight as part of an official Supreme Court document. But it certainly is unusual. 

Dave Bittner: And I suppose if you're someone out there in industry, you could count on this being - I don't know - a beacon of at least some of the conversations that are going on among the members of the court. Obviously, Clarence Thomas has laid out his feelings here, but it could maybe inform you of which way the wind might be blowing. 

Ben Yelin: Yes. Although as one commentator said in this Ars Technica article, which talks about Clarence Thomas' writing here, it's perhaps just as notable that none of the other eight justices decided to join his opinion. They all have the opportunity to do so, and there are five other conservative justices in the mold of Justice Thomas. None of them decided to endorse his ideas of categorizing Twitter potentially as a common carrier or a place of public accommodation. So I think, you know, if you are one of these companies, and you're terrified of being subject to that label and facing a barrage of government regulations, you can take some comfort that at this point, it's just one Supreme Court justice. 

Ben Yelin: Now, I will say sometimes it's these minority opinions that eventually, over time, become majority opinions. And this is kind of his way - Justice Thomas' way - of staking out that claim, perhaps trying to persuade some of his colleagues. I still think it's curious because it wasn't really relevant to the case that he was asked to consider, but I certainly think he brought up ideas that are definitely worthy of discussion. 

Dave Bittner: Could we look at this from the opposite direction, from an antitrust point of view? If these platforms are big enough and ubiquitous enough to be considered common carriers, might that not be an indication that they're a little too large - maybe not for their own good, but for our good? 

Ben Yelin: I think we could certainly look at it from that angle. That's not the angle Justice Thomas took, partially based on the way he's always thought about First Amendment issues. I think his greater concern is that these platforms are acting as quasi-government entities. They're playing a role that the government would traditionally be playing, and therefore they should be subject to the rules that governments are traditionally subject to. 

Ben Yelin: I would look at it - and I think many people would - and say, all right, he's commenting that Google has 95% of search engine traffic and that Facebook has an extraordinarily high percentage of the user share, the market share, in whatever it is that they do. And I'd say, yeah, that's a potential antitrust problem. We're not the only ones saying that. We've talked about lawsuits that are making their way through the courts to that effect. 

Ben Yelin: But I think Justice Thomas is coming at it from, you know, more of a libertarian angle and saying these entities are now stepping into the role traditionally played by government - the entity that provides for a platform to exist in the first place. It's the government that builds the town square. It's the government that has control over public spaces. And in the mind of Justice Thomas, this is becoming a quasi-public space. And I think that's the reason he thinks these entities are ripe for regulation. 

Dave Bittner: All right. Well, yeah, it's interesting for sure. And it is over on Ars Technica. It's titled "Clarence Thomas Blasts Section 230, Wants 'Common-Carrier' Rules on Twitter," written by Jon Brodkin. And, of course, we'll have a link to that in the show notes. 

Dave Bittner: My story this week comes from WIRED, and it's a story written by Lauren Goode. It's titled "I Called Off My Wedding. The Internet Will Never Forget." And this is a fascinating read. It's Lauren's own story of how she was in a long-term relationship, and she and her boyfriend decided they were going to tie the knot and get married. And then at some point, she had a change of heart and decided to call off the wedding. 

Ben Yelin: By the way, I'm admittedly a sucker for these stories, but it was a bit of a tear-jerker. 

Dave Bittner: Yeah. 

Ben Yelin: I don't know about you, Dave - it's a sad story. I'll let you finish with the angle relevant to us. 

Dave Bittner: No, it is. And I think it's made all the more so by Lauren's really powerful writing. You know... 

Ben Yelin: Absolutely. 

Dave Bittner: ...She's a good storyteller, and that really, you know, strengthens her case and brings it home. So she calls off the wedding, but then finds that as time goes on, the algorithms that social media apps have keep reminding her that her wedding's coming up, right? She keeps getting ads for things like wedding dresses and caterers and, you know, invitations and all those wedding-related things. Photos pop up of her and her former fiancé. And it goes on and on and on. 

Dave Bittner: She actually reached out to the folks at Pinterest to sort of try to get a look behind the scenes at what is going on here. And it was fascinating to me that they refer to this as the miscarriage problem, which is that - most people's pregnancies run the full term. You have a baby, and now you have a child. And you go off with your new family. And every now and then... 

Ben Yelin: And you see all the ads for diapers, right? 

Dave Bittner: You see the ads for diapers, for cribs, for all that stuff. And the advertisers know you are in a buying mode, right? 

Ben Yelin: Yeah, exactly. 

Dave Bittner: So you are a very valuable person to put ads in front of. And it's the same with a wedding. A wedding is something where a lot of people spend a lot of money on things. And so the advertisers spend a lot of money to put their message in front of you. And in the miscarriage problem, you know, not every pregnancy ends with the birth of a child. People have miscarriages. And, of course, that can be an extraordinarily painful memory for the folks who have gone through it. 

Dave Bittner: And it's interesting that one of the technology folks that she spoke to said that they use some of their image-matching technology - that if they see photos taken within hospitals, they try not to bubble those up to the top. They try not to remind folks of anything they recognize as having been in a hospital - either from the geotagging or from the actual, you know, stuff in the photo itself that they can analyze - which I thought was fascinating because who knows why you're in the hospital? Could be for good things. Could be for bad things. 

Dave Bittner: So the article points out that it does seem as though the platforms are making an effort to limit these sorts of things. But at the same time, it's sort of an argument against their interest because the bottom line for them is engagement. And it doesn't really matter if it's good engagement or bad engagement. It's engagement, and that gives them the opportunity to put ads in front of you, which is their prime directive. 

Ben Yelin: Raison d'etre. Yeah. 

Dave Bittner: (Laughter) Right, right. The article goes on and talks about, you know, the funny nature of time and our memories. What struck me was - let's say I have a painful memory from my childhood, some trauma, which certainly is not an unusual thing for us to have. And maybe there's a photo in a photo album that reminds me of that trauma. And so I don't like to look at that photo, you know? 

Ben Yelin: Right. 

Dave Bittner: But it's up to me to pull that photo album off the shelf and start flipping through it. You know, I am in control of that situation, whereas on a lot of these platforms - you log into Facebook, and one of the things that pops up is, hey, here's what you were doing a year ago. 

Ben Yelin: Yeah. 

Dave Bittner: And it could be a horrifying memory, right (laughter)? I found this really fascinating. What's your take on it, Ben? 

Ben Yelin: It was a fascinating read. It was sad because I think this person was forced to relive some very painful memories. One of the interesting takeaways to me is that you can theoretically go through and erase your digital footprint. You can go through and, you know, delete cookies, wipe your cache, do all those types of things. That is extremely time-consuming, especially in a situation like this reporter's - she had an account at basically every single website because it was part of her job. She was a tech reporter, so she was interested in new social media apps and new gadgets. So going through and deleting your digital footprint is in and of itself very time-consuming and probably stressful. You know, in the real world, we can kind of suppress memories or remove things in our life that might remind us of something painful, as you say. 

Dave Bittner: Right. 

Ben Yelin: But making that effort in the digital space is extremely difficult and would force somebody to use work hours, potentially, to go through and scrub their own digital footprint. You know, so that's one of the reasons why it's so hard. 

Ben Yelin: I mean, I'm not necessarily sure I blame the Pinterests of the world for the predicament expressed in the story. I think they do kind of have to make a gamble that most people who sign up for, you know, baby bump registries and wedding registries do end up going through to childbirth and to weddings. 

Dave Bittner: Right. 

Ben Yelin: And, you know, that's a rational decision for them because they're going to profit from the vast majority of people who go through those life events. They're going to be able to get ads for whatever it is you buy after a wedding - cookware... 

Dave Bittner: Yeah, yeah. 

Ben Yelin: ...Honeymoons, et cetera - in front of people's faces and make a lot of money. So, you know, you kind of have this collective action problem where it's a very serious, really emotionally draining prospect for the few people for whom there's a bad ending to these stories, both in terms of pregnancy and weddings. But it's still a relatively small universe of people. So, you know, it's not necessarily worth it for Pinterest to curate its ads just to reach that relatively small market share. But that doesn't take away from, I think, the real pain she's expressing in this piece of being unable to scrub some of these bad memories. 

Dave Bittner: Yeah. One of the things she points out is that the platforms sort of - they give you what she refers to as the nuclear option, which is, you know, nuke it from space, you know? 

Ben Yelin: Yeah. 

Dave Bittner: Just wipe out everything. And that's not good enough. They really need to give you more granularity and control over what you do and don't see. And it's not an easy problem to solve. But what a fascinating side effect of the algorithms. This is one of those stories I'm going to be thinking about for a while. 

Ben Yelin: Yeah. I mean, she is an incredibly compelling writer. I mean, it reads to me like a - what could be the beginnings of a script for a very interesting movie, if somebody wanted to write it based on her experience... 

Dave Bittner: Right. 

Ben Yelin: ...Just because she's such a compelling writer. And she goes through her journey as a tech reporter and somebody who's gone through a very painful experience. So it's something I'm going to be thinking about for a long time, too. 

Ben Yelin: And I think we've talked about the problem of algorithms in a bunch of other, more tangible contexts - whether the algorithm is going to lead to somebody getting arrested or, you know, being held liable in a civil court. We haven't really talked about how it affects people's memories or their general life experience. So I think it's just important to remember that behind every algorithm, there is some sort of human interest. This piece was a really important reminder of that. 

Ben Yelin: And I was on her Twitter account after she posted this, the day before we recorded, and I think the praise for this piece was rather universal. I think it really struck a chord among the tech reporter community on the internet. 

Dave Bittner: Yeah, well, it's definitely worth your time. We'll have a link to that. Again, it's written by Lauren Goode, and it's titled "I Called Off My Wedding. The Internet Will Never Forget." 

Ben Yelin: And, Lauren, if you are listening to this, we would love to have you on our show. 

Dave Bittner: (Laughter) That's right. That's right. 

Dave Bittner: Those are our stories for this week. Of course, we would love to hear from you. If you have a question for us, you can call in at 410-618-3720 or send us an email to caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Dean Gonsowski of a company called ActiveNav. And we were talking about data privacy legislation, data governance, some of the things that are going on in the federal space. Here's my conversation with Dean Gonsowski. 

Dean Gonsowski: Big states like California and some others want to take control of the privacy dialogue and are moving, you know, forward aggressively on their own because they're not seeing, you know, federal legislation come to pass. That's positive in the sense that there appears to be a mandate. The California situation is pretty illustrative in that the CCPA was started basically to, you know, avoid a ballot measure around data privacy and some other pieces of legislation. And then they moved into the CPRA, which has, you know, quickly outstripped the previous one. The public has moved into a position where they are, you know, demonstrably concerned about data privacy. 

Dean Gonsowski: And as the states attempt to regulate it, there's probably no other way to describe it than a hot mess because, if you're a company, you almost universally aren't conducting business just inside your state's borders. And so you run into the very real and pragmatic challenge that you can't just comply with, you know, a single state. So you'd have to look at a variety of different states. And there's not a high watermark. 

Dean Gonsowski: So as an example, you can't necessarily just look at California and say, California's regime is the most stringent and its definition of personally identifiable information is the broadest. You have to look at each state and understand how those states' laws are going to apply, and you have to basically pick and choose elements of the various states. 

Dave Bittner: Is there any hope of having any movement on the federal level here, with Congress seemingly unable to get very much done, even though privacy is something that has rare bipartisan support? 

Dean Gonsowski: There are a lot of schools of thought here. One is that if a federal standard were to come to pass, then just in order to get bipartisan support, it would likely be significantly weaker in various areas than many of the states' laws. So you have a very real possibility that even if Congress could pass a piece of legislation, and it took X number of months or years, it very possibly would be weaker than California or Virginia or any of these other states. 

Dean Gonsowski: You would have a scenario where federal legislation would set the floor, and I think the floor would be pretty interesting. But I don't know that it solves the ceiling and the patchwork nature of what companies need to build to. And so to me, it helps a little bit, and it helps kind of create table stakes for what a basic, you know, U.S. domestic privacy policy must have, but I don't know that it solves the patchwork component. 

Dave Bittner: Did GDPR raise that level of the floor itself for companies who are doing business internationally? 

Dean Gonsowski: It did. It created an EU-wide level of data privacy - which, you know, is obviously pretty broad - and, for the first time, got a lot of these provisions and theoretical doctrines out there, around, you know, data processing, data minimization and a number of these other concepts. 

Dean Gonsowski: I think one of the things that it did was it spurred people to become serious about data privacy, hire a data privacy officer and start the process. But there was very little in the way of enforcement. And so we've talked to, you know, a number of our customers. And you'll see that people - I don't want to say they largely ignored it, but they were waiting for some level of enforcement. 

Dean Gonsowski: And I think you'll see here - whether it's California or others - stateside pieces of privacy legislation built with much more enforcement in mind. 

Dean Gonsowski: And so California, with the latest one, is a prime example. You know, they created an enforcement bureau. And that's one of the key things: in many ways, legislative action is only good if there's an enforcement mechanism, because people don't, you know, typically comply out of the kindness of their hearts. There need to be serious and real penalties for folks who aren't in compliance. And so I think that was one of the things with the GDPR. And other states have learned - and other regions have learned - that you need to have a pretty strong enforcement mechanism. 

Dean Gonsowski: You also get into the private rights of action, and there are differing views on that. That's another one of the things that the states will look at differently. The Virginia one, as an example, does not have a private right of action. And so I think, in many ways, the best way to do this is to create an enforcement body - you know, staff it, give it a remit, and it can go out and enforce the mechanism. And then you'll see people become compliant over time because they want to avoid the penalties, and they know it's real. 

Dave Bittner: Are you seeing shifts in what are considered to be best practices when it comes to data governance? I'm thinking in light of some of the public breaches and the reputational damage that can come from those. Are organizations changing the way that they handle how they hold onto our data? 

Dean Gonsowski: They are. If you think back, you know, even five or 10 years ago, breaches were relatively rare. There was a ton of stigma. And you tended to think, as a company, that you were either unlucky if you got breached, or you were kind of lucky and, you know, whistling past the graveyard because you hadn't gotten breached. And so there was less of a certainty that you were going to be breached. And I think there was a belief that you could build better and better perimeter defenses and, if you had good hygiene, you could sort of avoid a data breach. 

Dean Gonsowski: Now you're seeing the statistics come out that show that breaches are a near certainty. The likelihood, I think, is close to 70% of all companies will be breached within the next two years. That realization, combined with the fact that a lot of companies already have been breached, you know, themselves, has made it so that there's an inevitability associated with a breach. And if you know you're going to inevitably be breached, it does change your thinking about how you maintain and govern and store and protect your data. And I think you see that with a lot of these breaches - and by the way, you know, Equifax is one of ActiveNav's biggest customers, and we've helped them remediate, you know, their data breach. 

Dean Gonsowski: And, you know, without going into particulars, there was just a ton of data within their data universe that they didn't have a good understanding of - it was partially kind of in the wild and not protected to the right degree. In many ways, the hackers ended up understanding that data better than the company did. That's a pretty difficult spot to be in. 

Dean Gonsowski: And so companies are now recognizing that they really do need to minimize the surface area of the data that they're holding. And that's why you see data minimization and these other concepts raised in the GDPR and some of the other privacy regimes: if you don't have it, you kind of can't lose it. And the fact that companies have over-retained data, in many ways, for decades - that all comes home to roost in a kind of perfect storm when you have, you know, data breaches and then new privacy regulations. 

Dave Bittner: Yeah, that element of it I find fascinating. You know, if you think about how inexpensive storage has become - as storage got cheaper and cheaper, I think there was a natural inclination for organizations to be packrats and just store and collect data, because why not? And then data got more and more valuable - we could go through it, we could sell it, or we could use it for our own internal purposes. But it seems to me like we've seen this shift toward almost treating data as if it's a little bit radioactive, you know - like if you have too much of it in one place, bad things can happen. 

Dean Gonsowski: Yeah, that's a really good point. I think for the last certainly two decades, you know, you had companies retaining data just to retain it, under this sort of big data premise - like, if we can collect customer data of any variety and shape and size, let's collect it all. And then at some point in the future - this is just my opinion - it was a combination of big data and AI: we'll just get a lot of data, and then eventually we'll point tools at it, and it'll generate insights. 

Dean Gonsowski: In many ways, that's an understandable theory, but it presumes storage is free - which it's not actually free - and it also presumes that there's no downside to creating these massive repositories of data. 

Dean Gonsowski: And what we're seeing with the breaches and the privacy legislation is that it's actually now, you know, statistically more likely that you will get exposed and have regulatory liability than that you'll get the value out of that data. 

Dean Gonsowski: As you see, some of the Gartner analysts will talk about the realized value of data. It's not a conceptual "yes, one day I might get some value out of it." It's: are you actually leveraging it? And that's where a lot of the privacy legislation talks about, you know, a legitimate business use of the data. Are you using it for a specific purpose? If so, great. And if you're not, you really have an obligation at that point to get rid of it. 

Dave Bittner: So with the organizations that you all are working with - the ones who, in your estimation, are doing it right, who are finding this equilibrium between what they store and what they do not, and how they store it, security and so forth - are there any common threads among the organizations that seem to have a handle on this? 

Dean Gonsowski: Yeah, there are a few. One is that many of them start out as heavily regulated companies, and so they have just a different level of hygiene and risk profile as it relates to regulatory compliance. And so those companies are in many ways ahead of the curve, and they typically have staffing and budget and policies and programs. And so they're trying to mature their existing, you know, position and roll with the punches of the privacy legislation. 

Dean Gonsowski: I think on the other end of the spectrum, you have a number of newer companies that might be more B2C companies that typically didn't have regulatory issues. And what's interesting for them was that while they're not heavily regulated, they all of a sudden have just mountains of consumer data. 

Dean Gonsowski: If you just, you know, pick somebody like an Uber or something that typically wouldn't be in the regulatory bull's-eye - if you think about all the data that they have for drivers and riders and geolocated data, et cetera, all of a sudden, you know, they're in a really interesting spot with data privacy concerns. 

Dean Gonsowski: And so the good news for companies like that is that they tend to be a bit newer, and they often will have, you know, newer technology stacks. It's harder in that they're starting from a lower baseline of, you know, regulatory hygiene, but they tend to have less baggage than the pharmaceutical or oil and gas company or somebody who's been around for 50 years. You know, a lot of those companies just really struggle - they're more mature, but they have so much legacy debt to deal with in terms of, you know, data and people and divisions. 

Dave Bittner: Right, right. I mean, is it fair to say that, you know, newer companies can build these things in, in a foundational way, rather than having to, as you allude to, sort of graft them onto legacy structures? 

Dean Gonsowski: Yeah. I think of information governance as having these two interrelated issues. One is that you have to drain the swamp - clean up the data that you've already stored. And in particular, you know, file-share data and on-premises data is just a massive headache for almost all of our customers. But if you're a newer company, you probably don't have that. And so you don't really need to worry about draining the swamp; you just need to dam the stream. You need to prevent that data aggregation, and you need to stop the over-retention of information in the first place. And I think because of that, that's a little bit easier of a position to be in. 

Dean Gonsowski: And I think if you're a newer company, then, you know, we talk about data minimization by design. It's much easier as you're collecting it - you've got a form that collects fields of information from your customer. You know, if you're asking for age, birthday, serial number, you know, sex, political party, you need to ask yourself why you're asking for those things, because once you capture them, you have an obligation to protect them. And so I think it's very easy at that point to minimize what you're collecting to elements that you're only going to use for a legitimate business purpose. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: The most interesting part to me was when he talked about one of the pitfalls of federal data privacy legislation: in order for it to be bipartisan, it would have to be watered down to the point that it would probably be weaker than similar laws that we've seen in California and Virginia. And because of the way our federal system works, that federal law would end up preempting a lot of state laws in that same area. 

Ben Yelin: So I know we've talked about the need for federal data privacy legislation, particularly for the purpose of uniformity, but I think the point he makes here is really valuable. Be careful what you wish for because it can end up eliminating some of the stronger protections that you see at the state level. So I thought that was very interesting. 

Dave Bittner: Yeah, that is fascinating because we hear companies talk about this time and time again - and I guess this really kind of started with GDPR. Because you have very strict rules in one zone, in one geographic area, a lot of times the easiest thing to do is to calibrate how you do business to that strict standard and just apply it everywhere, and then you know you're safe. 

Dave Bittner: And if we think about that from a state-by-state point of view, there are organizations doing that as well. As you say, this could water that down. If we come up with something federal that's less restrictive, it could end up - I suppose it depends on your point of view - either being better for everyone or worse for everyone, right? 

Ben Yelin: Yeah. And we see that in many other areas of the law, where sometimes the federal government will purposely kind of occupy the field of legislation in a particular area in order to preempt stricter state laws. So, you know, I think that's something we really have to be conscious of and watch out for as, you know, data privacy legislation gets considered. 

Ben Yelin: It's not an either/or proposition. I think whether, for the purposes of data privacy, it would be valuable to have federal legislation depends on how robust that legislation is. And when we're talking about a very narrowly divided Congress that can't agree on anything, you know, if you end up finding some sort of accord among Democrats and Republicans, it's very likely going to be something that's been watered down by compromise. So I just thought that was a really interesting point. 

Dave Bittner: Yeah, yeah. Well, again, our thanks to Dean Gonsowski from ActiveNav for taking the time to join us. Really interesting conversation. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.