Caveat 1.11.24
Ep 201 | 1.11.24

Combatting the privacy pandemic.


Chris Smith: At the time, I was in a romantic relationship with someone and had no idea that one of her goals and her collaborator's goals was to compromise my entire digital existence and my family's existence. So, for 18 months, my iCloud was compromised. And every time we tried to address it, we could never fully figure out how they compromised me to that level.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: On today's show, I've got word that the Federal Trade Commission is updating its Health Breach Notification Rule. Ben has the story of an upcoming court case relating to online video privacy. And later in the show, my conversation with author Chris Smith sharing his book "Privacy Pandemic: How Cybercriminals Determine Targets, Attack Identities, and Violate Privacy - and How Consumers, Companies, and Policymakers Can Fight Back." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, we've got some good stories to share this week. But before we do, I just wanted to make note that, you know, last week we were so excited to have our guest Caleb Barlow join us that we neglected to mention that it was our 200th episode.

Ben Yelin: Ding, ding, ding, ding, ding, ding, ding. The confetti -

Dave Bittner: So -

Ben Yelin: Just went off in the studio.

Dave Bittner: Exactly. Yes. So, congratulations to us. I think 200 episodes is quite a milestone. And it's been my pleasure to share the microphone with you all these years.

Ben Yelin: Me too. And, yeah, we're going on 200 episodes, about four and a half years. And -

Dave Bittner: Yeah.

Ben Yelin: Hopefully, we can do 200 more, maybe get up to 1,000.

Dave Bittner: There you go.

Ben Yelin: And, yeah, we have our sights set high. But thanks to all of our wonderful listeners, and it's a good time to leave a five-star review. Right, Dave?

Dave Bittner: There you go.

Ben Yelin: Yeah.

Dave Bittner: See? I like it, I like it. All right. Well, let's jump into our stories here, Ben. You want to lead things off for us?

Ben Yelin: Yeah. So, I got my story from the Electronic Frontier Foundation, a frequent source of our stories, a piece by Aaron Mackey. The Electronic Frontier Foundation, among others, is writing to a federal court in Northern California asking them to uphold a federal law that protects online video viewers' privacy. So, there's this law that predates the Digital Age called the Video Privacy Protection Act. Basically, that law requires any service that offers video to the public, so, in the old days, that would have been a video store -

Dave Bittner: A rental - video rental store. Right?

Ben Yelin: Exactly.

Dave Bittner: Okay.

Ben Yelin: With our, you know, giant VCR old-school videotapes. Basically, the law requires that the business or service offering the videos to the public get customers' written consent before disclosing that information to the government or a private party.

Dave Bittner: Right. Now, I am old enough to remember what triggered this law.

Ben Yelin: Yeah, you've mentioned this a number of times. And I thought of you when I saw this story -

Dave Bittner: Okay.

Ben Yelin: 'Cuz you do get a kick out of it.

Dave Bittner: Yeah.

Ben Yelin: You want to share why, why this law got put in -

Dave Bittner: Well, I mean, back in the old days when we had video rental stores, and, Ben, you're probably too young to know the joy of cruising your local video store on a Friday night to try to decide what video you were going to take home, hoping that someone hadn't already gotten to that copy of "Back to the Future" or --

Ben Yelin: Oh, I'm not that young. I remember, yeah, I would go every week and -

Dave Bittner: Yeah. So, the problem was some politicians figured out or I guess enemies of politicians figured out that they could go to their local video store and ask the person what videos their - the politician that they did not like had rented. And, so, when it was revealed that, you know, Senator McGillicuddy had rented, you know, "Debbie Does Dallas" 10 times in a row, this was not good for Senator McGillicuddy. And, so, you know, the quickest way to get a law enacted is to have it directly affect lawmakers. And that's exactly what happened.

Ben Yelin: Incidentally, haven't we lost our sense of shame? I feel like that wouldn't happen now.

Dave Bittner: Right.

Ben Yelin: I feel like the person would just go and either deny that it happened or be like, "You know what? I did watch 'Debbie Does Dallas' and I liked it."

Dave Bittner: That's right, that's right.

Ben Yelin: It was fantastic and what are you going to -

Dave Bittner: Yeah.

Ben Yelin: Do about it?

Dave Bittner: Yeah. I think you're right.

Ben Yelin: But this law was enacted before people had lost that sense of shame. And federal courts have applied it to digital videos as well. So, the lawsuit that's being considered in a federal court in California was brought by users of the social media service Patreon. Are you a Patreon user?

Dave Bittner: I have been, certainly. We had a CyberWire Patreon back in the day.

Ben Yelin: So, my understand - I'm - I don't use it, but it's basically like a social media aggregator where you can - somebody can have their own Patreon page and -

Dave Bittner: Yep.

Ben Yelin: Post links to all their different social media sites.

Dave Bittner: Right.

Ben Yelin: And the lawsuit alleges that Patreon has been sharing video viewing habits of its users with Facebook. And this would be a facial violation of this Video Privacy Protection Act because the law bans sharing this information even with private parties, not just the government. So, in response to this lawsuit, Patreon is basically saying that the law itself is overbroad. There is this doctrine called the overbreadth doctrine that if the application of the law would be an overbroad burden on First Amendment associational rights, even if specific strands of the law are constitutional, the entire law would be struck down. And I think Patreon's argument here is that prohibiting it - even with a customer's consent - from sharing information with these other social media sites would inhibit its associational activities and would be an inhibition on its business activities. So, the Electronic Frontier Foundation has decided to join this suit because they think this is a misapplication of the overbreadth doctrine. It's funny because they are like the world's foremost defenders of the overbreadth doctrine. They've signed on to every overbreadth doctrine case on the other side basically -

Dave Bittner: Okay.

Ben Yelin: Saying that there are all these overbroad laws seeking to restrict activity.

Dave Bittner: Uh-huh.

Ben Yelin: This is an organization that cares about online privacy. So, in every case, they're basically saying this is an overbroad law. That's actually the doctrine that was used to invalidate every part of the Communications Decency Act except for Section 230. But their argument here is that Patreon's simply wrong. The VPPA, in their view, advances the First Amendment and privacy interests of users because it ensures that they can watch videos without being chilled by government or private surveillance.

Dave Bittner: Huh.

Ben Yelin: So, I think this is really a chilling effect argument that if Patreon users know that their video viewing habits might end up in the hands of social media sites, especially social media sites that aren't responsible about protecting user privacy, not to name any particular such companies -

Dave Bittner: But we - but we're talking about Facebook.

Ben Yelin: Yeah, exactly. I mean, that's really the underlying truth here -

Dave Bittner: Right, right.

Ben Yelin: Is if this were a more exacting privacy conscious social media company, it wouldn't be as big of a deal. But we're talking about Facebook.

Dave Bittner: Right.

Ben Yelin: And, basically, the idea is that if people were concerned that their videos might be leaked, they might not watch offensive content. Now, you know, that might just be pornography. Fine. But it might be politically disfavored content.

Dave Bittner: Yeah.

Ben Yelin: So, content produced by religious minorities or political minority groups. And that would really have a chilling effect. And, you know, when you have presidential candidates proposing that people should get verified - personally verified in order to use social media, you can kind of connect the dots here. There's this parade of horribles where if Patreon is allowed to share this information with social media companies, like Facebook, who are kind of fast and loose with their data, that might end up in the hands of an overactive government. They could use it for prosecution. Or an overbroad law, such as the one proposed by presidential candidate Nikki Haley, which would require verification for any users of social media services, might end up leading to a major inhibition on associational rights and free speech rights. And that's something that would simply be unacceptable far more than any burden on Patreon itself. So, it's a really interesting case and a really interesting brief. You can probably tell I'm more persuaded by the EFF argument here than the Patreon argument. But I'm very curious to see how this turns out. This case is going to be heard sometime in the next month in this district court.

Dave Bittner: So, help me understand here. Is the argument that even if Patreon gets permission from its users in their EULA to share this information, which I'm guessing they have, that this law overrides that and says you can't do that even if you get permission?

Ben Yelin: Yeah, I think it comes down to whether it complies with the statute's provision that you obtain the customer's written consent. And it might not be explicit in the EULA that they're doing this.

Dave Bittner: Okay.

Ben Yelin: It might be implied in a way that the consent offered by the user is not meaningful. I think that's the allegation here.

Dave Bittner: Okay.

Ben Yelin: So, presumably, you could contract it away in the EULA, but it would have to be obvious enough to an average user that they would - I mean, a normal, reasonable person would understand what they were contracting away.

Dave Bittner: Yeah.

Ben Yelin: And I think the allegation here is that Patreon has not done that. They don't publicize in any meaningful way that a normal person can see that they're sharing some of these videos with Facebook.

Dave Bittner: Do you think we're ever going to reach a point where EULAs can be anything resembling reasonable?

Ben Yelin: I think we're in a tough place with EULAs. There's kind of a misperception in the nonlegal world that you can contract anything away with a EULA.

Dave Bittner: Right.

Ben Yelin: You can't. Courts will look skeptically on them, especially if one of the things you're contracting away is somebody else's First Amendment associational rights. You know, certainly, a EULA helps and gives some legal authority to certain restrictions. But the fact that most EULAs are things that nobody ever reads and that they're 100 pages long and we're just trying to click through them, I think weighs against a lot of these providers because people aren't offering meaningful consent to their services.

Dave Bittner: Uh-huh.

Ben Yelin: And I think courts look disfavorably upon that. And I think they're going - they're actually going into these EULAs and saying, "Well, if I was a reasonable person trying to gain information on what this app is going to share with this other app, I don't think I would be able to do so." This is in legalese. This is not something that a consumer would understand. And, so, I think there's a separation between how the courts see this and how the industry sees it where to save ourselves from trouble, just put something in the EULA that'll -

Dave Bittner: Right.

Ben Yelin: You know? That's our get out of jail free card.

Dave Bittner: Yeah.

Ben Yelin: Sometimes it is, but very frequently it's not.

Dave Bittner: Do you think we could legislatively make EULAs less necessary? In other words, if we had a ro - if we had a GDPR, it seems to me like a lot of the things that are in EULAs would be moot.

Ben Yelin: Yes.

Dave Bittner: So, is that a potential avenue?

Ben Yelin: Yep. But unicorns could also fly out of the sky tomorrow morning and I'm not betting, you know, my life savings on that, so.

Dave Bittner: Right.

Ben Yelin: Yeah, I mean, we've gone through this period where there have been hints that the federal government might enact a data privacy law. There is that vacuum. We've gotten laws at the state level. I don't anticipate it happening anytime soon. So, in the absence of something like a GDPR, it is incumbent on these companies to establish legal rights. And courts aren't always willing to defer to the companies. And I think that's wise. Like I think there are rights that users have that cannot be contracted away in a confusing EULA that the average person would never be able to understand. But, yeah, I mean, I do think a lot of what is in your standard EULA language would be moot if we had something like GDPR because that lays out the consumer's rights, obligations, the provider's rights and obligations in a way that's protected by law and it's not something that's subject to these types of uneven contracts.

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes here. My story this week comes from the folks over at Lawfare. This is an article written by Justin Sherman and Devan Desai. It's titled "The Most Critical Elements of the FTC's Health Breach Rulemaking." And this is really an article about how the Federal Trade Commission is updating their Health Breach Notification Rule, the HBNR, trying to bring things up to date here that they're acknowledging that health tech and how we handle data is evolving. And that, of course, you know, HIPAA, which goes back to, what, 2010 I believe -

Ben Yelin: The rule here does.

Dave Bittner: Yeah.

Ben Yelin: HIPAA itself goes back to the 1990s.

Dave Bittner: Oh, okay.

Ben Yelin: Yeah.

Dave Bittner: That HIPAA, you know, does - while great at covering a lot of things, is incomplete in our digital landscape here. So, some of the changes that they're seeking to make would focus on health data privacy, cover more entities, and be more specific about covering things like apps, telehealth services, you know, data brokers, digital advertisers. All things that have kind of come to be, you know, in the time between when the original Health Breach Notification Rule and HIPAA itself were brought into action here. They're trying to clarify the regulatory guidance, make it a little easier for, you know, smaller companies who may not have a big legal team here. I think it's interesting here that - like a bigger picture here, it's kind of relating to what we were talking about in our - in your story, is the FTC is taking action here I believe partly because of the inaction of Congress. And the FTC has the ability to take action when Congress is unable to or unwilling to. Do you think that's a fair take?

Ben Yelin: It is. I, basically, have a couple of points on this to get across. One is I think the two biggest innovations in this space have been health care app - related applications. And, honestly, this has come to light frequently with period tracking applications in the wake of the Dobbs decision, that that's actually collecting a lot of very personal information that could be used against people who are in states that are hostile to abortion rights. So, if you didn't know about it before the Dobbs decision, I think you certainly know about it now.

Dave Bittner: Right.

Ben Yelin: And then telehealth. Telehealth was sort of a thing prior to COVID. There were certainly telehealthesque services, particularly in rural areas. But I think like the mass use of telehealth and telehealth as a replacement for going into the doctor's office is really something that's taken off in the last four years. I know Maryland, the state that we're in, didn't actually allow telehealth appointments prior to COVID. The governor in an emergency regulation waived that requirement. And people liked it enough that telehealth has stuck around. I know - my wife is a provider and now a good portion of her appointments are via telehealth.

Dave Bittner: Yeah, the sky didn't fall. Right?

Ben Yelin: Exactly. And why not do it? I mean, particularly when we were in the midst of a global pandemic. If you can get health services just to obtain a prescription, you know, or to show your doctor your pink eye -

Dave Bittner: Yeah.

Ben Yelin: Or, you know, some other infection on your body through the video, why risk going to the office and catching a communicable disease? So, I think that's really the backdrop here. In terms of the legal concept, this highlights something that's really important in administrative law. And I don't mean to get too much in the weeds here, but there is this thing called the Chevron Doctrine. It comes from a 1980s case called - I believe it was Chevron versus the Natural Resources something. Basically, that decision held that if statutory language is extremely clear, then administrative agencies have to defer to those stat - to that statutory language. But if the language is ambiguous, then the administrative agencies have some deference to interpret the language as they see fit to meet modern circumstances. So, Chevron has allowed federal agencies like the FTC to enforce actions like this, even when Congress stays silent, because this is a reasonable and not arbitrary interpretation of HIPAA and some of the laws enacted concurrent with HIPAA. The concern here is that there are at least two or three Supreme Court justices who are extremely hostile to this notion of Chevron deference. I know Justice Neil Gorsuch, who was confirmed in 2017, has written extensively about how he hates Chevron deference and that it's been bad for the balance of power between Congress and administrative agencies. And if that deference is struck down, then agencies wouldn't be able to take action like this because this is something that's not authorized by the clear language of the statute. So, I know that seems very legalese in the weeds, but I think that could be very relevant for something like this where we need to give agencies some deference to improve regulations to meet modern circumstances.

Dave Bittner: Yeah. This article points out that the FTC back in February of last year enforced this rule for the first time. They used it against prescription drug provider GoodRx and then again in May against a fertility tracking app called Premom. And both of those companies were alleged to have shared individuals' health data with third parties without proper disclosures. And, I mean, again, getting back to our previous story.

Ben Yelin: Right, right. And I think it's good to see some administrative action here. It's very hard for individual users to know that their data has been released to a private entity and sold to data brokers. Certainly, if that data is released, it's hard for an individual user to sue a, you know, giant conglomerate tech company. People just simply don't have the resources. So, I think it is incumbent upon agencies like the FTC to have these administrative fines and consequences for these companies that are being fast and loose with people's data. And we're talking about health data. It's stuff that's very personal. I mean, I think that's what makes health data so unique.

Dave Bittner: Is it fair to say that in general HIPAA has been considered a success?

Ben Yelin: I think so. I think HIPAA is limited in its application to covered entities.

Dave Bittner: Yeah.

Ben Yelin: And it seems like sometimes it takes a while for administrative agencies to catch up with the kind of health organizations that people are actually using. So, online health portals came online in the 2000s and it took a while for federal regulators to make those health - online health portals be covered entities under HIPAA. And that's just kind of the nature of the law. But I think the purpose of HIPAA, obviously, was sound. And despite kind of the mass misunderstanding of what HIPAA is and what it does, I do think it's been worthwhile and it's been a success.

Dave Bittner: Suppose the Supreme Court does strike down, you know, an organization like the FTC's ability to come at these sorts of things. And, so, then it kicks back to Congress. Right?

Ben Yelin: Right.

Dave Bittner: And then so if Congress does what Congress does best -

Ben Yelin: Nothing.

Dave Bittner: Lately, which is nothing, does that leave us in a bit of a wild west situation here where there's no avenue for enforcement?

Ben Yelin: Pretty much. And that's what a lot of people want. I mean, it would defang administrative agencies.

Dave Bittner: I see.

Ben Yelin: If you are hostile to administrative agencies and you think they're butting in on your business, which a lot of people do -

Dave Bittner: Yeah.

Ben Yelin: Probably like half the country thinks that.

Dave Bittner: Right.

Ben Yelin: Then you'd be in favor of getting rid of something like Chevron deference. That's why a lot of people are in favor of it. But, yeah, that would be the obvious consequence there is there would be no ability for these agencies to be flexible with vague language in federal statutes. Congress would have to step in. And it's - as we've mentioned many times, it's hard to pass a piece of legislation in Congress, even if you have majority support. Congress is a cartel. What bills come to the floor is decided by leadership very frequently, even if there is majority support. So - and they're also bound by arcane Senate rules that introduce these time constraints on the things that you can do in a given congressional session. So, yeah, I mean, it would be - I think it would be a major change and it's something we need to look out for over the next few years.

Dave Bittner: All right. Well, we will have links to both of these stories in the show notes. And, of course, we would love to hear from you. If there's something that you would like us to consider for the show or you have some feedback, you can email us. Ben, I recently had the pleasure of speaking with author Chris Smith about his book "Privacy Pandemic: How Cybercriminals Determine Targets, Attack Identities, and Violate Privacy." Chris himself was a victim of some digital identity theft. A really interesting conversation here with author Chris Smith.

Chris Smith: So, back in 2018, I was running global business development for a hot new crypto company called Civic. And, at the time, I was in a romantic relationship with someone and had no idea that one of the - one of her goals and her collaborator's goals was to compromise my entire digital existence and my family's existence. So, for 18 months, my iCloud was compromised. And every time we tried to address it through, you know, working with my tech guys at Civic, you know, hiring digital forensics specialists and meeting with people like the FBI, we could never fully figure out how they compromised me to that level, until we learned that my mom's house and her router was the attack surface essentially. And, so, every time I'd go home and visit my family, it would happ - it would start to happen every couple weeks after that. And we couldn't figure out why until we learned about my mom's router being compromised. So, I decided to write a book for, say, the 95% of the consumer population that never changes their iCloud passwords or updates their security and privacy settings and to use my personal story as a way to guide readers through what the current threats are in the world of cybercrime and identity theft.

Dave Bittner: Do you suppose that you were targeted specifically or was this an opportunistic sort of thing?

Chris Smith: So, it's very clear based on the data and the forensics that my team and I have now, you know, reviewed, you know, a couple years ago that they compromised me three weeks before I joined Civic. So, it's very clear what their goal was: to try and embed themselves into my iCloud, which was, you know, at the time my personal iCloud connected to everything at Civic, which I never do anymore. So, I believe it was opportunistic because I was joining a very hot crypto company that had just raised $33 million in an ICO and it was - I think it was more opportunistic.

Dave Bittner: And it strikes me that there's the added element of having a romance scam thrown in as well.

Chris Smith: Yeah, I mean, I think that was one of the hardest things that I needed to get over. But once we realized that they had compromised my mom's house, I got over it pretty quickly and got to action in terms of wanting to take them down and hired experts in privacy and digital crisis management and talked to, you know, former CIA F - you know, NSA type people. And, yeah, it wasn't easy, but it definitely propelled me into being more active about doing everything I could to take them down. But, you know, the one thing about that is that I'm definitely not the first person that's ever experienced something like this and I won't be the last. And I read about - I see articles all the time on LinkedIn or, you know, news outlets about elderly people losing everything because they were scammed or, you know, these pig butchering type scams that are happening a lot in crypto or were. And, so, I - you know, it was a difficult thing to get over, but, you know, I'm not the first and I won't be the last, unfortunately.

Dave Bittner: It strikes me that, you know, you certainly being a person of above average knowledge and capabilities when it comes to things like cybersecurity, that's really a cautionary tale for the rest of us, you know, folks who don't have that particular skillset that you do.

Chris Smith: So, that's a really important point. So, I knew virtually nothing about cybersecurity until this happened. And while I worked in the technology space and I understood things like why multifactor was important and, you know, why updating your passwords and your credentials was important, right, like I was probably, like you said, above average in terms of my security knowledge, but from a cyber perspective, I had very little knowledge. Right? And over the last five years, I have met with some amazing people that have helped educate me. And I'm looking at it from a very different perspective because I'm someone that has experienced a cyberattack and identity theft. And the only way that I know how to deal with things like that is to throw myself into what I'm experiencing. And, so, the amount of people that I have engaged with either to hire to help me do forensics or to help me understand how and why this happened has really opened my eyes to what's really happening around the world and what we need to do to better protect the everyday American and the everyday global consumer. So, I didn't have true cyber experience, but I've learned from some amazing people in the industry.

Dave Bittner: Can you take us through the experience, the successes, the frustrations, you know, what you learned about our ability to counter this sort of thing?

Chris Smith: So, I think one of the biggest challenges that I see is, you know, most victims or targets of identity theft or someone that's been a part of a breach, like at 23andMe, which in my opinion is a really horrific breach based on the fact that people's DNA was exposed and the bad actors were building lists of types of people, is there's really no retribution most of the time for targets or victims of these crimes because you generally do not know who you are going to go after. And while I had a very good sense of why I was being targeted and by who, they were still never held to account, which is part of the reason why we had to, you know, change names and locations in the account in "Privacy Pandemic." But I think that's the biggest challenge is you have companies like Okta or 23andMe that have been in the news a lot lately that come out and say, "Oh, it was only 1% of our customer database," and then it turns out, like in Okta's case they blamed an employee for the breach, and then it turns out that their entire customer database was breached. And that's I think probably close to a billion username and password credentials that are out there. And, so, there's no warning system for us as the consumer or the user of these products to be able to fight back. And, so, I think that's one of the biggest challenges that I see as someone that's coming into the industry and building a new company to help solve that gap in our opinion.

Dave Bittner: And one of the things that you touch on in the book is this notion of self-sabotage, as you describe it, and how the criminals rely on that. Can you take us through what you mean by that?

Chris Smith: Yeah. So, I think, you know, in most romantic relationships, you start to build trust with your partner. Right? And, so, I think that we start to reveal a lot of information about ourselves to the people that we're in love with or, you know, people that are really close friends of ours. And what I've come to realize is that you have to be extremely careful of the amount of information that you share, not because you shouldn't feel like you can't trust your partner or, you know, your best friend or whatever, but you just never know their motives. But, also, what I've learned is you never know if they might be targeted to get to you. Right? And, so, that's a very scary reality that I stepped into because of what happened to me. But after speaking to lots of, you know, people in this world, like Eva Velasquez, who was very kind and wrote the foreword to my book, she's the president and CEO at the Identity Theft Resource Center, I've learned so much from Eva and her team about family fraud and how kids get their social security number destroyed before they even turn 18 because a parent or a relative has been leveraging their social security number. So, I think when it comes back to self-sabotage it's just really understanding who has access to something as simple as the passcode on your phone or on your device, and just making sure that you stay on top of it because you just never know where a leak might be. And it might be someone very close to you.

Dave Bittner: I'm curious, you know, based on everything that you've been through, do you have any specific advice for the policymakers out there?

Chris Smith: Well, I think the Biden administration is doing what they can. Obviously, we need to take it a step further. So, one, you know, the recent executive order on artificial intelligence I think is a really good step, especially what's happened recently with, you know, Sam and OpenAI and everything that's going on there. But, two, I think we need to have some federal regulations in place around two things. One, if a company does get breached and it impacts millions of Americans or millions of global consumers, I think there needs to be stronger - a stronger stance on when that happens and how the company needs to be able to report on what happened and be able to make whole anyone that may experience some sort of cyberattack or identity theft because of the breach. I mean, there's like 24 billion username and credential combinations available for purchase on the dark web and that didn't happen because we all decided to put all of our information out there. So, I think because of what states like California are doing around privacy and, you know, other states are looking at doing or having similar, you know, laws within their state, I think the government needs to come in and say, "How can we have a significant role if a company does get breached?" I think the second thing that we need to think about as consumers is being our best ally when it comes to protecting our digital life because the amount of breaches that are happening on an annual basis is appalling, that we as consumers want to be able to trust the businesses that we're, you know, doing business with and if they're not protecting our digital identity, that's a problem. So, I think that there needs to be an extreme awareness on the consumer side. And I do believe that organizations like the Web3 Identity Coalition, the Digital Chamber of Commerce, they are trying to do everything that they can to educate the policymakers so that they understand how big the threat is to the everyday American.

Dave Bittner: What are the takeaways for you? What do you hope that people who have read the book will come away with?

Chris Smith: The biggest thing that I'm hearing from people that have read the book is their own awareness about what they haven't been doing to protect themselves. So, the first thing that I hope that people take away from the book is having a better understanding of their own digital privacy and security and how that impacts their daily life. Two, I hope they have a better understanding of what's really happening every time we as consumers log on to the internet and even have conversations like this and what our data - where our data goes, how it's being used. And, finally, I just think helping them understand how they can become more vigilant in their day-to-day lives about the choices that we make when it comes to signing up for this service and how could this potentially impact my digital privacy and security. And just being more aware because most of the people I assume that are going to be listening to this podcast are what I would deem as the 5% of the world that understand cybersecurity and, you know, digital privacy in a very specific way, but the everyday consumer may not be thinking about the fact that they haven't updated their iCloud password since 2014 and how that could potentially impact their entire digital life.

Dave Bittner: Ben, what do you think?

Ben Yelin: I just really admire Mr. Smith for using his own experience to pass on advice to others, talking about how much work it was to get his life back after all of these fraudsters stole his private information and there are little ways that they can attack it even to this day. It's been, what, several years -

Dave Bittner: Yeah.

Ben Yelin: Since this has happened. So, it's just encouraging for me to see him use this experience to inform the rest of us on what the risks are and how we can ameliorate those risks.

Dave Bittner: Yeah. Really a harrowing situation to be in. And we certainly appreciate him reaching out to us and sharing his story. Again, that's author Chris Smith and the book is titled "Privacy Pandemic: How Cybercriminals Determine Targets, Attack Identities, and Violate Privacy." We appreciate him taking the time. That is our show. We want to thank all of you for listening. A quick reminder that N2K's strategic workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more on our website. Our executive producer is Jennifer Eiben. The show is edited by Tré Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.