Caveat | Ep 143 | 9.29.22

Breaking down the Gramm-Leach-Bliley Act.

Transcript

Bob Maley: The new updates that they're proposing - it brings it a little bit more into today.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a new lawsuit filed against a local public utility for providing user data to law enforcement. I've got the story of California's Age-Appropriate Design Code Act. And later in the show, we've got Bob Maley from Black Kite. He's here to discuss the Gramm-Leach-Bliley Act, which is a federal regulation that requires financial institutions to safeguard sensitive customer information. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, let's jump right into our stories here. Why don't you kick things off for us? 

Ben Yelin: So mine comes from the good folks at the Electronic Frontier Foundation. And it is a lawsuit being filed against the city of Sacramento, Calif., the capital of my native state, and specifically the Sacramento Municipal Utility District, which is the public utility in the city of Sacramento. So the allegation is that the utility has been giving a treasure trove of user data, sometimes from entire ZIP codes, to law enforcement to help with drug arrests. Basically, the idea is with smart meters, you can get a very accurate reading of somebody's electricity use, in 15-minute increments. And so if something is suspicious, then that's something that law enforcement might be interested in. 

Ben Yelin: I guess the logic here is that if you're using grow lamps, that's going to use an untoward amount of electricity. That will set off alarm bells among the local law enforcement agency. They might want to initiate either prosecutions or civil fines. 
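(To make the screening logic Ben is describing concrete, here is a minimal sketch, in Python, of flagging sustained high usage from 15-minute smart-meter readings. Everything in it - the data layout, the 3.0 kWh cutoff, the 80% threshold and the flag_households helper - is hypothetical, not anything drawn from the lawsuit or from SMUD's actual practices.)

```python
# Hypothetical sketch of the kind of threshold screening described above.
# Readings are assumed to arrive as (household_id, kwh) pairs, one per
# 15-minute interval; the cutoff and fraction below are invented numbers.
from collections import defaultdict

def flag_households(readings, kwh_cutoff=3.0, sustained_fraction=0.8):
    """Return households whose usage exceeds the cutoff in most intervals."""
    counts = defaultdict(lambda: [0, 0])  # household -> [high intervals, total intervals]
    for household_id, kwh in readings:
        counts[household_id][1] += 1
        if kwh >= kwh_cutoff:
            counts[household_id][0] += 1
    return [
        hid for hid, (high, total) in counts.items()
        if total and high / total >= sustained_fraction
    ]

# One household with a constant heavy draw, one with ordinary, spiky usage.
sample = [("A", 3.4), ("A", 3.6), ("A", 3.5), ("B", 0.4), ("B", 2.9), ("B", 0.5)]
print(flag_households(sample))  # ['A'] - note 'A' could just as easily be the senior citizen mentioned below
```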

Dave Bittner: Right. Or you could just be mining cryptocurrency, right (laughter)? 

Ben Yelin: You could be mining cryptocurrency. One of the people who is named in this lawsuit said that, hey, I'm a senior citizen. I keep all the lights on because I'm concerned about my own safety. 

Dave Bittner: Yeah. 

Ben Yelin: And I just happen to use a lot of electricity. So it's not necessarily that people are using grow lamps. 

Dave Bittner: Right. 

Ben Yelin: It also seems to me, and it certainly seems to the Electronic Frontier Foundation, that Sacramento put a policy in place to bust people for growing small amounts of marijuana that are still above the legal limits in California. 

Dave Bittner: OK. 

Ben Yelin: And they've done so to raise revenue for the city. 

Dave Bittner: Ah. 

Ben Yelin: They put in a program in 2017. They've raised several hundred million dollars through these fines. 

Dave Bittner: So it's a marijuana speed trap. 

Ben Yelin: Exactly. I mean, it's basically a speed camera. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: So the lawsuit's being filed on behalf of several groups. One of them is an interest group representing Asian Americans. And that's another very interesting element of this lawsuit: there is an allegation that this data is being selectively sought by law enforcement to target Asian American communities. And they introduce some pretty compelling evidence from members of the public utility and members of the Sacramento government that that seems to be the intention of the policy. They are disproportionately targeted. Something like 80% of the data comes from households with names of Asian descent. So it's both a discrimination lawsuit and, more interesting for our purposes, a claim under California's version of the Fourth Amendment protection against unreasonable searches and seizures, the idea being that you have a reasonable expectation of privacy in your own utility data. 

Ben Yelin: Setting aside the discrimination angle here, which I think is real, I unfortunately kind of question the premise that you do have a reasonable expectation of privacy in your utility data. This feels like a classic third-party doctrine case to me, where, just by the nature of the transactions that you engage in in your house - turning on the lights, turning on your grow lamps, blasting music, having six Amazon Alexas, whatever - you know that the utility company is going to keep a record of that. They're keeping a record of that usage. So you either are aware or should be aware that that data is being submitted to that utility, and you've lost an expectation of privacy in that data once it's been surrendered to the utility. And in that sense, I don't really think this is a Fourth Amendment search, because if there's not a reasonable expectation of privacy, there's no search. 

Ben Yelin: What I think that shows is the problematic nature of the third-party doctrine in the digital age. The fact that this utility can target specific ZIP codes and collect data that could easily be incriminating but doesn't necessarily indicate any type of criminal activity or criminal intent is bad and dangerous. And the fact that it's being used as kind of a money-grubbing scheme by the local Sacramento government makes it even more concerning. The other question is whether this is legal under California statute. So there is a relevant California statute that says public utilities generally shall not share, disclose or otherwise make accessible to any third party a customer's electrical consumption data. That sounds pretty reasonable. 

Dave Bittner: Yeah. 

Ben Yelin: But a separate law, the California Public Records Act, prohibits public utilities from disclosing consumer data, except upon court order or the request of a law enforcement agency relative to an ongoing investigation. So I think what the city is trying to say is, this is all an ongoing investigation of the problem of people growing marijuana in their own houses. 

Dave Bittner: It's like the war on terror. 

Ben Yelin: Yeah. 

Dave Bittner: It never ends (laughter). 

Ben Yelin: It never ends. I love these endless wars because they unlock so many fun powers. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: So what this means is, if you live in the city of Sacramento, especially if you are of Asian descent, you're under constant monitoring. And it's really going to chill your activity in your own house. And you might have a reasonable fear that if, for whatever reason, you're consuming a large amount of electricity, you're going to be under the watchful eye of law enforcement. So that's a major concern. Since about 2009, 2010, we've had these smart meters in use in most places in the country, and they can really get down to a granular level, whereas in the past, it was kind of an aggregated read of your electricity usage by several hours or by the day. So you couldn't tell if somebody was doing something really acute. And now you can. So I'm just kind of disturbed that this is a practice that's ongoing. I'm curious to see where this lawsuit goes. And it's just a story that really caught my eye. 

Dave Bittner: Do you have any thoughts or insights on the fact that they seem to be targeting Asian people here? That seems odd to me. 

Ben Yelin: It is odd. So there is a large Asian American population in the city of Sacramento. And in the complaint that was filed with the court, they quote a couple of public officials saying, yeah; this is a big problem among Asian households. And they kind of have the data to prove it. They are disproportionately surveilling Asian households. Somebody said something - and they noted this in the complaint - like, yeah; it's these Asian families again. 

Dave Bittner: Wow. I mean, is there any evidence that the Asian families in this community are disproportionately the ones who are growing? 

Ben Yelin: That is the allegation leveled by certain members of the local Sacramento police. I have no way to know the veracity of that data. 

Dave Bittner: Right. 

Ben Yelin: And you still can't racially profile people, even if they are disproportionately likely to commit certain illegal acts. California has statutes on that. And it seems like that's going on. If the targets are in ZIP codes where there are a lot of Asian American households, and you're not collecting from predominantly white households - which the evidence seems to indicate is the case - then it's just kind of rank discrimination. So I think it's no wonder that this Asian American advocacy organization has taken on this lawsuit, along with the Electronic Frontier Foundation. 

Dave Bittner: And what relief are they looking to get here? 

Ben Yelin: So they are just looking for a declaratory judgment. That actually, to me, is favorable to the plaintiffs because sometimes, if your prayer for relief is monetary relief - you know, give me hundreds of thousands of dollars for emotional distress - that can seem like a frivolous lawsuit, where you're really just money-grubbing. That's not what's happening here. They are asking for a declaratory judgment, basically telling the city of Sacramento to knock it off, to have the court enjoin the city from warrantlessly collecting this public utility data. And whatever nominal damages, attorneys' fees, etc. - they're requesting that as well. 

Ben Yelin: But really, it's just a declaratory judgment saying, this is wrong. This is illegal. This violates California statute. This violates California's equivalent of the Fourth Amendment. And it's something that just has to be stopped. 

Dave Bittner: Wow. I - one thing that strikes me is, I wonder if there's a case to be made here that if you're up to this sort of thing, maybe solar panels are the way to go. 

Ben Yelin: Yeah, exactly. 

Dave Bittner: (Laughter). Right? 

Ben Yelin: Maybe that's the solution. You know, sometimes with these outdated Fourth Amendment doctrines where it's, well, you don't have a reasonable expectation of privacy in your utility data, the law is so backwards that sometimes the best solution is to outmaneuver the government with technology. So I'll see your smart meter, and I'll raise you solar panels. 

Dave Bittner: Right. Right. 

Ben Yelin: That might be the most equitable solution if you're a family. Now, of course, that creates major equity concerns... 

Dave Bittner: Sure. 

Ben Yelin: ...'Cause I don't know if you've noticed, but with solar panels, the installation is relatively expensive. 

Dave Bittner: Yes. 

Ben Yelin: My parents keep insisting that I get solar panels on our house, which is great. I'd, ideally, love to do that. Then I get some sticker shock when I see how much it would cost to install them. 

Dave Bittner: Yeah. I know. I've been through the same thing. Yeah. 

Ben Yelin: So it's just not something that every family can do. 

Dave Bittner: Right. Right. All right. Well, that's an interesting case for sure. We'll have to keep an eye on that to see how it plays out. 

Ben Yelin: Yes. 

Dave Bittner: It's a good one. My story this week comes from a couple of places. There's an online site called the Hunton Privacy Blog, which is my primary source here. And then Techdirt had an interesting take on this case as well. California recently enacted the California Age-Appropriate Design Code Act, which the governor recently signed. So it is on the books now. And the purpose of this act is to, of course, Ben, protect the children. We got... 

Ben Yelin: It's always about protecting the children. 

Dave Bittner: We have to protect the children. 

Ben Yelin: If you just say protect the children in the law, it has to be good. That's what I've heard. Yeah. 

Dave Bittner: That's right. That's right. So what this act intends to do is protect kids by requiring that organizations set the default privacy settings offered by online services, products or features to a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children. That seems to me a loophole you could, like, drive a truck through. 

Ben Yelin: It sure does. Yep. 

Dave Bittner: (Laughter) Right? Right? They need to concisely and prominently provide privacy information, terms of service policies and community standards using clear language suited to the age of the children likely to access the online service, product or feature - so there goes your EULA. 

Ben Yelin: Yeah, I'm sure my 3-year-old would be... 

Dave Bittner: (Laughter) Right? 

Ben Yelin: ...You know, could very intelligently read the language here that disclaims responsibility. 

Dave Bittner: They have to complete a data protection impact assessment and, upon request, provide that to the California attorney general. They have to estimate the age of children users, or children who are going to be users of their site, within a reasonable level of certainty appropriate to the risks that arise from the business's data management practices. They have to enforce the published terms and policies. They have to provide prominent, accessible and responsive tools to help children to exercise their privacy rights and report concerns. In terms of enforcement, the California attorney general is tasked with enforcing this. Violators may be subject to a penalty of up to $2,500 per affected child for each negligent violation and up to $7,500 per affected child for each intentional violation. 
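(For a sense of scale, here is a small worked illustration of those penalty caps in Python. Only the per-child dollar figures come from the Act as Dave describes it; the affected-child counts are invented for the example.)

```python
# Hypothetical illustration of the Age-Appropriate Design Code Act's penalty
# caps as described above: up to $2,500 per affected child for a negligent
# violation, up to $7,500 per affected child for an intentional one.
NEGLIGENT_CAP = 2_500
INTENTIONAL_CAP = 7_500

def max_exposure(negligent_children: int, intentional_children: int) -> int:
    """Upper bound on penalties, in dollars, for one violation of each kind."""
    return negligent_children * NEGLIGENT_CAP + intentional_children * INTENTIONAL_CAP

print(max_exposure(10_000, 0))   # 25000000 - a negligent violation touching 10,000 children
print(max_exposure(0, 10_000))   # 75000000 - same population, but an intentional violation
```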

Dave Bittner: So, OK, this sounds great. I mean, we all want to make kids safer online. The folks over at Techdirt pointed out - in what I guess you could call an opinion piece by Mike Masnick - that we've been through this before. 

Ben Yelin: Yep. 

Dave Bittner: And the Supreme Court kind of swatted this down. And that's where I want you to step in here, Ben, and give us a little insight on what we're talking about here. 

Ben Yelin: Sure. So I think first we have to acknowledge that the problem they're trying to solve here is real. 

Dave Bittner: Yeah. 

Ben Yelin: We don't want children to be subject to questionable data collection practices. They are more vulnerable. And even though they have the same opportunity to read the EULA as we do, they are children. 

Dave Bittner: Right. 

Ben Yelin: So they're impulsive and don't make the same sort of wise decisions that we all make when we click agree to these terms of service. 

Dave Bittner: Right. 

Ben Yelin: So you have to first acknowledge that this is a problem that they're trying to resolve. I think the concern is that this is pretty blatantly unconstitutional, and the Supreme Court has basically said as much. There was a scare in the mid-'90s based on legitimate fears about child exploitation online that we needed to come up with some type of regulation protecting children's data on sites that they are likely to use. So this was a bill that was proposed in the mid-'90s. It turned into something that will probably sound familiar to our listeners - the Communications Decency Act. 

Dave Bittner: Oh. 

Ben Yelin: We generally only talk about Section 230 when we talk about that act, but there are a lot of other parts of that act intended to protect children online. And part of the legislative scheme was to deny minors access to, quote, "potentially harmful speech." The Supreme Court said that in order to do that, to deny minors access to that speech, the act, the Communications Decency Act, effectively suppresses a large amount of speech that adults have a constitutional right to receive and to address to one another. That burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the legitimate purpose that the statute was enacted to serve. 

Ben Yelin: So I think the idea is that you're being overbroad in putting a burden on speech if you even just suspect that a minor might view it. In California's defense, they kind of are trying to get around this unfavorable Supreme Court ruling by coming up with a definition of the type of content that they think would appeal to minors. 

Ben Yelin: So they list anything that's directed to children as defined by COPPA, the Children's Online Privacy Protection Act; something that is routinely accessed by a significant number of children, based on whatever competent and reliable evidence they have; advertisements marketed to children; something substantially similar to or the same as something accessed by children; design elements that are known to be of interest to children, including but not limited to games, cartoons, music and celebrities who appeal to children; or a significant amount of the audience being determined, based on the company's internal research, to be children. And I think that last one might be a way where they can say, yeah; some adults might view this stuff. But really, 90% of people who are watching "Blippi" videos are either parents of young children or the young children themselves. 

Dave Bittner: Right. 

Ben Yelin: I got to say, though, Dave, there are a lot of weird people who do weird things online. 

Dave Bittner: Yeah. 

Ben Yelin: And many of those people are interested in games, cartoons, music and celebrities who appeal to children. 

Dave Bittner: I've been known to watch an episode of "SpongeBob" from time to time. 

Ben Yelin: Yeah. I mean, "Bluey" is a very intelligent show. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: If I want to watch it after my kids fall asleep, when there's nothing else to do, I'm not going to, you know, necessarily deny that opportunity to myself. 

Dave Bittner: Sure. 

Ben Yelin: So I think this will have some sort of suppressive effect on adults who want to view some of this content. And I think the concern in our court system is these types of regulations tend to be overbroad. I'm not sure how you could more narrowly target it to content specifically geared towards children. I think perhaps the courts and legislatures across the country can work together or in an iterative way to really hone in on that definition, so it only covers the amount of material necessary to protect children. 

Ben Yelin: I've noticed just with my own kids and them watching YouTube Kids that YouTube does a pretty good job of curating content for children. I mean, all of the videos available on YouTube Kids, based on the algorithm, seem to me to be quite age-appropriate. And so, you know, there seems to be a way to do this targeting from the companies themselves. So maybe the government can hone in on their targeting policies as well so that these restrictions aren't overbroad. 

Dave Bittner: Why would California do this? I mean, they're aware of the previous Supreme Court precedent, right? They know this is likely to meet resistance from the tech companies and possibly even just get slapped down. Is this performative? I mean, what do you think is going on here? 

Ben Yelin: Some of it is to address a real problem. But yeah, some of it is performative. And sometimes you just want to test the court system. I mean, think about the story we did last week on the Texas law that prevents big tech companies from censoring based on political viewpoints. 

Dave Bittner: Right. 

Ben Yelin: That seemed, based on court precedent, to be blatantly unconstitutional. They passed it anyway. And the appeals court upheld the law. So you never know if you're going to get a favorable judicial opinion. Even if it does eventually make it up to the Supreme Court, you might get a period where the law is enforced. You have a new working majority on the Supreme Court. 

Dave Bittner: Right. So this is - if nothing else, it's an unpredictable Supreme Court at this moment in history, right? 

Ben Yelin: Exactly. So you don't know - I mean, they've sure been known to reverse some of their past decisions. 

Dave Bittner: Yeah. 

Ben Yelin: I can think of a couple of very prominent examples of that. 

Dave Bittner: Right. 

Ben Yelin: So, yeah. I mean, it could be just testing the court system. If you think that this is a societal problem and you are very interested in protecting kids, pass a law. Try to make the type of regulation as narrowly targeted as possible to the content that's actually intended for and viewed only by children. I can't see a way around the problem that there are just a lot of adults whose hobbies seem to appeal only to children. 

Dave Bittner: Yeah. 

Ben Yelin: I've spent enough time on the internet to know that that's the case. And I've... 

Dave Bittner: Yeah. 

Ben Yelin: There used to be a BronyCon convention in Baltimore City. 

Dave Bittner: Oh, yeah. Yeah, yeah. 

Ben Yelin: So those people exist. 

Dave Bittner: Right. Right. 

Ben Yelin: Yeah. So that seems to me to be the main problem here. 

Dave Bittner: Yeah. All right. Well, we will have links to those stories in our show notes, of course. And we would love to hear from you. If there's a topic you'd like us to consider for the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Bob Maley. He is from an organization called Black Kite. And we are discussing the Gramm-Leach-Bliley Act, the GLBA, which is a federal regulation that requires financial institutions to safeguard sensitive customer information. Here's my conversation with Bob Maley. 

Bob Maley: Well, it's been in place quite a long time. There were some issues - and I don't recall exactly which financial issue it was - but literally, when we talk about it, we talk about GLBA as being from back in the day. So 1999, I believe, is when it was first brought to bear. And there's been a number of updates over the last couple of decades, but it's been around a while. 

Dave Bittner: And so what are these updates that are going to be affecting some people anew here? 

Bob Maley: Well, the new update's really interesting. You know, this has always been in place for banking. And the banking industry is very adept at - you know, obviously, they need to be adept at - protecting their customers' information. The type of information shared with banks is significant. But, you know, if you look at any of the press, what's happening out in the criminal world is that we continue to see releases of data, ransomware and a number of other types of things that happen. 

Bob Maley: So, you know, the Federal Trade Commission has been involved with this for a number of years. They go back, and they look at information security - if you have a breach from information security laxness, they kind of took the view that that's a deceptive practice. And there are a couple of cases where they've actually fined companies for that - for deceptive practices. And I think that was a really unique way of trying to look at, how do we globally bring all businesses into some type of, you know, cybersecurity breach notification and those kinds of things? 

Bob Maley: So the new updates that they're proposing - it brings it a little bit more into today. There are some technical updates that they're talking about - things like multifactor authentication, secure destruction of data, the use of encryption. And again, that's across the board. And those are things that simply are good practice. Whenever there's a standard, whenever there's a regulation, technology changes. And if they fail to keep up with technology, then it goes out of date. So those components of it, I think - that's a great thing. 

Dave Bittner: And so who do we suspect that this update is going to affect most? 

Bob Maley: Well, it's not really going to affect the banking industry because they're already doing most of the things that are in the new updates. Where it's really going to have an effect is on a whole new group of businesses that are not in the banking sector. They're nonbanking companies. But because of their business process, they may be involved in enabling some type of financial product or service for their customers. And these are ones that you might not think about. When you go to a car dealer, and you want to buy a car, and you want to have the car dealer finance that, well, that act of the car dealer enabling a financial product or service now brings the car dealer under the scope of what they're proposing in GLBA. 

Bob Maley: And it's not just car dealers. It could be real estate appraisal services, collection agencies, check cashing services, tax prep firms. It goes on and on. And it's really going to broaden the scope of small businesses and companies that now really have to pay attention. And it's not really just the small businesses, though. There's another sector that'll be significantly impacted, and that is higher education. How many people go to a university or an institution? And where do they get their student loans from? Those universities have departments and divisions to figure out, well, how can we put you in touch with the right lender so you can get your education loan so you can go to school here? So now that brings them within the scope of the updates. 

Dave Bittner: You know, whenever we're talking about regulations, you know, one person's protection is another person's undue burden. In terms of feedback to this, you know, what sorts of things are we hearing? 

Bob Maley: Well, we really haven't heard a lot yet, simply because this has kind of been under the radar. It's just getting out into the press. People are just starting to talk about it. But I can imagine that there will be some weeping and gnashing of teeth. And it's hard to be in my position - but I do empathize with it. But, you know, we do need to protect that data. You know, for whatever business we're in, whatever services we're providing for our customers, we have some type of data from them, whether it's personal information or financial information, and we should be doing the best things that we can in order to protect them. But in reality, and especially in today's economy, most businesses are struggling just to stay afloat. And to now have to either improve or, in a lot of cases, maybe bring in brand-new security programs - it will be very challenging. 

Dave Bittner: Are there any specific things that come to mind here in terms of, you know, the practical things that people will have to put in place in response to this? 

Bob Maley: Well, one of the first things is they're going to have to have some kind of a security program. And it doesn't really specify how complex that security program is. My guess is the requirements are going to be aligned with NIST standards. Most regulations, that's what they do. So your security program involves quite a few things - again, I talked a little bit about some of the technical controls. And that's, you know, multifactor authentication and, you know, the use of encryption. 

Bob Maley: And, you know, multifactor authentication - everybody talks about it. It's part of zero trust. Every new executive order that comes out from the White House, every regulation, every framework - multifactor authentication - first thing that you should do. But in reality, that is something that decreases the ease of transactions. Yes, it makes it safer. But when you want to go log into your bank, and your bank tells you, oh, well, we need to get the code that we're going to send you on your phone - now, for a security-minded person like myself, that gives me comfort. I don't mind doing that. But for many people, that increases friction in the transaction. And in the business world, if you add friction, occasionally what will happen is that transaction will be aborted. They'll stop. You'll lose that sale, so to speak. So we try to make everything that we do as frictionless as possible. So there's that balance. 

Bob Maley: So we're going to be required to do MFA. And how do we do that with the least amount of friction possible? And that's just one thing. There's another thing: every one of these businesses is going to have to appoint somebody - essentially, they don't name it as a chief information officer or a chief security officer. What they do say is that there has to be someone in that program who is accountable for that program. They have to do a risk assessment of their entire organization's information security on a regular basis, and they have to report that to the board. So those are the components of a security program that, my guess is, a lot of these new organizations are going to be subject to. That's going to get a deer-in-the-headlights look. That's not their core business. That's not what they do. So that'll be challenging. 
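(As a concrete picture of the second-factor step Bob is describing, here is a minimal sketch of a time-based one-time-password check using the pyotp library. The prompt and flow are hypothetical; the point is simply the extra round trip that adds the friction he mentions.)

```python
# Hypothetical sketch of the extra MFA step discussed above, using pyotp.
import pyotp

secret = pyotp.random_base32()   # in practice, provisioned once into the user's authenticator app
totp = pyotp.TOTP(secret)

code = input("Enter the 6-digit code from your authenticator app: ")
if totp.verify(code):
    print("Second factor accepted - transaction continues.")
else:
    print("Code rejected - this is the point where a customer may abandon the transaction.")
```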

Dave Bittner: Yeah. I mean, it is fascinating, isn't it? Now, what options does the FTC have here in terms of, you know, carrot versus stick and how they can come at the organizations who are going to fall under this? 

Bob Maley: Well, obviously, the regulation is the stick. And I don't think there's really, so to speak, a carrot. But from what I understand, as far as enforcement goes, it's not going to be the same as it is in the banking industry. So in the banking industry, you have regulatory bodies that you have to report to. They come in, and they audit you. They look at your security program. So there is enforcement in that regulated world. Even though this is a new regulation, the FTC will be the ones enforcing it. So my guess is the stick will come out if you are subject to a breach. So if you have a breach and you haven't been following these protocols, at that point, that's when you're going to get hit with the stick. 

Dave Bittner: I see. So it really may be more reactive than proactive in terms of how the FTC approaches this for these particular types of organizations. 

Bob Maley: Yeah. And I get that. How would you go about bringing in all of these companies? We don't have numbers on this, but I assume it's in the hundreds of thousands of companies that are now going to be subject to this. How would you manage that? How would you audit that? So logistically, I just don't see that happening. 

Dave Bittner: Yeah. What are your recommendations, then? You know, if an organization thinks that they're going to fall under this, what sort of things should they be doing and putting in place? 

Bob Maley: It's not just these organizations. I know I've been speaking a lot about the problem I see with best practices today - you know, we have this standard set of best practices, outlined in things like NIST, that we should be doing. And obviously, this is a new set of best practices that are for a much wider audience. So, you know, we look at the history of these best practices and how they've actually impacted cybercrime. 

Bob Maley: And there was a recent study published in Cybercrime Magazine about the cost of ransomware that says over the last six years, the cost of ransomware has gone up 57 times. So to me, best practices the way they're done today aren't really working, and that's because they're so complex. If you're a brand-new organization that's now being covered by this and you look at all these things that we just talked about, all these technical controls - that we have to have a security program, we have to have incident response - yeah, it's like, what do you do? Well, you have to do something. So you start out with the basics. And one of the basics that we try to teach people is that you have to think like the bad actors. And how are the bad actors going after organizations? 

Bob Maley: And there are certain technical things that bad actors use - they're called tactics, techniques and procedures, and researchers know what they are - to target specific things. And you need to understand what those specific things are and address them first as part of your security program to ensure that you're not that company that's going to be under the scope of the FTC after a breach. Preventing the breach is the best thing you can do. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: Yeah. I mean, Gramm-Leach-Bliley has been a tool in the arsenal of regulators for more than two decades now. But I think, as we are evolving through the digital age, depending on the presidential administration, it's a very effective way to regulate online behavior because even though it's intended to regulate financial institutions, a lot of our financial transactions take place online. 

Dave Bittner: Right. 

Ben Yelin: So it's just a very effective tool in the toolbox. And it looks like they've taken pretty robust action in strengthening the reach of this law... 

Dave Bittner: Yeah. 

Ben Yelin: ...Through these regulations. 

Dave Bittner: Yeah. It's fascinating. All right. Well, again, our thanks to Bob Maley from Black Kite for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.