Threat Vector | Ep 24 | 6.6.24

Beyond Compliance: Using Technology to Empower Privacy and Security with Daniel Rosenzweig

Transcript

David Moulton: Welcome to "Threat Vector," the Palo Alto Networks podcast. Join us as we navigate pressing cybersecurity threats, discuss robust protection strategies, and uncover the latest industry trends. I'm your host, David Moulton, director of Thought Leadership for Unit 42. [ Music ] In this episode, we're diving into two compelling topics. First, how a strong legal and technical partnership can help businesses navigate the complex world of data privacy and cybersecurity laws. And second, we'll explore the impact of emerging AI technologies on these legal frameworks and how businesses can adapt. My guest today is Dan Rosenzweig, the founder of a boutique data privacy law firm that specializes in all the things you'd expect from a typical data privacy shop. They do everything from privacy impact assessments and risk assessments to helping their clients operationalize their legal requirements. Dan summed up the practice for me as the translation layer from legal to tech and vice versa. The information provided on this podcast is not intended to constitute legal advice. All information presented is for general informational purposes only. The information contained may not constitute the most up-to-date legal or interpretive compliance guidance. Contact your own attorney to obtain advice with respect to any particular legal matter. Let's get right into our conversation. Dan, when you and I talked before, off mic, you talked about how you took the time to learn something else and blended your background in law with an interest in, and a talent for, front-end development. And it's kind of a peanut butter and chocolate space for you. And it's impressive that you were able to find such a silver lining in what could have been, and probably was, a tough time.

Dan Rosenzweig: Yeah.

David Moulton: But, you know, to have something beautiful grow out of that is amazing. So, you know, good on you for finding code and deciding to dig into it while you were recovering.

Dan Rosenzweig: Yeah. And I really- I've developed a passion for all things tech, right? I mean, I think- you know, there's a couple of things and I was actually speaking a couple days ago at an event, and it really, in my view, comes down to pretty much a couple of things that most folks, particularly lawyers, don't really follow. And I think the first one is, and it's super corny and cliche, but it's corny and cliche for a reason. I think it's very important to find something that makes you happy and let that drive your desires and really get you to where you want to go. Because work is a big part of what we do on a day-to-day basis, and finding a passion and happiness is really, really important. And I think a lot of people lose sight of that, and that becomes problematic for their own lives and the people they love and those around them. And I think if you find that passion and you're happy, you can really, you know, grow your career, even start your own shop, whatever it is that you want to do.

David Moulton: Your firm presents itself as the bridge between law and technology. And I think maybe I'll just start, like, how does that work?

Dan Rosenzweig: To level set: the data privacy and cyber space is evolving drastically, literally on a daily basis, whether it's new laws being introduced, new regulations being introduced, not to mention the technology. We have AI, we have, you know, the upcoming deprecation of third-party cookies, and things are just evolving and changing on a daily basis. And a lot of these laws, particularly laws like GDPR, CCPA, and others that are now following suit, were enacted as a response. So, because it's a response to tech, it's important to understand the underlying tech that's driving that. So for example, in the US, particularly under the CCPA, there is the notion of giving consumers the ability to opt out of sales and shares of personal information, i.e., targeted advertising. And it's one thing for the law to say that, right? But it's a whole other thing to actually make that happen, technically speaking. So if you're going to give a consumer the ability to unsubscribe, or give them the ability to opt out of targeted advertising, which is typically done through a toggle or a footer on the bottom of a webpage next to their privacy policy, things of that nature, then you need to honor it, right? Long gone are the days where companies can feel like they've accomplished their data privacy and even their cybersecurity compliance by virtue of drafting policies. Now, the policies are incredibly important, I'm not at all belittling them, but you have to actually action those policies. And the folks that are actioning them are the developers, the product teams, those folks.
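The "honor it" step Dan describes can be sketched as a gate in front of any ad-tech call. This is an illustrative sketch only, not legal guidance and not anything from the firm's practice; the type and function names are hypothetical, though the Global Privacy Control browser signal itself is a real mechanism.

```typescript
// Hypothetical opt-out gate for a CCPA-style "Do Not Sell or Share" flow.
// Field and function names are illustrative.

interface OptOutSignals {
  gpcSignal: boolean;            // Global Privacy Control signal from the browser
  footerToggleOptedOut: boolean; // state of the "Do Not Sell or Share" footer toggle
}

// Treat the consumer as opted out if ANY recognized signal says so; honoring
// the opt-out means the targeted-advertising pixel simply never fires.
function mayShareForTargetedAds(signals: OptOutSignals): boolean {
  return !(signals.gpcSignal || signals.footerToggleOptedOut);
}
```

A page would consult this gate before loading any sale/share pixel, which is exactly the "action the policy" step Dan says falls to developers.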

David Moulton: Dan, is that what really drove you to have this sort of epiphany that an attorney needed to have an understanding of the technical side, or somebody on the technical side had to understand the intent, the spirit of the law, and why it was important to marry those two things together? Is that essentially what drives you?

Dan Rosenzweig: I think, yeah. I think that's actually well said. I think that's a big portion of this, and I think, unfortunately, legal generally is viewed as an impediment or the bad guy. Oh, we don't want to get legal involved. And I think we've come a long way, at least in this space, in the data privacy and cyber space, where legal can be a really powerful partner, right? Because ultimately, consumers are expecting this, right? They care about their privacy, their security, right? We hear about breaches all day, every day, and I think consumers are now a lot more aware of what's happening, and as they should be, right? I think it's a great thing that consumers are becoming more attuned to what's happening with their data, but it's on the company, as the one operating, to own the responsibility with respect to their consumer data.

David Moulton: Right. Can you give me an example where that legal expertise has really come in, significantly influenced a tech company's policy, whether it's privacy policy or, you know, some other piece and you're talking about legal being that powerful partner. What does that look like?

Dan Rosenzweig: So, to bore people with a quick history lesson: there is a law called the Video Privacy Protection Act, the VPPA, and it was enacted in the late '80s, obviously way before, you know, where we are now with the internet and ad tech and all these other, you know, technologies. That law has really come up, and it's alive. It's alive and well, right? Many, many companies are being sued for alleged violations of the law. And essentially, to boil it down, what the law prohibits is the disclosure of personal information in conjunction with video information to a third party. Unless an exception applies, that's generally the rule, right? So are you disclosing the who, which is the consumer, and the what, which is the video, to the third party, right? Ultimately, that's becoming an alleged violation of this law. So that's what the law says. But what does that mean from a technology perspective? Often it's the case, particularly in the media space, that publishers are the ones hosting videos and the ones getting, you know, dinged for the alleged violation of this law. So on the technical side, we have two options. We can either, you know, remove the pixel or the technology that is resulting in the transmission of that who and that what, the personal information plus the video information, to a social media company or another third party that isn't the business. Or there are some technical options. Now, option 1, with respect to, you know, suppression, removing that pixel or removing that technology, that's all well and good, right? That's something that can be legally compliant. But if you tell a publisher or a media company, or you, David, on the marketing side, hey, we've got to remove a bunch of these pixels, you're going to respond with, well, we have a legitimate business need for using these technologies. We can't just remove them. You know, that's going to be a huge impediment to our business.
So that got me thinking, well, there has to be some technical nuance here, a way we can mitigate legal exposure by virtue of using available technology. So what I've been able to work on with clients, and a lot of my clients are multimedia companies and big publishers, is: okay, we've decided that we need to continue using this technology. We can't do it in a way that, you know, violates the law. And although we could comply with the law by removing the technology, that's going to impact our ad revenue or impact our business.

David Moulton: Sure.

Dan Rosenzweig: So what do we do technically? So I work with them on creating operational controls, and I'll work directly with product or the developers to configure the technology in a way that will mitigate that transmission of the who and the what to the third party.
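A minimal sketch of what that configuration could look like, with hypothetical field names; the actual controls depend on the specific pixel or tag manager involved, and this is an illustration of the idea, not a compliance mechanism.

```typescript
// Hypothetical "scalpel" configuration: ensure the "who" (a consumer
// identifier) and the "what" (video information) never leave the site
// together in the same third-party transmission.

interface PixelEvent {
  userId?: string;     // the "who"
  videoTitle?: string; // the "what"
  pageUrl: string;
}

// Strip the consumer identifier whenever video information is present, so the
// third party never receives the who-plus-what combination the VPPA targets.
function configureForVideoEvents(event: PixelEvent): PixelEvent {
  if (event.videoTitle !== undefined) {
    const redacted = { ...event };
    delete redacted.userId;
    return redacted;
  }
  return event; // non-video events pass through unchanged
}
```

The design choice mirrors David's "third option": the pixel keeps firing for its legitimate uses, and only the problematic pairing is suppressed.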

David Moulton: Dan, if I replay this, there's a tracking pixel on a website. Somebody's come along and said, please don't track me. I don't want to be advertised to, I don't want to be a part of, you know, whatever that collection is. And the two blunt options here are violate the law or take the pixel out, which then shuts down any valid use for folks that are not opting out. And what you're saying is that there's this third option, which is configuration: looking at that pixel and saying, okay, for those that have raised their hand and said, no thanks, you can honor that, stay compliant. But then all the other uses of that tracking pixel, the ones that don't violate the law and don't violate a customer's preferences, are still in play.

Dan Rosenzweig: I would say that's very well said. The only nuance from the legal side is that in this particular law, it's actually not opt out. So the consumer isn't opting out. It's actually, if you want to do this, you technically need to gain or obtain a very granular, specific level of consent, which is very difficult to do under this particular law. So rather than having to go and obtain consent, we're doing exactly what you just alluded to, which is, we're using a scalpel, right? We're going to configure the technology in a way that will mitigate our exposure and enable a better compliance mechanism.

David Moulton: So, it's still legal. And forgive me, this is new for me, what the nuance is.

Dan Rosenzweig: No problem.

David Moulton: And I think this is why that understanding of the legal rule and of the technology is important, marrying those two things together. I'd call that the peanut butter and chocolate set of skills. So I'll show my ignorance here, maybe, but, you know, I'd rather ask the question now than stay dumb forever. Doesn't Section 230 protect multimedia companies, publishing companies from, you know, lawsuits? Or is that for the content that comes from the social media user, and it doesn't protect the advertiser, it doesn't protect the website owner? Is that the split here?

Dan Rosenzweig: Yeah. So Section 230 is actually completely separate from all of this. That, essentially, at a high level, provides protection to the website host or the publisher with respect to comments or other types of exchange happening on their site with content they're not creating. So if someone goes onto Facebook or another social media platform, or they're making comments on a news site or a news aggregator, the publisher or the company that's responsible for that business is not necessarily going to be deemed liable for the comments and content being created by its users. That's, at a high level, what it is. Whereas laws like the VPPA, the Video Privacy Protection Act, are different in that the onus is on the business. Because in theory, the business is the one that introduced the pixel that is sharing information outside of the business's website, right? So it's not that the data's being shared directly with the first party, in this instance the business; it's being shared with a third party, which is outside of the intent of that particular law. So there's definitely, and you're spot on, right, a ton of nuance here. And I think that's actually a good way to segue into the next point, which is: the US typically takes a sectoral approach to data privacy and cyber, right? It's what can we do for financial laws, or what can we do for health, or children's, at a federal level, right? That's typically the approach that's been taken. And laws like, you know, GLBA or HIPAA, or even Section 230, which is not really a privacy law, are still sectoral in that they have a specific focus. They're not broad laws. Compare that to the California Consumer Privacy Act, which is a state law, or Virginia's or Connecticut's or Colorado's, right?
Those are what we call comprehensive US state privacy laws, which also have security provisions in them, and they're not sectoral. They don't apply to just one industry; they apply across the board, assuming you hit the jurisdictional thresholds.

David Moulton: How has your understanding of or interpretation of privacy or cybersecurity laws changed because of your understanding of web development and coding?

Dan Rosenzweig: So I think that's actually a really important question, somewhat similar to what we said in the beginning, which is that the laws have been passed or enacted as a result of the status quo of technology, right? For better or for worse. And ultimately, in my view, it's really, really important to have that background and understanding of how the technology operates in order to provide solid legal advice. So for me, when I see a law focused on opting out, right, like the California Consumer Privacy Act, okay, well, that's what the law says, but what does that mean? GDPR, right, or the ePrivacy Directive, depending on, you know, what regime we're talking about, in certain circumstances requires opt in. Okay, well, technically speaking, those are very different things, opt in and opt out, right? So yes, it's easy to put that in language and put that in the law and make that a requirement, but from a technical perspective, particularly for multinational corporations that have to comply with both sets of laws, right, those are inherently conflicting, right? Like, how do we comply with both? And it's obviously possible, but you need the technology in order to do it. So if legal is not able to coordinate with product or with developers, I think it's incredibly difficult to navigate the complexities of what those different laws require, right? It's a completely different approach. So for me, learning the backend, you know, technology and how that operates has really enabled me to work with developers and legal and act, again, not only as a translation layer, but as a positive way to move forward as opposed to an impediment.
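The opt-in versus opt-out conflict Dan describes comes down to opposite default states. A deliberately simplified sketch of that difference; the regime labels are illustrative shorthand, not a legal taxonomy:

```typescript
// Under a GDPR/ePrivacy-style opt-in model, processing is off until the
// consumer acts; under a CCPA-style opt-out model, it is on until they act.

type Regime = "opt-in" | "opt-out";

// consumerActed = true means the consumer clicked the consent or opt-out control.
function mayProcessByDefault(regime: Regime, consumerActed: boolean): boolean {
  return regime === "opt-in" ? consumerActed : !consumerActed;
}
```

A multinational site serving both regimes has to branch on which model applies, which is why Dan says legal and developers have to coordinate on the mechanism, not just the policy text.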

David Moulton: So, Dan, I've got a question. I know that in the US it's opt in and I want to say it's in --

Dan Rosenzweig: Opt out.

David Moulton: -- in Finland, it's opt out, but I'm talking about something completely different here. Organ donation, right? Like, if you look at your license, you have to opt in here in the US, and in other countries, you've got to opt out, and you get essentially the same results, mirrored. And I would wonder, when you look at a privacy law, or one of these pieces of legislation that you have to comply with, do you see consumer behavior that follows those same patterns? If the law is written to say that you have to opt out, do most people just ignore it? Or if you have to opt in, do they mostly ignore it and not opt in, or is it really skewed in one direction?

Dan Rosenzweig: Anecdotally, I think a large portion of it comes down to two things. One, who's the brand, right? Who's the business that's asking you to either opt in or giving you the ability to opt out? Is this a brand that you as a consumer interact with all the time, that you trust and you love, and that you value? Particularly on the media side, right, publishers, a large portion, if not the majority, of their revenue is from advertising, right? So if the consumer understands that, and I think this naturally goes to the second point I was just alluding to, which is, if the business does a good job of describing the value to the consumer, and it's a consumer that uses the brand, and there's that mutual trust between one another, I think the consumer is much more likely to be comfortable with, in US parlance, not opting out. Whereas in the EU, maybe opting in is the better way to put it, right?

David Moulton: Mm-hmm.

Dan Rosenzweig: That, I think, is an important part. What is the relationship directly between the consumer and the business? And if the consumer has an understanding of what the business does and it's a brand they value, then I think that has a larger impact. But at a more macro level, and I obviously haven't conducted any studies of my own, just high-level research into what other folks have found, there's another wrinkle to all of this, to the data privacy regime. We've talked about laws, we've talked about tech; there are also the platforms themselves, right? So a couple of years ago, and I'm sure you, David, as a consumer have seen this, Apple introduced what's called the App Tracking Transparency framework. And really what that means is, when you open up an app, the app developer is required under Apple's guidelines to present you with the ability to opt into what Apple colloquially defines as tracking, which essentially is another way of saying, you know, cross-contextual behavioral advertising, i.e., targeted advertising across different devices and business units. So what happened is a lot of iOS users didn't opt in, right? iOS essentially went overnight from what was an opt-out regime to an opt-in regime. And to this day, consumers are not opting in at the same level that they were; the opt-in and opt-out rates are not parallel. They're not the same.

David Moulton: I want to come back to this idea of jurisdictional privacy laws. You were talking about that a little bit earlier, and how does that actually affect companies? You know, you've got a big law from California, but if you've got a smaller, more regional law, or a smaller state or an international law that's not as strong, what does it look like to deal with those conflicting jurisdictions and sometimes conflicting guidance?

Dan Rosenzweig: Yeah, no, and it's a difficult road to navigate for a lot of companies, rightfully. Which is actually why, you know, a lot of companies are pushing for a federal privacy law, right? So that they have a baseline that they can work with. Because right now, there are a couple of approaches, because you're right, not all states are the same, and some of them take different approaches to different things. Some companies have the resources to handle that. I'll work with them and their developers to implement, again, technology to be able to distinguish a user in one state from a user in another, and then create different mechanisms to comply with the different legal requirements. Others will say, listen, I don't really want to do that, or I don't have the resources. Instead, I want to use the strictest approach. So then, looking at the laws that apply to them, they'll figure out which law has the strictest guidelines or the strictest baseline, and use that as kind of the map. But the problem with that sometimes is, although it's legally compliant to an extent, it ultimately leads to some over-compliance. So there are a couple of items that I've come across in my experience that can lead to issues for companies in that regard. One, if you're a publisher or a media company and you're offering certain mechanisms to consumers in states where you don't have to, well, I always suggest it's a good idea to do that to the best of your ability, because it's privacy friendly, and that's important, right? Consumers care about their privacy. But if you're going to hold yourself out as complying with certain laws, and then you can't, or you're not actually doing it, then that's a problem, right? So some companies will say, yeah, I'm going to give these rights to every consumer, and that's great.
I actually, as a consumer myself, from a privacy friendly perspective, I think that's a good idea if they want --

David Moulton: For sure.

Dan Rosenzweig: -- to do that. But if they're not going to actually honor those statements, now that's a problem. So now you've decided that, you know, for resource constraints, you're going to apply one law to every state, but then you're not actually honoring the statements that you've now availed yourself of. That's a problem. Not only are you allegedly violating the law that you've otherwise availed yourself of, you're also exposing yourself to unfair and deceptive acts claims and other types of problems. So it's an important nuance to navigate.

David Moulton: Dan, in a world where VPNs are a thing, how do you get that location data correct and make sure that you're actually, you know, applying the right law for somebody who's rolling in, say, from Texas, but looks like they're coming in from Colorado, to a company that's headquartered in California? What does that look like for a company?

Dan Rosenzweig: Yeah.

David Moulton: How do you navigate that?

Dan Rosenzweig: So at a high level, it really comes down to, in my view, and some laws will speak to this explicitly, what does the business reasonably know, right? So for example, if the business reasonably sees that you're located in California, by virtue of an IP lookup on your IP address, then they're able to treat that consumer as if they were in California and present the California version of their website, or the configurations we just discussed, to that user. However --

David Moulton: Okay, so the best data that's available --

Dan Rosenzweig: Exactly.

David Moulton: -- that seems reasonable? Yeah.

Dan Rosenzweig: Exactly. And then the next step is, well, now let's say you know that the consumer's logged in, right? So let's say you actually have their account, and as part of their account, they're required to provide what state they live in, or other things that would, again, give the business more information that would reasonably alert them to where that consumer is located. Well, then you have the ability to use that as your baseline, right? So the more information you have at your disposal, the more reasonableness you have to be able to say, okay, where is this person located, or what state law should apply to them? But that's, I think, a reasonable approach to navigating that complexity.
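The "reasonably knows" standard Dan outlines can be sketched as preferring the most reliable available signal; the signal names here are hypothetical, and real systems weigh more inputs than this:

```typescript
// Pick the state whose law to apply from the best location evidence available.

interface LocationSignals {
  accountState?: string; // state the logged-in user declared on their account
  ipState?: string;      // state inferred from an IP lookup (a VPN can distort this)
}

// Declared account data, when present, outranks the IP-based guess.
function applicableState(signals: LocationSignals): string | undefined {
  return signals.accountState ?? signals.ipState;
}
```

So the Texas user tunneling through a Colorado VPN gets Texas treatment once they log in, and until then the business reasonably relies on what it can see.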

David Moulton: So what are some of the common misperceptions that developers have about privacy? And how can an attorney or a legal expert like yourself educate them, so that they're able to do their job and not fall into those traps of: all these laws apply, or nobody will know, or it's impossible for us to look at a VPN and really know where you're at, so let's throw up our hands and just, you know, ignore it. What are some of those misconceptions?

Dan Rosenzweig: I would say a common one, and this has been true for a couple of years now, is the broad definition of personal information, right? So a lot of laws will just say reasonably identifiable to a user. Some laws will even go as far as to say reasonably identifiable to a device, right? So that actually brings in not just traditional identifiers, like first name, last name, email address, but also technical identifiers or device identifiers: cookies, you know, MAC addresses, Wi-Fi BSSIDs, you know, the IDFA, which is the mobile ad ID, and device IDs and everything in between. And not only that, permutations of those, right? Is it, you know, is it hashed, is it encoded? Are there other ways to navigate this? And essentially, if it's consumer or device information going in, and it's reasonably identifiable to a consumer or a device, again, depending on the law, then it's arguably in, right? It arguably qualifies as personal information. And often I'll be on a call with, you know, product or developers, like, oh, no, we don't handle any personal information. And then I say, okay, well, what about, you know, IP addresses or cookies or device IDs? They're like, oh yeah, of course we use that. And I'm like, well, that is personal information under the law, and you're actually triggering it, depending on, again, that jurisdictional threshold we were alluding to earlier.
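Dan's point about how broadly "personal information" sweeps can be illustrated with a toy check. The identifier list below is a simplification for illustration, not a legal test, and the key names are hypothetical:

```typescript
// Toy classifier: flag payloads that contain identifier-like fields,
// including technical identifiers and hashed permutations.

const identifierLikeKeys = new Set([
  "email", "name", "ipAddress", "cookieId", "deviceId", "idfa", "macAddress",
  "hashedEmail", // hashing does not necessarily take an identifier out of scope
]);

function containsPersonalInformation(payload: Record<string, unknown>): boolean {
  return Object.keys(payload).some((key) => identifierLikeKeys.has(key));
}
```

The useful habit this encodes is the one Dan describes on calls: a team that "doesn't handle personal information" but logs IP addresses or device IDs would still trip this check.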

David Moulton: Dan, looking forward, what will impact privacy and cybersecurity challenges more technology or policy changes?

Dan Rosenzweig: Yeah, so I think it's both. And I think if folks are willing to accept that, then that's going to be really, really powerful for them to really manage their compliance and be able to adopt whatever measures they need in order to move forward. And I think often folks want to consider these things very differently. You know, product deals with tech and legal deals with law, and ultimately, as we've discussed, right, they impact each other so drastically. Now, I don't expect or even encourage lawyers to learn how to code, right? That was a matter of circumstance for me, and something I found a passion for. But I think having a high-level understanding of how technology operates will be incredibly impactful for their business, and vice versa for the technology team, right? It's important for them to be aware of the broad definition of personal information, or what the CCPA says as opposed to what the GDPR says, right? Or the deprecation of third-party cookies, right? How is that, which is a technology change, going to impact privacy, or the data privacy laws and compliance mechanisms? So I think both folks coming together for a kumbaya moment, right, I think is going to be incredibly powerful and enable them to really get the most out of their business and work under a trustful regime with their consumers. And at the end of the day, that's better for everyone. [ Music ]

David Moulton: Are there any regulations that developers or lawyers should be particularly aware of as, you know, these technology and policy changes are coming, that are going to impact that working relationship?

Dan Rosenzweig: Yeah, I think just keeping in mind that this space, on the technical side of course, but the legal side in particular, is changing literally every day. I mean, in the last two weeks, several new laws were introduced in various different states. A federal law was introduced, right? So it's just constantly evolving, and I think you need to be aware of that, right? From the developer side, check in with your, you know, privacy lawyer and your in-house cybersecurity lawyer and say, hey, you know, what's going on here? What are the trends? What are things that we should be aware of? And I think having, you know, maybe a quarterly cadence or a monthly cadence, depending on the risk posture of the company, between legal, product, tech, and even marketing. We didn't even really get into marketing. Marketing's a huge player in this space, because think about it: the marketing team is the one actually boots on the ground working with these pixels. They're the ones, either themselves or working with agencies, introducing these technologies to the site or the app, right? So they're responsible, to an extent. So they need to be educated on this, and I think that's important, right? And then that's great if you have, you know, your marketing team pushing back on the vendors and saying, hey, you're saying you're privacy compliant, what does that mean, right? Your legal team is then going to be very happy for you to do that, right? Acting as a champion, but it'll also make your life a hell of a lot easier, right? I can't tell you how many times, right, I work with a major retailer, and during the holiday season they have a bunch of new campaigns that they're introducing. The marketing team and the business side, they're like, hey, we've got to market this, we've got to advertise this, which is great. That's their business.
But ultimately what happens is they'll introduce these technologies, through no fault of their own. Again, they're doing what they need to do from their business perspective and their objectives. They don't necessarily have a process to speak with legal or speak with tech, again as a partner, not as an impediment. And what happens is they introduce these technologies, and now a bunch of unintended consequences are happening with respect to data on the app or website. And guess what? We now have to go, at the 11th hour, and fix that, at a time when you don't want to be doing that. You want to be driving traffic to your site and driving the product. So having marketing, dev, product, legal, everyone together, truly, like I said earlier, like a kumbaya moment, and having that frequent ongoing discussion and being each other's champions, I think, will go a really long way. And really the way to do that is legal, tech, marketing, everyone working together, a translation layer between each other, and being excited to work together to be champions in this space. And I think that will not only save time and resources for the company, but will also bring about positive reinforcement for the company, because ultimately consumers, as we've been discussing, care about privacy. So I think it's a positive thing for everyone.

David Moulton: One of the other things that I'm curious to get into is this impact of AI, but we'll start with, you know, what are the legal considerations that should be thought about when you're integrating software development processes that enhance both privacy and security?

Dan Rosenzweig: Yeah, I think it goes back to what we were saying earlier, which is, for lack of a better word, and we didn't even use this terminology, but it's essentially privacy and security by design, right? You want to make sure that you have those processes in place, processes that are actionable, so that when a new tool or a new technology comes your way, you can at least rely on an existing process or policy and then know how to action it, so you can implement it in a way that is privacy forward and, just as importantly, secure.

David Moulton: Yeah. So kind of like you were saying, don't introduce a new technology that's not been vetted and tested during the holiday season, whether from a security standpoint, you know, a scalability standpoint, or a legal standpoint. You need to run through all of those considerations before it hits production and goes for a full deployment. Now, where I'm really curious to go is, how does AI upset the apple cart, accelerate things, scale things, make everything perfect and better and we're all done with work? What are your thoughts there?

Dan Rosenzweig: Yeah, so listen, I think it's like anything else, it's a new, awesome tool, and that means it's a tool. I think it's here to stay. But that doesn't mean people can't abuse the tool, right? It's like anything else. It's a tool. So I think it means that we have to have policies and processes in place. We have to make sure, if we're talking about generative AI here, that in our prompts we're not divulging a bunch of trade secrets and consumer information, right? There are ways to use the tool, notwithstanding laws that say otherwise, right? This is still a very evolving space, but I think use your common sense, right? Use the tool in a way that will enhance your services, will help your consumers, help your business. But don't do it in a way that is either intentionally malicious or can have a negative impact on yourself, your business, or your consumers. At the end of the day, I'm a big proponent of it, and I think it's a powerful tool that can really enhance existing services.

David Moulton: Recently we heard from Noelle Russell, and she was talking about an AI red team that would look at your tools and your usage, how you're putting data in, and the types of answers that come out, and be thoughtful about those AI tools. It seems to me that sometimes a benign use can actually have a harmful output. And, you know, with the generative side, you're not always sure what you're going to get consistently over time. And the speed of AI right now is just, I mean, it's blistering. So if nothing else, I think this will be an interesting conversation for you and I to come back to in, we'll say, six months, which might feel like six years with this technology in this space, and have another conversation on it. So Dan, as we wrap up today, what's the most important thing for a listener to take away from this conversation?

Dan Rosenzweig: Legal should not be viewed as an impediment. I think they can be a champion with you. And I think working between marketing, development and technology and legal all together, I think can be an incredibly powerful, powerful thing.

David Moulton: Dan, thanks for being on "Threat Vector" today. I really appreciated the depth and thought that you put into all of my questions. I'm sure the audience learned just as much as I did.

Dan Rosenzweig: Thanks for having me. Really appreciate it. [ Music ]

David Moulton: Today we explored the importance of blending legal and technical expertise in managing data privacy and cybersecurity compliance. Dan's main lesson is clear: companies must view legal and technical teams as collaborative partners rather than impediments. By recognizing the nuances of privacy laws like GDPR or CCPA, legal experts can better advise developers on configuring their technologies to ensure compliance. Taking a proactive approach involves aligning policies and actionable processes to build trust and accountability. This cross-team collaboration enhances user trust and can even unlock new business opportunities as consumers increasingly value privacy. That's it for "Threat Vector" this week. I want to thank the "Threat Vector" team. Michael Heller is our executive producer. Our content team includes Sheila Durowski, Daniel Wilkins, and Danny Melrad. I edit the show and Elliot Peltzman mixes the audio. We'll be back in two weeks. Until then, stay secure, stay vigilant. Goodbye for now. [ Music ]