Afternoon Cyber Tea with Ann Johnson 9.2.25
Ep 113 | 9.2.25

From Bottlenecks to Breakthroughs: Aligning Legal and Security Teams

Transcript

Ann Johnson: Welcome to "Afternoon Cyber Tea," where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to help you stay one step ahead. Today Erez Liebermann, partner at Debevoise & Plimpton, joins me to talk about the intersection of security and law and how cyber and legal teams can become true allies in the face of evolving threats. Erez has an extensive background in cybersecurity and incident response. He currently advises businesses on a wide range of regulatory and litigation matters, and he is widely acknowledged as a leading cybersecurity and data privacy professional. Some of you may have heard my story that when I was young I actually was on my way to law school. I ended up pivoting to tech and cyber. So I'm always very passionate about the intersection of law and cyber. With that, I've been looking forward to this conversation for a while. Erez, welcome to "Afternoon Cyber Tea."

Erez Liebermann: Ann, thank you very, very much for having me. I'm delighted to join and looking forward to the conversation as well.

Ann Johnson: So you have decades of experience in the legal world. You've led and investigated matters for organizations, and you served in the U.S. Attorney's Office for the District of New Jersey, my home state. Can you share what an ideal partnership between security and legal teams looks like in today's landscape?

Erez Liebermann: Yeah. And I thank you for making me feel old with the decades of experience. It is true. And I appreciate it. That's what gives us the experience. Right? I think that if you look at the partnership between legal and the security team, it's truly the word partner that makes all the difference. When I first went in house, the first thing I wanted to do was to show the value that a lawyer can bring, and it's really not solely in the capacity of legal advice. As you know so well, cybersecurity is much more interdisciplinary than just zeros and ones, than just technology. It really brings legal, compliance, obviously cybersecurity engineering, the technology aspects, risk, and the business side into play. And we expect CISOs to think through all those issues, and we should expect lawyers to be partners on all of those issues and not to be in a narrow band. I think of that partnership as working together across the issues. When I think of my role in house, I'm very happy that I was able to sit, literally go and sit, with the information security team. Sit with the SOC so you hear about all the incidents. Sit in the staff meetings of the chief information security officer and work through and listen to the issues that they had, not just about legal, but overall, so that I could understand the security posture. And that way we really had a true team feeling. We also had our chief of IT risk very closely aligned, and sometimes we were dubbed the three amigos -- myself, the CISO, and the chief of IT risk -- because we worked so closely together and felt that all of the functions -- the risk, the legal, and the security functions -- were all of ours together, not just any one of us in any lane.

Ann Johnson: You know, I'm really fortunate to have been at Microsoft now for nine and a half years and working with folks in Brad Smith's organization like Tom Burt and Amy Hogan-Burney and Cristin Goodwin, who all had that same collaborative approach to be a true partner, not just someone who's sitting on the side interjecting occasionally and giving you legal advice, but actually working through everything with you, rolling up their sleeves, getting their hands dirty and saying, "Okay. Let me understand the complexity of this issue and let's talk about the business side and the legal side and the regulatory side." So I hear you. And thank you, by the way. Thank you for being that kind of lawyer. Thank you for being a partner is what I would say, because having those partners, for someone like me who isn't a lawyer, right, having those partners and people that think about problems differently is really useful. When you and I participated on the RSA panel a few months ago, we talked about incident response, which is an area I'm passionate about, and I know you've seen a lot of it in your career. Can you share, without breaking any confidentiality of course, a real world example of where collaboration between legal and the security teams made a situation better, or maybe even made a situation worse?

Erez Liebermann: Let me start, unfortunately, with where it makes it worse. I think that sometimes lawyers fall into the trap of wanting to protect everything with attorney-client privilege, understanding or being in the middle of every fact, and being on every single conversation. And while that comes from a wonderful place of wanting to participate and to protect the company, it creates bottlenecks in incident response and brings people into the fold that don't always need to be in the fold, although I am a big believer in legal being in the fold from an early stage on incidents. So we can talk about that, but I think that's where we have problems with legal and incident response, with trying to over-privilege everything and create these bottlenecks. We certainly see it sometimes when incident response teams are being brought into the fold. We see it on communications. And so we try to avoid that, and I think that we need to balance our legal risks and the information security and business risks and not let our legal risks always trump. So that is one thing we're always measuring. What's more important here and what's bigger: the legal risk that might come from a certain action or inaction, versus what happens if the legal risk trumps and we're not doing what we need to do fast enough from the security or business side. On the other hand, where I think legal does help is depending on the background of some of those lawyers. I like to bring my background. First, I'm an aerospace engineer. So I have a tech background as well. I have a prosecutor background. I have very, very close relationships with law enforcement, DOJ, FBI, also with regulators. I work very closely with regulators, not just adversarially, but also working with them on how do we all do the right thing as an industry. I think legal can bring those connections to an incident, but of course a team like yours at Microsoft has many of those connections, but most companies don't. And so that's one aspect of helping and enriching the incident response process. The other of course is thinking through some of the pitfalls that come with incident response, and those pitfalls can be what happens if we're paying a sanctioned actor, if we're going to ultimately make a ransom payment. What happens right now with the DPRK IT worker issues? Do we keep paying somebody once we discover it's a DPRK IT worker? The answer of course is no. Once you discover that, you can't. But thinking about that, because there are other concerns. How do we think now about the 8-K disclosure rule and how do we escalate that? And having legal involved in these types of issues, some of which are your more traditional legal issues and others, like the relationships, are maybe less traditional quote unquote legal issues, I think is how that partnership works so well and how legal can certainly be value add and a critical part of the team.

Ann Johnson: You know, one of the things I really enjoy about my current legal partner at Microsoft is because I work, as you probably know -- my day job is to work communications for our Microsoft incidents, sitting in the Office of the CISO. So I often have to write things or say things, whether it's to customers or the media or blogs, et cetera, and I bring my lawyer in to review not just because I want his legal opinion, but because he's a great writer and he will often say to me, "Hey, this is a business decision, but I would say it a little bit differently." Right? And I appreciate that. I just appreciate having him by my side as, I will say this, a sanity check. Right? Even if it's not a legal thing, he's just such a great partner. So I think what you said reinforces that. And I think the message I'm trying to deliver to our audience is don't be afraid to engage your legal counsel. Bring them in. Not only are they there to help protect you, they're actually pretty good at giving you advice even if it's not legal advice. That's how I would say that.

Erez Liebermann: Absolutely. Look. It really pained me this last week. I was talking to a client, a very sophisticated client, and the client said, "Look. We -- " This was a CISO. "We don't want to call our legal team when we have an incident." And I said, "Well, why not?" He said, "Well, they immediately believe we need to notify a regulator even when we don't believe we do. It's early in the investigation. And they say that's a legal call and we lose control of the investigation. And so we are thinking about waiting before we escalate issues to legal because it just gets escalated and creates fires when we don't need them." And I said, "Well, that's -- it's sad." Here we are talking about this great partnership, and they need to work on their partnership obviously. Legal should not be making a decision too early, and certainly not on their own, about escalating it to regulators and making notifications. They should be providing advice and then thinking together with the CISO and with the business about how to respond to an incident. They can make that recommendation, but they shouldn't be the ultimate arbiter. It should be a business call here because it's ultimately about business risk. And I think that's the right way to work it together. So I'm trying to get that client to say, "Okay. We are going to bring legal in early." Legal needs to learn not to act the way they do, because if legal does that they will get brought into the fold earlier. They'll be taught about the incidents. They'll know which are the big ones, which are the small ones. And they'll also know the risk appetite of certain companies. I think that certainly the financial services companies, the defense industrial base, they can't have a huge risk appetite. The regulators won't let them. Nor will their consumers. But some of the tech companies can have a bigger risk appetite, and you've got to work with those risk appetites as you're thinking through your responses. But legal at the table as a partner, as opposed to trying to say, "No. You'll do what I say," is so critical.

Ann Johnson: Yeah. And let's just pull that thread. I'm going to make a joke here just to say, in the interest of my further employment at Microsoft, I would never delay bringing legal in. Anyway, that aside, I do think that that partnership starts early. Right? And I do think that you don't want to build, you know -- the old expression: the worst time to build a relationship is when you need one. You actually need to build that relationship with your legal team early, which leads to my next question for you. You know, based on your experience -- I have this belief that we should treat cyber attacks and cyber incident response the same way as we do natural disasters or, you know, those types of events, right, and I've talked about and blogged about this a lot. So based on your experience, how early should legal be involved? Right? If you're planning a cyber strategy and incident response planning, to me that's a great time to actually bring legal into the initial planning. But I'd love to get your perspective.

Erez Liebermann: Yeah. From the beginning is what I would say. If they're sitting there as a partner and they're sitting in your SOC, literally sitting in your SOC, which is one thing I recommend, that legal teams have a representative in the SOC, and they're sitting in staff meetings, then they're hearing about the plans and the strategy overall that the company has and the information security organization has. But even specifically for incidents, I would say how critical it is for legal to be involved in that incident response plan, working together around that. I'm a big believer that there should be an executive incident response plan, which is somewhere in the vicinity of 10 to 13 pages, and that really gives you a high level incident response plan. Legal should certainly be part of that. There should be elements for legal in that. And then, for those teams that want a more tactical incident response plan for the actual incident response tech team, there should also be a longer one. It could be scenario driven. How do you deal with a business email compromise? How would you deal with a DPRK IT worker scenario? How would you deal with ransomware deployment, with a DDoS? And it may have much more tactical steps and may be significantly longer. But legal can add and drive some of those elements, and then certainly be part of the tabletop exercises that help reinforce those plans. We actually see those tabletop exercises being what people remember, even more than the plan, after we help facilitate those. And I think that by being a part of that -- and look. I've got five tabletop exercises coming up next month for clients, all driven through legal helping with that. But we're building that between the legal, the CISO, and often the risk officer sitting together working on the scenario. That's how early legal's getting involved and forming that partnership.

Ann Johnson: That is so important. If legal is involved in the creation of your incident response plan, you know when to bring them in. I always say in times of stress you don't want people making hard decisions. So if you have a fully baked and tested incident response plan, it says, "Okay. Step three. Bring legal in." Wherever it is. Right? Or when you hit this decision criteria, bring legal in. And that way they've helped create the plan and you're completely aligned on when they come in. All right. Let's move off incident response for just a moment. I'm sure -- something tells me we'll be back to it. But having been in cyber now over 25 years, which is shocking to me, by the way, I know as an industry we make things harder than they need to be. What are some of the most common misconceptions that you see, in addition to what we've already talked about, between legal and security teams? And how do you propose leaders try to resolve those differences?

Erez Liebermann: I think, and we spoke a little bit about the lanes -- I think there are no lanes in this field. It's a true partnership. And so I don't want to hear CISOs say, "I don't know anything about the law and therefore that's just with you." I want CISOs learning about the law. And I work with many who do. They know the law very well. And I encourage them to opine on that and have a discussion. And likewise I think they should encourage their lawyers to learn some of the technology. And one of the misconceptions I see on incident calls is that lawyers should stay in their own lane -- that they're there maybe for privilege and to advise on notification or law enforcement, but they shouldn't be pitching in on any of the questions with respect to technical steps. They shouldn't be pitching in and asking, "Wait. Did you also review X or Y technical elements?" Certainly the lawyers are going to be the leaders when it comes to the legal. The information security team should lead on the information security side. And there are main lanes, but those lanes should cross over. And when I see people say, "Well, I shouldn't opine on that," that's where we see some dropped balls, because we've got some really smart people who are working together. So one of the biggest misconceptions I see is about these roles and strict lane keeping. I talked earlier about privilege. I would also say that's a misconception -- privilege should be used with thought and not on every single incident. Not every incident requires privilege. Not every incident has legal elements. So think through how to escalate that to avoid overdoing it. And courts have said that if you privilege everything, then all of a sudden they might say, "Well, that's just business as usual," and you're not getting privilege protections on your work product. So I think those are some of the elements that we see in terms of misconceptions. Straying a little bit from this, but I have to share one thing that we always advise our incident response teams with respect to their actions, and we do this as lawyers, but it's a reminder that everything, including what goes in Slack or Teams or instant messenger, should be treated as the front page of "The New York Times" when they're working on elements. I think there's sometimes that misconception of, "Oh, I am working and this is getting deleted or it won't be seen anywhere." And way too often we see things that people never thought were going to come to light come to light and get featured in regulatory findings or litigations. So another area to think about. Yeah.

Ann Johnson: Yeah. No. And I don't want to go too far down the path, but we have regular education on the topic, because it's not only understanding what privilege is, but understanding when it is appropriate to use it and not overuse it and those types of things. I do think organizations should have more rigor. That's what I'll say about that. I do think organizations need more rigor in training people how to use it, but also when it is appropriate to use it and when it's appropriate not to use it, because to your point I do think a lot of organizations run the risk of trying to overuse privilege. And I don't think it's from a malicious place for the most part. I think people just don't understand it, you know, people who are maybe more junior or not as experienced. So it's suddenly like, "Oh. Everything I said to my lawyer has to be privileged." Everything. It's like, "I'm going to put privilege on every email I typed up and sent to my lawyer." I've seen that, and it wasn't out of malice. It was just out of people thinking that that's what they were supposed to do. So education is a huge topic there.

Erez Liebermann: Yeah.

Ann Johnson: I want to talk about regulations for a minute, because how could we not? Global privacy regulations are evolving, and they're not consistent. The Microsoft regulatory team tells me there are, I think, over 250 global regulations a day that come out somewhere that they're evaluating to see whether we have to be compliant. And we're not a heavily regulated industry. So I think about that always when I think about my banking partners, for example. With the regulations evolving, can you talk about how the relationship between security and legal teams should also evolve? And I'm going to throw something at you. You know, there are times I'll go to my legal team and they'll say, "Oh. That's a compliance thing. You have to talk to the compliance people." Which I always find interesting. Right? Because, like my earlier point on everything having to do with privilege, or every comms with the lawyer being marked as privileged, I have a tendency that if there's any type of issue, like a regulatory one, I throw it to my lawyer. He's like, "Why are you sending this to me? It's a compliance thing."

Erez Liebermann: Yes. This is a great question. First, I think that legal, compliance -- so lawyers, compliance, and infosec, and for those organizations that are large enough and have a robust risk department, risk -- you're working through this together. And so it isn't just, oh, that's compliance or that's risk or that's infosec. So that's my first point, and I've talked about that. I would say the overabundance, or to be more blunt the ridiculous amount, of cybersecurity legislation and regulation is, I think, harming cybersecurity. I talk a lot with CISOs who say that given the amount of regulations, and the importance that frankly they do serve in the fields their organizations serve, they end up having to run a program that is first compliant with regulations and second, second, doing the right thing based on what they think they need to do for the security of their organization. And that is backwards, of course. And I understand the bind that CISOs are in. And so I think we need to do risk assessments, and those risk assessments should have an element that maps to the regulations. And so you join the legal team into those risk assessments, and you think through, as the different parts come in or as a NIST review comes in, how does that map to a regulation. On the financial services side, and I know they're expanding beyond that, the Cyber Risk Institute has done a really great job of doing this mapping. It was an expensive exercise, and their profile is free. So I think that's a great place to look even if you're not in the financial services industry. What they've put out is a great model of how to map regulations to controls. And then we need to sit down and have serious conversations as to where the money and the resources need to go, so that we're not just doing what the regulators think is the most important, which doesn't necessarily protect the organization, because those don't always line up. And so those are very difficult conversations, and I would involve the business in those to think through after we've done that mapping.

Ann Johnson: I think that's right. And a lot of the compliance stuff, as you know, really falls to the security team to implement the controls, and also, as you said earlier in the conversation, it's a risk decision. Every organization has a different risk posture, and even their compliance and how they implement controls and how they put in compensating controls is all related to the risk posture. So you have to have the risk people involved with your lawyers and with the cyber team is my feeling on this.

Erez Liebermann: Absolutely.

Ann Johnson: Okay. Can you talk a little bit, Erez, about cross-border investigations? Meaning, let's just say a U.S. regulation related to data and an EU one and maybe a Singaporean one conflict, and a company does business in all those places. Is there a standard approach you recommend for folks in responding to multiple regulators on the same event?

Erez Liebermann: That's a great question, Ann. It is a very difficult question, and we deal with it in almost every incident because we are engaged in international and cross-border incidents. The first thing is, let's understand all those. So as we're triaging the incident in those first few days and trying to understand it, let's understand where that impact may be and where that notification has to be first, so that we're understanding what that may trigger, because you may not need to notify yet in the United States, for example, but if you have a 72-hour notification in Europe, would your regulator in the United States be okay with that if they hear that you notified a different regulator so much earlier, whether it was in the U.S. or in Europe? So I like to think about relationships as well as the legal obligations. So that's one element that we need to think about. When we're looking at insiders, that creates another very significant hurdle, because what we're allowed to do to protect enterprises in some of our investigations in the United States might run into some privacy elements in countries in Europe, like Germany and others, that place such a premium on individual privacy, including in the workplace. So understanding how that might affect your investigation, especially for insiders, is really critical. But there are ways to work around this. There are justifications. You can talk to local counsel in those jurisdictions and understand where the limits are, and also that there are justifications that allow you to do it, but make sure you've documented those. And so having those teams together, I think, is one of the most important elements. So often the home office is driving an investigation, and the sooner you can get your local businesses into the fold and you're working with both their incident response and information security teams as well as their legal teams to understand the complexities of this, the better it is for that investigation.

Ann Johnson: Yeah. Completely agree. And I would say that it goes back to what you and I keep talking about: having those relationships already existing, having a plan, knowing what should be part of your incident response plan. Okay. In a data breach, here are the groups that we need to bring in. Here's when we need to bring them in. I'm so big on making a plan and then working your plan, because I've done so many both customer and Microsoft incidents at this point in time, and you have high levels of maturity and then you have low levels of maturity, and that has nothing to do with the size of the organization. So I'll just leave that thought there. Some people assume, "Oh. It's a Fortune X company. They must have the most mature, best processes." And unfortunately that's not the case. And it's not because they don't have the smartest people. It's just because they haven't sat down and tested a plan. All right. Just a couple more questions because I know we're -- I love talking to you, so I know we're running out of time. Can you talk a little bit, and this is another hypothetical and I know lawyers hate hypotheticals, but I'm going to ask it anyway -- how should organizations be thinking about balancing transparency with legal risks when communicating publicly about a breach? Let's leave regulators out of this. When communicating publicly about a breach, what are folks balancing and what advice generally do you give?

Erez Liebermann: I think communications becomes the most important element of a breach after the first couple days. Typically the first couple days are so intensely let's figure out what happened, where's the triage, what's the impact, who's done this to us, and that's on the information security side. And very quickly the most important element shifts to that communications side. The question of how do I then balance transparency and legal obligations becomes one of the most important. We used to have a very strong attitude in data breach response of getting our arms around the breach, being able to explain what happened here, and then going public and explaining all of that. And some of that may have stemmed from the Target breach, for example, in which Target actually tried to do transparency early. They tried maybe to do a little too much transparency early, and so they identified the number that they believed was impacted. Turned out to be wrong. They came back with another number. It still turned out to be wrong. Came back with another number. And so you learn from that. You say, "Well, I shouldn't share this much data early on, because what if I'm wrong and how embarrassing is that." And you lose credibility. For a while then we trended away from that early transparency, and then the public really started, I think, to ask for more of it. Third parties and customers started to ask for more of it. And they became much more understanding when you said, "I'm on day four of an incident. I'm on day one, day seven. I'm still working on this." The public knows now, and your third parties know now, that answers don't come at the snap of a finger, even if you bring in Microsoft's fantastic incident response team or other teams. There's no answer immediately. And so it's going to take time. And with their understanding that you don't have those answers comes a request: at least give us a heads up so we can think about that. I think that's pushing a lot of investigations to have that transparency earlier. There is a damned if you do, damned if you don't element here for sure, because sometimes you give that early transparency, then it takes you three more weeks to actually get numbers, and the client or the customers are very frustrated with the lack of additional data for those three weeks, and you almost say, "Well, if I had told them three weeks later I would have gotten yelled at a little bit for not telling them earlier, but at least this way I was able to be helpful early on." I've even heard a client of a client say, "Well, now you told us early. We know we had an issue. You don't have enough information for us to tell a regulator. You've actually just created more of a headache for us." So we've gotten it from all sides. And that's just the nature of the beast. But we are seeing, I think, that push for earlier transparency. That of course goes hand in hand with what the regulators would love to see. You see the 8-K rule, whose future I think is a little bit in question. There was a petition we helped work on from the financial services industry pushing back on the 8-K rule, and a recent bill that said, "We'll fund the SEC, but not fund any work towards the cybersecurity risk management rule." So we'll see where this takes us, but that rule clearly is looking for earlier transparency, especially on big incidents.
I think this is one of the most fascinating parts of incident response and I love working with crisis comms, internal teams, and crisis comms firms as we think through these issues.

Ann Johnson: I think that from my point of view, and I'm going to deliberately skip any of the regulatory reporting requirements, my perspective and what I tell folks here, because, you know, I sit on top of that org, right? What I tell folks is we need to be as transparent as possible to help customers protect themselves. That is the information we need to be putting out there. It's not necessarily the exact nuts and bolts of something that's very proprietary and confidential to Microsoft that will never help someone protect themselves. You know, over time all of those things will be disclosed, but the urgent comms in an event, notwithstanding regulatory reporting requirements, are anything that will help a customer protect themselves. And that's what we need to be putting out there as quickly, as thoroughly, and of course as accurately as possible. I also explain, because I spend a lot of time, as you know, talking to customers, that there are times we're just in the middle of the fog of war. So here, I'm going to give you this information. It may be slightly imperfect, but it's a minimal set of information for you to understand there's something happening out there, and here are the steps we recommend you take to protect yourself. Please understand these steps may change. Right? They may. We may add to them. We may modify them. It's so important for people to understand that cyber investigations, at least at the very beginning, are just like any other. You know? It's just like a homicide investigation. It's really messy. And we're putting the information out there. It's the best information we have at the time, and it is subject to change. So that's my principle. I hope that makes sense.

Erez Liebermann: I think it's really great advice, and so many in the community truly believe in that information sharing. The FS-ISAC and other ISACs have done such a great job, and I love your point, right? If you know something and you can help protect others, share that as quickly as possible. Build that transparency even if it's CISO to CISO and you're working through some of that. Really great. And it is so appreciated by clients when they get that information, because they know you're looking out for them, even if they know, "Well, this is going to cause headaches otherwise," when you come early. So appreciated.

Ann Johnson: Thank you. We only have time for a couple more questions. So the first thing is I want to talk about culture. We know that culture plays a big role in building teams. We try to infuse a culture of security at Microsoft. One of my other jobs that I actually should talk about more is that the education and awareness team for cyber sits under me, so we are actually completely evolving our training and our education and awareness, hopefully leveraging AI and being more adaptive to what people need. So the individual training experience is part of it -- there are other things we're doing to drive culture, but that's just one of them. When we talk about culture, though, it's a big topic, so if you can boil down what cultural shifts you think are necessary to really drive a culture of security and a collaborative environment in organizations, particularly between legal and security, I'd love to hear your feedback.

Erez Liebermann: I think that, I mean, culture is obviously the most important. It's important for a great place to work, and it's important for the motivation and the sharing that comes with having a great place to work. And so how do we promote that? One of the fears in cybersecurity, I feel, is blame. There is a fear that if we see this incident, there's going to be that question of who's responsible for this. Whose fault is this? Why did that hacker get in? Right? That blame creates, I think, a culture of fear, and one we need to get away from. And so give the confidence to the information security team that we want the team to do their best job, we expect them to work as hard as they can to find innovative ways to protect the company, and that if they do the great work that they do and there was a gap, there was a mistake, we learn about that -- that does not mean that we're coming with a blame game at the end of it. I think frankly the legal mindset, the regulatory mindset, started with that. Years ago the FTC would say any breach is a sign that you don't have reasonable security, and they would start blaming the information security teams. We used to see, I think, CISOs fired at a pretty high clip after incidents, and they would move on. And I think we've made a really good shift and need to continue it: when we see incidents, not to blame, to understand that it's a really tough proposition to defend an enterprise. It's really tough to defend even in the smaller, role-based positions. And providing that culture of security, of job security, even in the face of incidents will promote that. I do incident response all the time. I still hear questions: whose fault is it, we need to get to the bottom of fault. I say we do need to understand root cause, but not talk about it from the fault perspective, so that we can have much more open collaboration with the teams and no one's afraid to speak up in that sense.

Ann Johnson: Yeah. I will tell you that it's -- because it's my life philosophy that systems are at fault, not humans. So if you have a good system -- I said to somebody recently, "Even the highest rated car manufacturers have defects and recalls." Cyber isn't going to be a perfect thing. You need to look at the system. And I agree with you. We're never going to get people to be more transparent if they think that they're going to be blamed -- outside of negligence, you know. Most cyber events are not negligent. They're just human error, or something was unpatched, whatever. We could write a novel on what causes most cyber incidents. But I think if you want people to be transparent you have to get out of that blame mentality. And I do think the industry is moving that way. With that, we always close "Afternoon Cyber Tea" with optimism. So given everything we've talked about today, I'd love to know what you're optimistic about when it comes to the future of cybersecurity.

Erez Liebermann: I think we're in so much of a better place in terms of maturity and cross functional work. The boards I think have really upped their interest in cybersecurity, and that is driving a lot of really fantastic management interest and all the way down to support for the information security teams. So that gives me a lot of optimism in the future and the future of the cyber roles and the respect that we have for the cyber organizations and the seat at the table to think of it as more than just security, but a critical business function. I think frankly boards can do more when they have questions and not just ask, but think through and ask some tougher questions, but I love that they're taking such an interest now. And that gives me that optimism across the enterprise. When someone says the board's interested in something it opens pathways and I think that these will continue to open pathways. I am hopeful that the information security teams are able to leverage AI faster than the bad guys are. The hackers are certainly going to leverage that. So I'm optimistic about that work. I saw some really great discussions around that at the RSA conference. I know that the hackers are also thinking about that and how they're going to use agentic AI for their own means. So we'll see where that all leads, but the team approach and the management's interest in cybersecurity gives me great faith in where we are as an industry.

Ann Johnson: That's amazing to hear. I know you're super busy. I appreciate you making the time to come on the podcast. I really enjoyed speaking with you at RSA and hopefully you and I have another opportunity to do something like that in the future. Thank you so much for joining me today.

Erez Liebermann: Thank you, Ann. Really appreciate it.

Ann Johnson: And many thanks to our audience for tuning in. Join us next time on "Afternoon Cyber Tea." I invited Erez on the show because the legal complexities of cybersecurity are growing and he truly is an expert. There's no one better than him to talk about the nuances of these topics. From responding to data breaches to the acceleration of global regulations, this conversation is packed with insights for leaders across the cyber legal spectrum. It was great to talk to Erez, and I know our audience is going to benefit from his insights. [ Music ]