Afternoon Cyber Tea with Ann Johnson 3.3.26
Ep 126 | 3.3.26

Why Cybersecurity Fails Without Trust

Transcript

Ann Johnson: Welcome to "Afternoon Cyber Tea," where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. [ Music ] This week on "Afternoon Cyber Tea" I am joined by George Finney. George is a cybersecurity executive, CISO, and author known for his practical, leadership-focused approach to zero trust and enterprise security transformation. Among his books are "Project Zero Trust" and "Rise of the Machines," which both help articulate complicated frameworks in a simple and practical way. Welcome to "Afternoon Cyber Tea," George.

George Finney: Thanks so much for having me. I hope it's okay I only got coffee, not tea.

Ann Johnson: That's fine. That works. It's morning my time so anything with caffeine is helpful. So it's great that you joined me today. You've obviously led security in environments where openness, trust, and autonomy aren't just values. They're foundational to the institution itself. And universities themselves are fascinating environments from a security perspective. They are designed to be open by default. They're open to ideas, to collaboration, to research. And yet universities face the same threat pressures as many large enterprises. So how does all of that reality shape the way you think about security leadership?

George Finney: I think that commitment within higher education to transparency and openness helped change a little bit of the way that I think about security. We take some things in security for granted, and I've had to shift the way I approach things just to challenge some of the accepted notions that we've had. For me that's really resonated with my leadership. You know, when I'm talking to my general counsel, our auditors, or my CFO, I think they appreciate seeing the full picture on why we're doing things. And, you know, it was pretty clear that just using the standard fear, uncertainty, and doubt was never going to work. So, you know, I think instead building those relationships and trust, particularly over time -- I mean higher education is a little unusual in that our leaders tend to stay a lot longer than in other industries. So if you burn a bridge, that's going to impact your program for a long time. That has been influential in the way that I think about cybersecurity culture, and hopefully that's a good thing.

Ann Johnson: I think so, definitely. I will be candid with you. I haven't completely read your books, but knowing you were going to be a guest I did a little bit of research on them just to understand them. And I think that when you take that philosophy and your approach, it's something that will make cybersecurity better for the masses, or more palatable for the masses. People are afraid of it. Right? So given that fear of cybersecurity, the more you can actually make it open, the more secure it will actually make us. I keep saying that we won't need cybersecurity departments anymore because cybersecurity is everyone's job. Right? We'll need some expertise, but cybersecurity should be everyone's job. And the more we do that, and the more you advocate for that, I think is a great thing. And zero trust certainly lends itself to all that.

George Finney: I think you're spot on. Folks are scared enough about cybersecurity. I mean they see it in the headlines every day. I had to reach out to one of my department heads. This is several years ago. You know, there was a vulnerability. It was an issue. And we needed their help in fixing some things. You know, just the tone of your voice matters. I was very curt, shall we say. You know, we needed to get this done, and I think I was anticipating some pushback. And so I took a tone that, had I not already had a relationship with the person, could have caused real damage. You know, she pulled me aside after we had fixed everything in the incident and she was just like, "Hey, so, you know, I noticed this was a different George that I was hearing." And, you know, had we not known each other, that might have become a barrier to us working together moving forward. And when you think about really important things like zero trust -- after having written the book and toured all over the world, you know, talking to security leaders about it -- the common denominator that I've heard from other CISOs on why their zero trust projects failed was people. It was politics. It was communication. It was silos. And, man, you know, we don't talk about that enough in security -- about, you know, how do we break out of just the tactics and really make progress in a much bigger way? And I think all of that comes back to people and relationships.

Ann Johnson: And communication. Right? As a child I read this book in school called "The Boy Who Cried Wolf," and eventually everyone stopped listening to the boy. And I worked with a CISO once for whom everything was a crisis. Right? And at some point the executives just started tuning out because they were like, "Okay. Everything can't be a crisis." Right? They just stopped listening to that person. So how do you explain cyber risk to boards and executives when they're hearing a lot of noise? They're hearing a lot about risk. The answer isn't more control; it's better communication. It's better judgment. It's better accountability. How do you land those messages in a way that people listen to, so they're not just tuning you out because they're like, "Oh. It's just another problem"?

George Finney: You know, I love this question. You so often hear folks talk about, you know, just basic things like stop using jargon and acronyms. But I think what people hear when they hear that advice is, "You've got to dumb it down." And I got into a bit of a debate with one of my CISO peers at a large organization a few years ago, and, you know, they were basically saying their board is stupid. And I'm like, "Do you really believe that?" And I think when you work with executives, leaders, or boards, they are very talented. Right? These are very intelligent people to have gotten where they are. I don't think the answer is to dumb things down. I think you need to make it approachable and connect with them. One of my board members -- I don't want to name the name, but everyone has seen him on TV at some point. And, you know, I was giving a presentation about physical security, and we at the university ran what we call a bait bike program. One of the most common things that gets stolen off of college campuses is bicycles. So our PD had a bicycle with a GPS in it, and they would all get a text any time, you know, the bicycle moved. And they would all kind of descend on the area. And, you know, our bike theft problem went way, way down. But I'm talking to a billionaire. Right? That is a household name. And this resonated so much because when he was in college his bicycle was stolen. So when you can connect that back, whatever people's backgrounds are -- I mean security matters to people. It's a part of who we are. I mean it's a part of Maslow's hierarchy of needs. It's the foundation of who we are as people and how we set ourselves up for success and how we set our organizations up for success. People get that we need security, and I think, you know, it's about making it not just relatable, but making it approachable. Right? And that's what I've done with the books.
And I think that's the most special thing that I've gotten from writing "Project Zero Trust" and "Rise of the Machines": it's a story. And when I talk to folks, whether they're brand new to technology or whether they've been in security for 30 years, they get something out of it. And it's because I've made it approachable that I can go into the super technical deep dives and connect those dots. I think that's probably what got it there -- this last year "Project Zero Trust" was inducted into the Cybersecurity Canon Hall of Fame. And I think that's why that message connected with so many different people: I made it approachable.

Ann Johnson: I missed, by the way, the Cybersecurity Canon Hall of Fame news. I used to be in that group, and then I had to tell Rick I just don't have the time to read and review all the books. It was such a wonderful group. And, by the way, I use the Maslow reference a lot. So it's lovely to hear you say it, seriously, because it's like, folks, you know, shelter and safety are priorities one and two. Right? So, you know, you can't get to anything else unless people feel safe. And cybersecurity is a really good way to emotionally connect to that conversation. And to your point, with my family, who are not cybersecurity professionals, I connect a lot of it to the reason we have, you know, locks and cameras and an alarm on the house. Now think about defense in depth with cybersecurity. Yeah. If you can make the connections human, you don't have to dumb it down. You just make the connections the same as, you know, what people encounter in their everyday life. Right? And it helps. It really helps connect. So it's lovely to hear you talk like that. Can we talk a little bit about risk and governance and boards? So the one thing about boards is, no matter what type of organization you're in, boards are focused, to a certain extent -- a large extent -- on risk. Right? And when new technologies emerge -- I was talking last night to a friend about the emergence of AI versus the emergence of cloud in the middle of a sort of boring football game. We got into a technology discussion because there wasn't a whole lot happening on the field. And one of the comments, observations, I made was that people are not always early adopters. And with AI there are certainly some good use cases. And I remember the early days of cloud. Right? There were a lot of people that were like, "Oh, it's way too much risk to, you know, trust somebody else with my infrastructure," regardless of what the cost savings were. And then governance and risk lagged that adoption.
So can you talk just a little bit about how you talk to a board about governance, how you talk about risk, particularly in terms of newer technology, and how they should be thinking about it so they're confident in asking the right questions?

George Finney: I love that parallel with cloud. Right? AI is really on the same journey. So with cloud, right, a lot of the security community was doing exactly what you said. Right? We were saying, "Don't do it. It's not secure. We're not ready." Etcetera. And, you know, I think when you talk to a business leader, anybody who's gone through an MBA program has been taught something very specific. They all think this way. Right? They understand that risk equals reward. Right? When we talk about risk like, oh, you know, there's risk in moving to the cloud, right, your CEO or your board member is thinking, "I got you. Yeah. Let's take that risk because we're going to make some money out of this." And I think we've got to have a different way of talking about things. Right? So with cloud it did take years for the technology on the security side to be able to go and address some of those gaps. Right? The visibility that we lost from a security perspective was no joke. Right? Losing that visibility hampered our capabilities in a lot of ways. With AI, again, the conversations that folks are having with ChatGPT -- man, it's really hard to get logs out of ChatGPT. It's not necessarily intended as an enterprise product. And there are some of the challenges with Copilot, right. You need to be able to assure your leadership, your executive team, that you're protecting your data. Right? Do we have a good data governance program before we roll AI in? And, you know, does the AI get access to different things? What does it get? Oh my gosh. So I've started talking about it in a slightly different way, because I don't want folks to key in on the risk-reward kind of thing. I talk about danger. Right? There are some things -- not everything in AI, but there are some things that are an existential threat to the organization. And we've got to find a way to talk about those and connect with folks to change the narrative.
There are some differences with the AI versus cloud parallel. I think more so than ever security teams, CISOs, have a seat at the table. We're now talking to leaders about AI where we maybe weren't even included in the conversation like 15 years ago. So that's really good news. So "Project Zero Trust" and "Rise of the Machines" tell a story. Right? That's a fictional company with characters. It's kind of a case study in doing zero trust. And then "Rise of the Machines" is, you know, the same company. They now have to respond to AI and figure out how to apply zero trust to all of these different LLMs or what have you that are out there. And what I kind of realized coming into it was that the problem today is like the old sci-fi saying. Right? Any sufficiently advanced technology is basically magic for people. We can't think about AI like it's magic. It's super complicated math that hardly anybody understands, including me, but I think if you can break it down into something maybe more understandable or approachable, again, that starts to help us have the conversation about how we can use it, how we can leverage it, without the danger elements. So I use an analogy for understanding AI in "Rise of the Machines." And essentially you already know how AI works because it works like a restaurant. Data are the ingredients. Right? They come in the back. Your models are essentially the recipes. You might come up with your own brand new recipe, or you might use someone else's. That's just like the frontier models versus maybe some of the small language models that are more curated. Then there are tools. Right? You might have a fancy pizza oven in your restaurant that you imported from Italy, or you might just have a fryer if you're a hole-in-the-wall burger joint. But depending on the tools you have, that kind of dictates the kind of restaurant you're going to be. And generally speaking you like to keep the customers out of the kitchen.
And there needs to be infrastructure to separate that out. Right? You know, there are guardrails. But there are AI firewalls now, and you need to understand how to operate those. And most of us aren't actually even operating a restaurant. We're doing like Uber Eats and getting our AI delivered to us through SaaS applications or embedded in our existing tools. So thinking about that big picture, right, we know the bad guys are attacking the restaurants. They're stealing the ingredients. They're stealing the recipes. They're manipulating the tools or, you know, the Yelp reviews or whatever to get us to change the way we're doing things. And then, oh my gosh, this is such a great analogy. Now I can think, well, I just need to integrate these AI things into my existing security stack and figure out how to address the underlying lack of visibility. That's the fundamental issue. It's the parallel with the cloud, because if we don't see it, we can't secure it. And oh my gosh. That has just really revolutionized my conversations with the leaders that I work with in my organization.

Ann Johnson: I think that's fantastic because again you've broken it down, I have to say, into ingredients that they understand. Silly pun intended. But, you know, the old expression: how do you eat an elephant? One bite at a time. It's the same thing with all of this technology. And again, you're speaking in a language they can understand, instead of talking about all the things we talk about, like nation-state actors and all those lovely things that, you know, terrify people about cybersecurity. Let's talk about zero trust then in more depth. Zero trust unfortunately has become a buzzword, and there's a little bit of exhaustion around it. However, if you again take it to the level that a board or executives understand, they care about their exposure. They care about the impact. How have you seen security leaders successfully message zero trust to explain cyber risk in a way that resonates beyond, you know, just the security team or the technical teams?

George Finney: I would say, you know, besides helping me launch an award-winning book, I think it helped me get my current job. I think that message resonated so much because I was able to articulate security in a way that maybe executives hadn't heard it before. You know, zero trust -- the definition I use -- and I collaborated with the gentleman who created zero trust, John Kindervag, to write the book. He just happens to be a friend of mine. He lives just a couple miles away from me here in Dallas. But zero trust is a strategy. It's a strategy for preventing or containing breaches. And we remove the trust relationships that we have in digital systems to effectuate that. Right? We know from studying the bad guys that trust is the thing that they exploit to get into different systems. But when I talk to a leader, a CEO or board member, and I talk about strategy -- right, how am I going to be successful at this job that I do, which they don't necessarily fully understand? -- they do understand strategy. Right? They understand that to be successful in any part of the organization you need a strategy for success. How am I going to get there? How am I going to measure those things? Okay. Those are all things that we can start to talk about, but, you know, that gets us out of the down-in-the-weeds tactics conversation about, oh, I need this new tool, another, you know, budget ask to go address this risk that maybe is real, or that we don't really know what it's going to look like in a few years. It's going to evolve. So instead, I can reframe the conversation: here's how we're going to go about doing the thing. This is the goal, preventing breaches. This is the path to get there. I can connect everything back to that.
And even more than that, a strategy is really about getting multiple different groups to work together towards the same goal. And, man, if I can create a cohesive cyber program that not just includes security and IT, but also involves my accounting group, and legal, and, right, HR -- all of these groups already do security in some way today, whether it's HR doing background checks, whether it's audit preventing fraud, or whether it's, you know, accounts payable, you know, not falling victim to business email compromise or identifying suspicious activity. All of these things are already security functions. They're not on my team, but they're a part of my team because, like you said earlier, security's everybody's job. Wanting to bring in a strategic leader who can get all of those things working together in concert -- that's really transformative when you're becoming a cybersecurity leader. I think for the other folks that are talking about tools and tactics and threat intel and the rest, well, that's not something that's actually going to resonate as much with leaders inside different organizations.

Ann Johnson: Makes perfect sense. Can we talk then just a little bit? We've talked about what works. Now let's talk just a little bit about what do you hear from your fellow peer security leaders that they actually unintentionally weaken their message when they're talking to executive audiences or they're talking to boards?

George Finney: This is my favorite question, Ann. I'm going to get up on my soapbox. I haven't been up on my soapbox yet, but I don't know if you know this. We have a secret motto in the cybersecurity industry. Everybody says it like it's gospel. What do we say? We say people are the weakest link. What are we doing when we're telling that to other people? We are undercutting our own message.

Ann Johnson: We're offending them too.

George Finney: We're, you know -- when you talk to any CEO, right, what do they say the organization's biggest asset is? They say the people. Right? They can't get the job done without people. And so if you come at them saying, "Man, if we could just get rid of all the employees in our company we'd be great. We'd be totally secure." Well, that's true. You know, we could also unplug all the technology in our organization and we'd be perfectly secure. We're not going to do that. So, man, I think shifting that message -- and I like to say people are the only link, right, because it's not just, you know, technology. Right? It's people using the technology. It's not just process. It's people following the processes. And so when you can connect all of that together and talk differently about security, I think that does resonate. I definitely have seen that in my own career. I've seen that with some of the folks I work with. I think the role of the CISO itself is evolving away from being the technical person that is down in the weeds, and 5 or 10 years from now I think that evolution is going to continue. I think where we're going is really having the CISO be, you know, more of a conductor, you know, conducting all of the things in the organization. I think of myself more like a coach. I do things like a coach. I drill folks. I train them. I put them in the right roles to be successful. And, you know, at the end of the day, right, the team is the one that plays the game. You know, I'm just maybe calling the plays, but, yeah, I think you can have different elements and have them come together to be successful. So anyway, I think that to me is one of the biggest issues: getting away from that mindset that people are the weakest link or people are the problem, and finding a different way to frame that.

Ann Johnson: Yeah. It's actually awful to articulate it that way. We talk a lot about digital empathy, which is one of the concepts in cyber -- that, you know, if your systems are so weak that one human being clicking a bad link causes, you know, a wholesale outage, then it's not a people problem. It's a systems problem. Right? And that's how we try to frame it. The systems need to be empathetic to the humans that are using them. And they need to be resilient to the humans that are using them. So I've been trying to get that messaging out. It's great to hear you say it too, because "humans are the weakest link" is actually reasonably offensive to people, and it doesn't make them want to work with the security organization, certainly. So looking back on your career, what's one insight about leadership or risk or anything that you wish you had understood earlier?

George Finney: Gosh. I've just been so lucky to have had some influential leaders in my career that have helped me kind of become, you know, who I am today and challenged me. And I think culture is so important when it comes to organizations. And, man, I'm an introvert. Talking to people is draining. I can do it -- obviously I write books and I go on podcasts and give speeches -- but it's a challenge. And I think you've got to build that culture that's supportive of folks so that they can reach their full potential. But, you know, really seeing that and believing it takes that experience. That really, I think, shapes a lot of who I am. And, you know, I think I was always relatively driven, but being able to take risks in your career is also really important -- you know, doing different things that maybe are outside of your comfort zone. And in many ways in security we have to be dedicated to lifelong learning, because we're having to secure the tech that's right on the bleeding edge, whether it's AI or whatever the next new thing is. So for me that's been one thing. Like, oh my gosh, how do I stay current? I can teach classes. Right? That helps. Writing books is a part of my own professional development. But, you know, kind of embracing that has, I think, changed the game. But again, it was about that early culture that I had, you know, here in higher education that was really supportive. Well, one of the reasons that I moved over to higher education was so I could get a law degree. And I'm a lawyer today, but I don't practice. But I think embracing that, and feeling like it's something you can go do and something you'll get benefit out of even if you don't practice law -- I think thinking about that is huge. And especially in security we really do have to understand all of the things about an organization. And, you know, it just takes a lot of time and dedication to get there.

Ann Johnson: Well, George, we're just about wrapping up. And every time we wrap up "Afternoon Cyber Tea" we like to end with a bit of optimism, because despite all the issues and challenges we have, I am always optimistic about the future of cyber. I know that for everything you see in the news there are thousands of things that have been blocked or detected by cyber professionals. So tell me what you're optimistic about with cyber today and the future of cyber.

George Finney: I don't want this to be an anti-AI conversation, so I just want to say I love AI. I use it every day. I have a kid at home, and she came and asked me, "Hey, dad. Can I use ChatGPT on this project that I'm going to work on for school?" It was extra credit and she didn't have to do it, but of course she came to me the day before, like most kids do. Right? And I was like, "You know, you can use ChatGPT, but we're going to spend the same amount of time that you would have if you had just done this by yourself." So we probably spent, you know, four or five hours on it, but at the end of the project she had written a choose-your-own-adventure story. The assignment was like four pages. It ended up being a 16-page choose-your-own-adventure story with full illustrations. And at the end of it, you know, I mean it was a good five hours. But I was like, I would have been proud to have written this as a senior in high school. And we produced it together. You know, she was in like the third grade at the time. So, you know, I would say I love AI, but I think it's up to us to challenge ourselves to do even more. You know, I mean, I look at kids going into college today, and you can take a business course for one semester and completely stand up a business -- not just with a business plan like maybe, you know, you might have 10 years ago, but you would be able to build a product and a website and a marketing plan. And the power that we have today is only limited by our own initiative and imagination. And I think I'm really optimistic that we're going to be able to unleash an amazing amount of creativity to the world, but it starts with us and aspiring to something higher. So that's what I get optimistic about.

Ann Johnson: I completely agree with you. AI is a great tool, but it is one of many tools, and we have to treat it like that. It is another tool -- a very powerful tool in our arsenal. You know, the funny thing, you were talking about your child. I have a daughter who's out of college now, and we were literally having a conversation the other day about the fact that she doesn't write in cursive. She went to Catholic school, so she certainly learned to write in cursive, but she doesn't write in cursive. But the other thing is she cannot read time on an analog clock, and I never even realized that until recently, because she's never had to. She's never had to read time on an analog clock. She's like, "I don't know what that says." And I'm like, "I've told you over the years." She said, "Yeah. I tuned it out because I've always had some type of access to a digital clock." I'm like, wow. Is that a skill we're going to really miss? Right? I don't know. You know, things change. Right? We don't use an abacus much. You know what I'm saying? So I don't know if reading an analog clock is a skill we're going to need in the future. But I do know that AI is a very powerful tool, but it does have to be harnessed. And I think the best quote I have is from our deputy CISO for AI, who told me AI is a toddler. He said, "And as long as you remember that AI is a toddler and you have to harness the power of it, you're going to be very successful with it." He said, "It will grow and mature, but you just have to learn how to use it." Well, George, thank you so much. This has been a wonderful conversation. I appreciate you joining me, and I hope you have a great rest of your day.

George Finney: Thank you so much for having me.

Ann Johnson: And to our listeners, thank you so much for joining us on "Afternoon Cyber Tea" and join us next time. [ Music ] We asked George to join "Afternoon Cyber Tea" because he is a subject matter expert. He also has a very human approach to cybersecurity and makes it really practical for any audience so you're not always communicating in technical terms. It was really a pleasure to have him on. Very engaging. And I was just thrilled. [ Music ]