
The Psychology of Speaking Up with Amy Edmondson
Ann Johnson: Welcome to "Afternoon Cyber Tea," where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to the groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. Today I am joined by Professor Amy Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School. Amy is not just an expert in psychological safety in the workplace. She is the expert. The author of seven books and over 60 papers, Amy's research examines team and leadership dynamics, the emotional skills today's leaders need, and how to build fearless organizations in any industry. I've been looking forward to this conversation for quite a while, Amy. Welcome to "Afternoon Cyber Tea."
Amy Edmondson: Thanks so much for having me.
Ann Johnson: So, hot topic. There's a lot going on in the world right now. People aren't necessarily feeling psychologically or emotionally safe in their lives or in their workplaces just because things are a bit unstable. So could we start just by talking about psychological safety? It's a term that's used a lot in organizational development. It's a term that's used a lot in teams. But I'm not sure people always use it properly. So how do you actually define it, and what does it look like in today's organizations?
Amy Edmondson: In fact, Ann, I'm quite sure people are not always using it accurately. And not their fault. The term itself has a kind of implication of comfortable and cozy and nice, and that's just not what it is. So let me first give you my formal definition of psychological safety. It describes a climate in which people believe their voice is welcome, where they believe they can take the interpersonal risks of speaking up with an idea, a question, a concern, a mistake, a dissenting view. And not that it will be easy and fun all the time. It usually isn't. But they believe it's welcome. They believe it's what we do around here.
Ann Johnson: So when you think about that -- by the way, I keep coming back to, and I hope you appreciate the humor -- I don't know if you've seen "The Princess Bride," but I keep going back to "I do not think that word means what you think it means."
Amy Edmondson: That's right. I haven't seen it in years I have to say.
Ann Johnson: Yeah. That phrase just always sticks in my head when -- because in cybersecurity there's a lot of words people use that they don't actually know what they mean. But anyway. Back to psychological safety.
Amy Edmondson: That is related to psychological safety, because that desire to look good rather than to be good and to lean into the work is partly driving that reluctance to ask, "Is this what I think it means?" People are just so caught up in self-protection and impression management.
Ann Johnson: Yeah. I was really young in my career, but I remember a manager saying to me that old expression: it's better to be quiet and thought a fool than to speak up and remove all doubt, or something like that.
Amy Edmondson: That worked fine in the industrial era. Right? It's complete bunk in the knowledge era.
Ann Johnson: Yeah. And in cyber we talk about early signs. You know, the earlier you can detect something, the better you can avoid it and prevent it from happening. Can we apply that to psychological safety? Are there early signs? And how does a leader look for those signs?
Amy Edmondson: It's so relevant. Many of the case studies that I have done, where I've gone into great detail on, say, the Columbia shuttle disaster of 2003 or many other real disasters, were literally avoidable had people spoken up in a timely way. I can't tell you how much I think about and value speaking up about early warning signs. Psychological safety actually describes an environment where people are willing to speak up about early warning signs. They're not worried about, "Oh, how do I look?" They're like, "This could be nothing, but I'm going to raise it. I'm not going to be afraid of being called Chicken Little, crying that the sky is falling when of course it isn't." So I'm much more interested in that topic, right? That people can speak up about early warning signs of a potential breakdown or failure. But early warning signs about psychological safety itself, whether or not it's present -- I don't think about that quite as much. But to freewheel a little, I think in today's complex, turbulent world an early warning sign is the signal that doesn't happen. It's the bad news, the questions, the dissent, the mistakes, the failures that you're not hearing about. So if you are the leader of a team and you're hearing an awful lot of good news -- everything seems to be green and nothing seems to be red -- that is probably a warning sign that you don't have enough psychological safety, because it just can't be the case that things aren't going wrong or that people don't see things differently. But it can be the case that you're not hearing about it.
Ann Johnson: Yeah. You talked about Columbia -- I'll go off on a tangent for half a second, and I'm going to age myself here. I went to college in northern Utah and was actually in the student union having breakfast, watching the Challenger launch. We were all incredibly excited because a lot of the kids' parents worked at Morton Thiokol. And, you know, reading back on how that all actually happened, it's another example of where it cost human lives because people didn't feel they could speak up.
Amy Edmondson: Exactly. And in fact I recall exactly where I was when I heard about the Challenger disaster. I was in the New York Public Library doing some research, which tells you how very long ago it was. And I called a friend to check in at work, and I learned that this had happened. I remember it so vividly.
Ann Johnson: It's things like that, but I also want to bring it back to cyber for our listeners. I had a guest on a few years ago who talked about mental burnout, because we operate constantly. Right? This is a constant high-stress, high-pressure environment. It intensifies if there's an incident, and there are plenty of people wanting to point blame. When you operate in an environment like that, can you talk about what leaders can do proactively to create more safety in a continually high-stress environment?
Amy Edmondson: Yes, because this is in fact the kind of environment where psychological safety is most important. And maybe ironically, when leaders call attention to the fragility, the complexity, the ever-present potential for breakdown, that makes it more psychologically safe, not less, because fundamentally it makes it discussable. It makes the reality of the situation discussable. And when leaders don't do that, people will naturally think of the situation in the old-fashioned way, the conception of the work environment where people are supposed to hit their targets and always do a good job and expect certainty and be perfect. That is not the world that cybersecurity professionals live in. So when leaders call attention to that reality, to what's at stake and how much very real uncertainty and complexity and interdependence there is, that gives permission for people to speak up about it. You're saying, "We should expect things to go wrong." Right? The only real question is, will we hear about it? Will we hear about it in a timely way?
Ann Johnson: Yeah. And what can we do about it in advance? But also, you know, we talk about breaches and events, and we talk about the fact that they're going to happen. It's how you respond and recover. And I would imagine the same thing applies when you're creating your organization. Right? Things are going to happen. How you respond and recover sets the tone for the organization.
Amy Edmondson: Exactly. And, you know, I did some of my early research in the hospital environment, and it echoes what you just said. Things are going to happen. There will be breakdowns. There will be coordination and communication breakdowns. There will be unexpected responses to medications that we haven't seen before. Right? It is a given. Healthcare is a complex, error-prone system. That's a given. But by naming it, and by naming it early and often, it gives people permission to be part of the catch-and-correct system.
Ann Johnson: I like that, the catch-and-correct system. Well, let's talk about failure. You wrote a book called "Right Kind of Wrong" that talks about the difference between intelligent failures, basic failures, and complex failures. Can you talk a little bit to our audience about that framework? What does it mean? What are the differences? And how do you respond when something doesn't go to plan?
Amy Edmondson: Well, it does depend. Right? So let me describe the framework first, because I think the framework helps us answer the last question, which is how do you respond. First of all, you always want to respond in a productive, learning-oriented way, but let me clarify. So, the three kinds of failure. Basic failures are when we accidentally or unintentionally deviate from a known practice or process or rule. And unintended is an important attribute here, because if it's an intended deviation, that's sabotage or mischief. Mistakes are unintended deviations in familiar territory. Some mistakes will lead to a failure. Fortunately, some mistakes don't; we just get lucky. But when a mistake leads to a failure, it's a basic failure: a failure in familiar territory with a single, simple cause. Honestly, in good organizations, well-rounded organizations, high-performing teams, we do our very best to reduce basic failures to zero. And it's doable. Right? It's vigilance. It's supporting each other. It's speaking up. And we can make those go away in high-reliability organizations. Complex failures are similar in that they also occur in familiar territory, where there is for the most part a right way to do something. But these are the perfect storms. These are the system breakdowns. These are the failures that happen because not one but a handful of small deviations come together to lead to a breakdown or failure. I imagine that many cybersecurity failures are complex failures. And they happen, they're on the rise, because of the complexity of our systems. There are so many ways things can go wrong. And again, we can do our best to catch and correct. When people are really vigilant, really alert, catching and correcting the small errors, the small deviations, before they come together and cause harm, we really can operate, as you know better than I, at a high level of uptime, at a high level of failure-free operation. Now, the third kind is really different. The third kind is what I call an intelligent failure, and that is an undesired result of a thoughtful foray into new territory. So it's actually an experiment, a kind of deliberate experiment. I'm trying something to see what will happen or work in new territory. And I hope to be right. I even have good reason to believe my experiment might be right. But alas, I'm wrong and I get a failure. Scientists and inventors live and die by intelligent failures. They do this for a living. I think the same is true for elite athletes or celebrity chefs. They're people who are constantly experimenting on the leading edge of what's possible, with good reason to believe something might work, but willing to take the disappointment when it doesn't.
Ann Johnson: Yeah. I think that's right, and I often say to my team, "Fix the system, not the person," and then explain what that means, because if you put even the most talented individuals into a system that is inherently broken or hasn't been stress tested, you set that person up to fail in many ways.
Amy Edmondson: Exactly. I love the term stress tested because our systems are complex and you need -- that's what pilot projects are for, to stress test, to find out where the vulnerabilities are so that when it counts, when the stakes are high, it's far more likely to be reliable.
Ann Johnson: And we're all always moving so fast. I'm in tech, so we're always so worried about innovation that people don't often slow down. I equate this to disaster recovery. When I talk to customers, I'll say things like, "You all have a great plan if there's a natural disaster or fire or something like that. You need to have the same plan for cyber." But guess what? All those plans -- you need to actually test them to learn where the breaks are.
Amy Edmondson: Exactly right.
Ann Johnson: Well, let's talk more about your research. I want to talk about fear. Right? Because fear is a great human motivator. So can you talk a little bit about how fear of failure or even shame will impact team performance, especially in a field like the one that I'm in, cybersecurity?
Amy Edmondson: Well, there's a real distinction, and I think this is very important, between fear of failure and shame that is private and borne alone, and fear of failure that is collective. It's actually okay for us, right, for our team to be explicitly and collectively afraid to fail, because it's discussable. We're talking about it. We're saying, "Wow. We really don't want this to happen. So let's talk. Let's figure it out. Let's be very open, very vulnerable, because the consequences of failure are simply too great." But if I am afraid of failure on my own, and we all are, and it's not discussable and I'm sort of suffering in private with that fear, then I'm putting myself and all of us at risk.
Ann Johnson: I think that makes perfect sense, actually, especially in the world that I live in. And shame often follows failure. The challenge we have in cyber is there's always blame, and then somebody gets fired. Right? So if you're in a high-visibility role like a CISO or another security leadership role, what can you do to actually encourage openness? Satya Nadella, the CEO of Microsoft, has an expression he calls embrace the red, which speaks to what you were just describing. So what can leaders do to reduce shame and embrace the red?
Amy Edmondson: I love that phrase. Welcome the red. Embrace the red. The red is the bit that allows us to get better tomorrow. So it's a treasure. It's a gem. Whereas the green -- okay, we expect to be pretty good at what we do. We're experts. We're well trained. We have great organizations in some cases. So it is not that interesting to talk about the green. It's far more valuable to embrace the red. And so I really appreciate that. In healthcare, long before patient safety and the patient safety movement became so robust, there was a phrase that described the culture: accuse, blame, criticize. When something went wrong, which unfortunately it often does, the response was to find a culprit. And so that blame-and-shame culture was dominant. But some leading voices in the patient safety movement realized it wasn't working, and it wasn't going to help care become safer and more robust. So they had to shift the culture to one of recognizing that this is a complex, error-prone system, and when you're speaking up, you're not blaming. You're saving lives. So it's a reframe. And I think in cybersecurity, and in any complex work environment, it's absolutely essential to call attention to the complexity, to remind people -- meaning put it front and center in their minds again and again and again -- that things will go wrong. What matters is how we respond and how quickly we respond.
Ann Johnson: Exactly. So let's talk about leadership qualities, specifically a leader being vulnerable. Right? How does a leader being vulnerable play into how we build trust, how we respond, how we build safety in these teams I keep talking about that are constantly under pressure?
Amy Edmondson: The way I think about vulnerability, and this may be a little odd, is the following. We are all vulnerable in the sense that we don't have a crystal ball. None of us can see the future. None of us can control all of the events in our teams, organizations, or lives. And so vulnerability to me is simply the act of acknowledging that. Acknowledging that we are vulnerable to events and forces outside our control. And naming that is a very powerful invitation for other people to name it as well.
Ann Johnson: I like that. Just leaders speaking up. Right? Security leaders creating space for the conversation.
Amy Edmondson: Right. Right. Create the space. Create the sort of emotional relational space where we can tell the truth to each other.
Ann Johnson: So let's talk about a hard one: managers who want to get their more junior employees, who are just coming in, to speak up. By the way, any team I've worked with will tell you I'm a big believer in avoiding groupthink. You want people from different backgrounds, because they're going to challenge existing ideas, and I think that's where you get your best ideas. So how do you create an environment where that junior person, maybe new to the team or younger in their career, can speak up safely even if it contradicts someone who's more senior on the team or someone who's been there longer? How does the leader empower that?
Amy Edmondson: I think leaders should convey a frame that says, you know, a good idea can come from anywhere. An observation, a catch, can come from anywhere. It can come from the most senior person. It can come from the most junior person. So just calling that out as a fact, which it is -- and it may be more likely to come from more experienced people because they're more sensitive to deviations, but truly anyone's voice could make a difference at a crucial time. Simply recognizing that is step one as a leader; saying it aloud, and often, is the next step. If you ask a leader, "How often do you really step back to think about the implications of the VUCA world, the volatile, uncertain, complex, ambiguous world that you're operating in?" -- we all know that, right? We've heard it 1,000 times. But how often do you stop to think about what that really means? It means we don't have a clear line of sight on the future. It means that anything can happen. So I believe that leaders should call attention to the absolute requirement of diverse inputs for making good decisions, but also diverse inputs for ordinary ongoing operations, where anyone could help catch and correct.
Ann Johnson: I love that. Let's move to a term you call teaming. You've talked about how it's a dynamic process. Can you describe teaming, but also talk about how it applies to something like a cybersecurity incident response team, which is actually a virtual team that comes together really quickly and will disband just as fast? How does teaming apply there?
Amy Edmondson: In fact, that is teaming. That's the very definition of teaming. When I started talking and writing about teaming some years ago, it was because so many of the "teams," in quotes, that I was studying, say in healthcare and other settings, were not the teams of yesterday, by which I mean they were not stable, well-managed teams with stable membership, a clear goal, and access to the resources and coaching that they need. All the team research basically said those are the factors you want to get right to have an effective team. But there I was in healthcare: 24/7, different specialties needed at different times, just as you describe, these sort of virtual teams that have to swarm at various times. And in some cases they're not even coming together; they're handing things across the boundaries between their disciplines or shifts. The coordination and the collaboration are mission critical, but that coordination and collaboration was happening in the absence of a stable team structure. The old theories aren't useful in helping those teams do well. Instead of the hardware of teamwork, it's the software. So it puts the spotlight on communication and coordination and speaking up, and being super clear about what I see, what I need, what I need from you, where I'm trying to go and where you're trying to go. Right? It puts a heavy premium on communication and awareness of how much our performance depends on our seamless coordination, even though, to a certain extent, we've never done anything exactly like this before.
Ann Johnson: Yeah. We talked about how you stress test a plan. I think in order to get to that place, if you're coming together suddenly in that teaming sense, you need to have a plan and you need to have tested the plan.
Amy Edmondson: Right. Right. And you recognize that the teams, or the teamwork in question, will be somewhat virtual. Right? It will be people who absolutely need to work together but may never have done that before, these exact people, and they may not even do it again, but they still need to do it well.
Ann Johnson: All right. So I'm going to be blunt with this question. Can you create a high performing team without psychological safety?
Amy Edmondson: No. And, you know, as a researcher I'm interested in variance; I should probably never be so definitive. But here's how I think about it. There are, I like to think, two dimensions. Right? Psychological safety, which we can think of as the interpersonal climate. How much vulnerability is there? How much confidence is there that if I speak up, you won't think badly of me, you won't reject me or humiliate me in some way? That interpersonal climate can range from low psychological safety all the way up to high psychological safety. That's an important dimension, especially in an uncertain, complex, interdependent environment. And the other dimension is the performance climate. How much do we feel deep ownership of our work and our performance? How much do we care? How motivated are we? How skilled are we? How aware are we of the things that need doing? And how much are we willing to really feel that ownership? That's the performance climate. High-performing teams are those with both: a strong performance climate and a strong interpersonal climate.
Ann Johnson: Yeah. I love that. And I think a lot of leaders need to hear that, because you can drive performance short term. Right? You can. But you can't build sustainable teams that way unless you create an environment where everyone actually wants to thrive and wants to be successful.
Amy Edmondson: That's right. And too many leaders, and honestly I've heard this hundreds of times, will say, "Okay. Psychological safety. I see how that can be important for learning, and I'm all for learning. I just don't want to scale back on performance aspirations." And I say, "Yeah, I understand how you might see it that way, but it's actually a faulty mental model." These are not two ends of the same spectrum. These are two spectra, and we need to lean into both. We need motivation and a strong performance climate and high psychological safety. That's where the good comes from. That's where high performance lives, in a volatile, uncertain, complex, ambiguous world.
Ann Johnson: Love that. I love that you keep coming back to that too because it's consistent. People, as you know, need mental models and frameworks that are --
Amy Edmondson: Repetitious. Yes.
Ann Johnson: Yeah. Yeah. That was the word I was seeking.
Amy Edmondson: I really do think our mental models are vestiges of a simpler time, right, of an industrial era where you could be super clear about targets and tasks, and you could see the future from here -- a year from now, five years from now. You would have something called a five-year plan and you could believe in it. We just don't live in that world anymore.
Ann Johnson: No, we don't. The world is just moving so fast. Speaking of the world moving fast: we still have a lot of people who are hybrid or remote, even though we're getting a little more back in the office. There have to be different ways that you interact with those teams. So how do you build psychological safety in teams that don't have the water-cooler conversations, that aren't getting to know each other as much on a personal level?
Amy Edmondson: The short answer is: more deliberately, more proactively. The fact is that physical distance, as opposed to physical proximity, raises the hurdle for speaking up. Right? If there's something I want to share with you -- I have this little concern about the process or something -- am I going to have to reach out and set up a meeting with you, or send you a well-crafted email? Whereas if I can just see you, I'll force myself to say, "Okay, Ann, there's something -- it might be nothing, but I'd love to get your thoughts on this." So knowing that the hurdle is just a little bit higher, definitionally, we have to lean into more structure. Structure is your friend. Right? Okay, let's have quick stand-up meetings, or sit-down meetings, at a regular rhythm, to make sure we have time to share the little things that might be nothing but might be something. You can design a process or a structure that works for the work that you do or lead. But you can't just assume it will happen organically, because it won't.
Ann Johnson: Yeah. That makes sense. That makes perfect sense. It's intentionality. Right? Instead of just going day to day and assuming things are going to happen organically. How do you recommend leaders measure safety and willingness to fail? We have this thing called Connects twice a year -- the performance reviews; we just give them a nicer name. But how do you measure without actually making it part of somebody's performance review?
Amy Edmondson: The most important thing that every leader can do today is call attention to context. Do it early. Do it often. And by context I'm particularly interested in what the stakes are. What's at stake here for our customers, for our people, for the world? And how much uncertainty is there? That is my simple way of thinking about context. And fortunately, even at Microsoft, there are contexts where there is a high degree of predictability, and there are contexts where there's a low degree of predictability. But by calling attention to it, you're constantly reframing and upgrading people's mental models so that they understand what they're doing here together to catch and correct. And if I could say a second thing, it's ask good questions. Any time you lean into a good question -- one that is open ended, maybe focused on the issue at hand, but really, truly conveying that you want to hear what someone else has to say, or what anyone on the team has to say -- who has a different view, what other options are there, who sees it differently, what are we missing, how might a competitor approach this, can you walk me through your logic? If you can get in the habit of inquiry, and then deep listening in response, you will change the culture of your team and you will change the performance of your team.
Ann Johnson: I love that. And I think that's super easy for people to implement too. So I close out every "Afternoon Cyber Tea" with optimism. I actually call myself a cyber optimist. And I normally ask guests what they're optimistic about in cybersecurity, but given our conversation I'd love to know what you're optimistic about from an organizational standpoint or just for the future.
Amy Edmondson: You know, I'm optimistic that when people take seriously these leadership skills, these interpersonal skills, and start to practice them, they truly can do amazing things together. I think it's possible for us to have really bold ambitions, both for, say, cybersecurity and for innovation -- really tough goals, like operating safely all the time or coming up with new, exciting products and services. Those are hard, but I think we can have those visions and those aspirations so long as we're willing to team up, willing to develop the skills we need to team up, accept the failures along the way, and learn fast and repeat. So if we see the journey ahead as a learning journey, that fuels my optimism.
Ann Johnson: I love that. Because Satya is a big believer in growth mindset, we also have what we call a learn-it-all culture.
Amy Edmondson: Yes. I love that you're a learn-it-all culture. I think it's such a lovely contrast to the know-it-all culture that can easily infect tech and other industries.
Ann Johnson: Exactly. Well, Amy, I know you're incredibly busy. You're an expert in your craft, and I could ask you thousands of questions. But I do appreciate you making the time to join us today. Thanks for sharing your insights with our community.
Amy Edmondson: Well, those were great questions, and it was great to be with you.
Ann Johnson: And many thanks to our audience for tuning in. Join us next time on "Afternoon Cyber Tea." [ Music ] I invited Professor Edmondson on the show because her work on organizational learning, safety, and intelligent failure is incredibly relevant for cybersecurity leaders. She offered a lot of practical advice to help teams feel safe to speak up and take smart risks, and her insights into how to build resilient cultures are applicable to everyone. It was a fabulous conversation, and I know listeners are going to learn a tremendous amount from this episode. [ Music ]
