Hi, it's Elise Loehnen, host of Pulling the Thread. I'm thrilled to welcome today's guest, Amy Edmondson, longtime Harvard Business School professor and the author of Right Kind of Wrong: The Science of Failing Well. On this show, we pull apart the web in which we all live to understand who we are and why we're here. Pulling the Thread is about big questions:
why we do what we do, how we can understand our own experiences within a larger spiritual and historical context, the ways in which we might begin to understand ourselves and each other better, and what's required to heal ourselves and our world. I'll be joined in conversation by luminaries and wise elders, those who have laid tracks in their work and lives to help us bring meaning and understanding to a world that often feels chaotic and overwhelming.
My hope is that these conversations spark moments of resonance and plant tiny seeds of awareness so that we might all collectively learn and grow.

You know, it is natural, it's all but hardwired, to resist failure, to not want to be blamed. It's an instinct that's very powerful, because we don't want to be rejected. We don't want to be thought less well of, which is why, you know, the things that I write about... let's face it, organizations that are truly world-class, whether it's a scientific laboratory or an innovation department or a perfectly running assembly line, they are not natural places, right? It's not that, left to their own devices, humans will just create places like that. No, they take really hard work, good design, good leadership, a kind of daily willingness to stretch and grow, you know, independently and together. A short way to put that is it takes effort to create a learning environment. It really does. But it can be done.

So says Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School. Early in her career, she worked as the chief engineer for architect and inventor Buckminster Fuller, which started her on the road to reimagining how we're all impacted by the world around us. She then became the director of research at Pecos River Learning Centers, where she designed change programs in large companies. And now she's an academic, focused on how teams function and evolve, along with the essential dynamics of collaboration required in environments informed by uncertainty and ambiguity.
What sort of environments are those? Well, almost all work environments. A significant point of Amy's research and focus is the necessity of psychological safety in teamwork and innovation. Effectively: how do you create an environment where people feel like they can fail in the right direction, where they're learning and taking risks toward evolution and growth, even when they might not get it right the first few, or even few hundred, times? This is the focus of her latest book, Right Kind of Wrong: The Science of Failing Well. Okay, let's get to our conversation.

Ultimately, obviously, the book argues that failure is the bedrock of innovation and progress, and that the sort of driveshaft of problem solving is a willingness to fail. But for whatever reason... I'm sure when you tell people what you do, you encounter nothing but, like, oh, what a downer.

Right. Absolutely. And, you know, in part, if you think about it, I could just as easily walk through the doorway of, you know, experimentation or risk or progress or innovation and have the same conversation. And so in a way, it's my fault. I walked through the doorway called failure.
But it works, right? Because don't we, I mean, you write about this, but we all have a negativity bias, right? As much as we want to disown our own failures, we're obsessed with other people's. That's absolutely right. So we do have this negativity bias, like bad is stronger than good, as I write. And that's well documented in the research literature, not mine, but other people's. And so that means we're both, you know, overly sort of interested in and okay about other people's failures, but also overly sensitive to our own. And especially compared to, you know, the things that go well: we discount those, and then, you know, are too worried about the ones that don't go well. Yeah. So interesting how, with just a little bit of objectivity, we see the opportunity to learn and evolve.
Maybe we should define some terms for people. You talk about sort of the spectrum of mistakes, right? Can you give us, like, the framework in which you think about failure, and the terms that you think are so important to distinguish? Sure. Absolutely. So I'll start by saying, yes, it's an overly broad term. Let's recognize that part of the challenge here is that failure is a very broad term, and it encompasses all sorts of things that didn't go as hoped, maybe even as expected. And so it applies way too widely to so many things. But the term failure is still important, because we need to come to terms with failure. So I identify three kinds of failure. You know, the simplest, and the kind that, frankly, it's easy to understand why we're allergic to, I call a basic failure. And a basic failure is a single-cause failure, generally one where a human error or mistake led to a failure. You didn't follow the recipe, the cake fell. You didn't study for the exam, you failed the exam. These are basic failures. It's very obvious why they happened. They were, in fact, in a very real way, preventable. And so those... I promise you I'm not celebrating basic failures. I am saying they're not shameful. They happen. We should learn from them. We should use them to grow and develop. But I'm not saying let's have more of them. No, quite the opposite. Let's have fewer of them.
The second kind are complex failures, which are multi-causal. They're the kind of failures that happen in familiar territory when a whole bunch of things line up in just the wrong way: maybe you forgot to set your alarm clock, and there was more traffic than you expected, and, you know, you hadn't filled up the gas tank, whatever. You know, a bunch of different things that led to you being impossibly late for an important meeting, but not one simple thing. And those also are not good news. They're largely preventable. We do our very best, again, to prevent them. And now we draw a line in the sand, and that's where we get to the third kind, which is the intelligent failures. And these are the ones that are genuinely productive and useful, because we learn from them. They are thoughtful forays into new territory. An intelligent failure is essentially the undesired result of an experiment, you know, where the experiment was worth doing. You thought about it, and it seemed like it might get you the result you wanted. And yet you were wrong. Oh, well. That's okay. And I'm describing it now as if it's a sort of individual thing, but an intelligent failure also describes a clinical trial that fails to demonstrate efficacy for a new cancer drug. It's new territory; there's literally no way to get the result you need to move forward on this potential medication without doing that trial.
You wanted it to work. You had good reason to think it would work. But alas, it didn't work. Yeah. And scientists, I mean, you tell a lot of stories in the book about all sorts of failures. It's fun to read. Scientists sort of inherently, based on their craft, right, are much more comfortable with the fact that they will make a hypothesis and get it wrong, what, most of the time?
Most of the time. I mean, it depends on the field. But if you're in, say, a leading-edge biological chemistry laboratory, it's very likely that 70 to 80 percent of the experiments you're running end in failure. And it's not because you're not good at your craft. It's because you're so good at it that you're taking bold risks. You know, you've surveyed the literature. You know what's known. You know what isn't known. You have a hypothesis that this cool thing might work, might be true, might describe nature. You do the experiment. And sometimes you're right, which is pretty darn exciting. You publish in the top journals. But more often, you're wrong.
And you train yourself not to be upset by it, right? Because if you're upset by it, you're just not going to last very long as a scientist. But if you recognize it as a worthy effort, you know, this was a good idea to test this. Possibly I'm the first person in the world to test it. And it wasn't right. But now I can also be the first person in the world to do that next experiment with this new knowledge that nobody else yet has, you know, and maybe the next experiment.
informed by this experiment will end in success. What do you think is, like, the major distinguishing psychological trait there? I know you write about sort of growth mindset and fixed mindset, but it feels like in the land of uncertainty, which, ironically, is science, even though we think of science as so discerning and so discrete and definable and exact... versus certainty, where there is certainty, there's a known way to do something, so failure there is a little less acceptable. What do you think it is that's distinguishing? I think it's probably training and practice and just getting your comfort level up with the nature of the sport you're playing.

It's not that scientists are, I mean, most of them, I think... are they born different? You know, they just sort of are born and say, I don't mind failure, failure is great, I'm all for it? Right? No. I mean, they just get interested in some phenomenon. You know, they love biology, they love physics, whatever it is. And they get interested in it, and they are good at it, and they learn, and they love the learning. And they look around and they realize, if I'm going to stay in this game, I've got to be okay with the fact that you can't be right all the time. Like, the baseball player who has, you know, a .300 batting average is in the Hall of Fame. That means they're missing 70 percent of the time, right? But they're... right, they're the best. So, I mean, you just realize... I always say, and I'm married to a scientist, you know, how do you get out of bed in the morning and fail 70 percent of the time? It's not because you're a different kind of human. It's more because you get it. You know what that 30 percent feels like, right? You know it's worth it. It's the price you pay to get to that 30 percent, to get to that big discovery and that fantastic publication.
And it's part of the culture, right? This is part of a lab, this is the sea... And that's what I think is so... I know you work at HBS, right? You're primarily working with businesses where this isn't necessarily the culture. Can you talk a little bit about psychological safety? And then we can talk a little bit about how lack of psychological safety is so dangerous.
Absolutely. So first of all, I want to say thank you for mentioning that it's the culture, because we were describing this almost as an individual property, but it isn't. It's an individual activity and mindset, but it takes place in a culture, and it's reinforced by and fed by the culture. And that's really important. And the specific feature of the culture that I think is absolutely critical, which you just asked about, is psychological safety, which I define as... frankly, it's a learning environment, but I define it as a belief that you can take the interpersonal risks of asking questions, raising a concern, you know, sharing a mistake, admitting a failure, and all the rest. It's a recognition that these kinds of learning behaviors are hard for us, for people, and yet they are less hard in an environment that supports it, right? And that's an environment of psychological safety, one that basically says, you know what, interpersonal risks are part of the work we do. They're part of learning. We've got to make it easy for each other to take them. You know, we do what we can to make that happen. And a healthy scientific lab is definitely a psychologically safe place. It's a place where you can quickly ask for help when you're not sure about something, or quickly point to an experiment that failed. You don't want your colleagues to make the same mistake, wasting more resources and money, when we already know this one doesn't work. So it's really important to share those things quickly. But psychological safety is valuable for learning, innovation, and problem solving.
And it's not the norm in most work environments, as you described, even by alluding to the business community, broadly speaking. I mean, the general culture of the business community is not, hey, failure, that's great, right? Maybe there's some verbiage about that, some kind of Silicon Valley speak about that, but most of the time it's, you know, you make your targets or else. You succeed or else.
Yes. And low tolerance for mistakes, which you sort of tease out as different. Or this cultural idea that somehow allowing mistakes, which are very human, means that you have low standards or that you're running a sloppy operation, and that you can engineer perfection. But meanwhile, what you show is in some ways the inverse. Can you tell us the story about how you even came to this idea of psychological safety?
Let me first just pick up on what you just said, because I think it is a classic error in organizations. It's understandable that an organization, whether it's automotive or aviation or patient care or tech, doesn't want mistakes. Like, none of us want mistakes. Here's the problem: if you just say, let's not have mistakes, and really, you're going to be rewarded and punished and so forth based on whether or not you make mistakes, that doesn't make mistakes go away.

I mean, we can remove some mistakes by just trying harder and so on, but there will be error. So what that really accomplishes is it makes mistakes go underground. It makes people not speak up about them. It makes them harder to catch and correct before really bad things happen. And that's, like, the worst of all possible worlds, right? You still have human beings making mistakes, but you're less able to do what you need to do to catch, correct, learn from, and, you know,
prevent the really big ones. And indeed, the way I stumbled into, or started to think about, psychological safety as an environment that mattered in the modern workplace was through a study of mistakes. I was part of a team of researchers that was looking at the phenomenon of adverse drug events and medical errors in hospitals. And I was not an expert in medicine in any way, shape, or form, but I was an expert in teams, and I had the opportunity to have people fill out a survey that would assess how well their teams were working as teams. And so, you know, my hypothesis was that the better teams would have fewer mistakes and adverse drug events, because they're better at teamwork. But what happened was I kind of stumbled instead into the insight that the better teams were more open and willing to talk about error. It actually turned out that... I mean, this is obvious in retrospect, but it's not so easy to measure error rates in many workplaces. Some are, because they're really objective. You can go to the end of the assembly line and count up the errors. But many errors that happen in workplaces are hidden, and many of them don't cause real harm. I mean, they had the potential to cause harm, but they don't automatically cause harm. So getting people to talk about errors, encouraging people to talk about errors, is not easy, but it is important. Well, and it's very... In a workplace without psychological safety, where there's a lot of blame, shame, et cetera, it's a threat to someone's identity. It's a threat to their livelihood. It is understandable
why someone would not confess to messing up, right? Right. The last thing I would be saying is that they're, you know, bad and wrong for not speaking up. No, their environment was perfectly... tuned to, or, you know, let me say it a different way, their environment was perfectly designed to have them not speak up about error, right? And they are products of that environment, and it would be both unscientific and maybe even unethical to expect them to be heroes, in the sense of expecting them to do things that they firmly believe are not in their best interest. Instead, you have to create an environment where they understand that it is in their best interest, because it will be appreciated. There will be gratitude for the fact that you spoke up.
So can you talk a little bit about how, you know, you write about how blaming is such a natural instinct. We have such... you know, you write about the three-year-old in the car where the father, like, hits,
you know, messes up, and the mirror comes off, and the three-year-old is like, it's not my fault, it wasn't me, I didn't do it, Papa. Yeah, I didn't do it. But that sort of I'm-in-trouble, I'm-in-trouble instinct is so strong. It's so human. So how do you see people successfully keeping that threat system low?

Yes, it's such an important question, because you are right. It is natural, it's all but hardwired, to resist failure, to not want to be blamed. You know, it's an instinct that's very powerful, because we don't want to be rejected, we don't want to be thought less well of, which is why, you know, the things that I write about... let's face it, organizations that are truly world-class, whether it's a scientific laboratory or an innovation department or a perfectly running assembly line, they are not natural places, right? It's not that, ah, left to their own devices, humans will just create places like that. No, they take really hard work, good design, good leadership, a kind of daily willingness to kind of stretch and grow, you know, independently and together. And so a short way to put that is it takes effort to create a learning environment. It really does.
But it can be done. We can see places where people realize, hey, I'm a fallible human being working among other fallible human beings. Chances are pretty good that a few things will go wrong that we really didn't expect or want, but we're going to stay alert to it. We're going to, you know, do our best, because we recognize... we've been sort of reprogrammed to say, yeah, I'm a fallible human being, so are you, things will go wrong. Great. And then also, like we were talking about with the scientists, we can teach people and help people to realize that on the leading edge of any endeavor, whether it's scientific or culinary or athletic, if you're really on that leading edge, you're in new territory. You're discovering things. You're trying things that maybe haven't been done before. And any such effort will bring risk, you know, bring risk that it might not work. It also brings the possibility that it might work. So in a way, you reorient your thinking to say, I'm willing to take that risk, right? Because of the potential upside, I'm willing to live with the downside
of it not working out, or of me being wrong. Yeah. It's interesting. I think you were writing about X at Google, and whoever you were profiling was saying that if he had to do layoffs, he would lay off the people who had never failed. Is that accurate? Basically, yeah. So it's Astro Teller, and he was running X, and, you know... it wasn't recent, it was a couple of years ago, before COVID, one of those periods where people were, for whatever reason, a little bit anxious about layoffs. And they expressed that as, yeah, but how am I supposed to take risks now, when there could be these layoffs? And he said, well, that's exactly when you should be doing it, because otherwise you're not really adding value to us. I mean, we're an innovation factory. This is different than if you're, maybe, you know, running a nuclear power plant. If you're not failing, like, if you're not risking and trying wild new things, some of which don't work, you're probably not adding much value around here, right? I mean, it's sort of flipping our natural, normal mindset on its head, which is, it's probably good to just lay low so that nobody notices me if I fail. Rather than, no, no, if I'm not noticing you failing, you're not
the highest-value employee that we have. No, it's interesting. But then flipping, you know, you mentioned a nuclear plant, but flipping it into not necessarily nuclear-high stakes... you write a lot about Toyota, or mechanisms within companies for allowing employees to report errors, anonymously or not. But just that idea of, like, what's it called? The andon board. Right. The andon cord, which is... the andon cord is both, you know, a practical tool and a symbolic message, because the andon cord is a cord that any team member on the front lines of the assembly plant can pull if he or she notices not just a problem but, like, a potential problem. And think about how important that difference is, right? Because it's one thing to say, you're smart enough to detect a problem, please let us know right away. But what they're saying is, if you hypothesize that something might not be quite right, we want to hear from you. Right? We really do. So you pull that cord.

Now, many people have heard of the andon cord and they think, oh, that stops the line. It doesn't stop it right away. It basically calls a team leader over, who will then help you diagnose whether the thing you noticed is really a problem or not. And 11 out of 12 times, at least at one point in time, it wasn't a problem, right? So 11 out of 12 times, the line doesn't stop. It just keeps going. But think about the symbolism of that. It's visible. It's right there. And it's basically saying, all day long, we want to hear from you. We value your brain. We believe you are alert and seeing things, and we need you. This is a team sport. That's what it's saying. And so in well-run organizations and learning organizations, there are mechanisms to just make it
easier, right? Make it easier for people to do the hard things of speaking up about error or questions or concerns. And you talk about, sort of, how you can also create systems, like they do this in airlines, right? Where you can... not lodge anonymous complaints, but anonymously report errors of your own or other people, without attribution or blame? Yes. I mean, anonymity is kind of a double-edged sword, because it definitely makes it easier for people to report. On the other hand, it subtly sends the message that it might not be safe. So I think it's still worth doing, but you have to realize you're doing it with a potential risk, which is to convey, yeah, it really is dangerous to do this with your name on it. But I think it's a step. It's like training wheels on a bicycle. It helps you, and maybe it collects things that would otherwise be missed. I mean, it certainly does collect things that would otherwise be missed. The gold standard would be that everybody just knows that's how we roll, right? We speak up. Yeah. And it's okay. And no one ever got fired for it, right? And no one ever got, you know, reprimanded for it. In fact, the opposite: they'd get praised for it or thanked for it. It is really hard to imagine doing, right? Right. Owning our own fallibility. I was thinking, too, and this was fascinating to me, but you write about the fundamental attribution error.
And this idea that you write about, this is Stanford psychologist Lee Ross: when we see others fail, we spontaneously view their personality or ability as the cause. It's almost amusing to realize that we do exactly the opposite in explaining our own failures, spontaneously seeing external factors as the cause. And this is true, right?
It's true, and hilarious if you think about it. Now, when social psychological findings are true, it doesn't mean it's, like, 100 percent of the time. It means there's a meaningful difference in our sense-making about other people's failures and our own. And part of it, I think, is, you know, deep down it's so frightening for us to think it might be our fault, right? Because that feels like a kind of death. So I have to say, yeah, yeah, yeah, but the situation, right? The context, there was too much traffic, you know. Versus, okay, you know, I contributed to it and the situation contributed to it. And the same is true for you. And, you know, if I see the failure in another person, I'm realizing that, yep, some of it's them and some of it's the situation, and it's... A complex combination. Complex, too. How do you break that tendency in people? Is it just awareness? I think awareness... clearly, awareness is an important step. How much awareness gets you there is debatable. I don't think it's enough. I think it's awareness plus practice. Once you become aware of the fundamental attribution error, it's hard not to see it. You know, you trip on the sidewalk and you think, oh, anyone would have done that, right? It's, like, it's a bumpy sidewalk, rather than, oh, I'm clumsy. You know, but it's both, right? Yeah, it was a little thing there. But it's both. And it's always both, in a way. And so part of it is just having that awareness. And then I think the second part is just practicing the new thinking, the new, more productive, more learning-oriented thinking.

No, I've been thinking a lot about, and you write about, this idea of fear and how fear inhibits learning, which I think makes a lot of sense, right? Like, to anyone who's listening: you think about these instances of failure, and just talking about it excites my nervous system, right? And makes me feel
like a similar threat to my identity, particularly as a woman who aims for perfection in everything that I do. But of course, right? But this idea of... I don't know if you've ever heard of the Conscious Leadership Group, or their framework of being above or below the line. But it is this idea that, as humans, our tendency is to be below the line most of the time, 90-something percent of the time. And when we're below the line, we see the world happening to us.
And we are looking at where to place blame. And they talk a lot about Karpman's drama triangle and sort of figuring out where you are in that triangle. But when you can get above the line, when you can sort of corral... not even corral your fear, but you get above the line. Yeah. You put it in a pen, and that's... From a distance, right? From above. It's similar to what Ron Heifetz at the Kennedy School at Harvard calls, you know, getting on the balcony. He uses that phrase. It seems like the same thing cognitively, you know, neurologically: you're just trying to get a little bit, not just distance, but, like, above. You're looking at it from above, because you can see the bigger picture and you can look at it more dispassionately. Dispassionately, impersonally. And also, as they talk about it, below the line, sort of, life, the world, is happening to you, but above the line it's happening by you and through you, and you're co-creating your reality. You're not blaming. You're seeing problems and obstacles as allies for your own learning. It's sort of another frame for what you're talking about, which is similar.
Yes. And to do it, like you talk about at some point, psychological safety being, I'm going to mess up the word, but essentially expressed through the culture, right? It's not a personal quality. Right, an emergent property of the group. An emergent property, yes. And so how do you, in that way, create sort of above-the-line, emergent psychological safety environments, where learning is the ally?
That's the holy grail. That's the big question. But I think the word you used just a few minutes ago, co-creating, is a really important part of it. Like, in a way, you have to do it yourself, but you can't do it alone, right? Like, you have to do your part. But when you're with other people, and you're sort of working this through aloud, right? Yeah, this went wrong, here's some things I contributed, we work on getting ourselves back up above the line together. I earnestly believe
that this is easier to do with supportive others than by yourself. Like you can really get stuck, right? You can get stuck below the line. You just don't see any doorways out, but a good friend or a good colleague will kind of open the door for you and say, well, come on, let's think about it this way.
Just at that moment when it was sort of hardest for you to get out of the fear zone and into that learning zone, they opened the door for you, and vice versa. I think the way we do this is we start talking about it. We sort of support each other and hold each other accountable for getting into more productive, learning-oriented thinking, and less of the, you know, it's happening to me and I'm powerless, right? You're the victim
that way, and that's not a good place to be, or a good place to stay, even when, you know, even when it's, like, 100 percent true, as in the case of Viktor Frankl and his magnificent memoir, Man's Search for Meaning, where he is literally the victim in Auschwitz, you know, a death camp. He manages to get himself above the line, right? He manages to say, I'm going to think about this differently. I'm going to think about the incredible bravery I see every day among my colleagues, and I'm going to remember their stories, and I'm going to share them when I get out. Right? So that's a kind of deliberate shift, from something that is literally being done to you, a horrific below-the-line place, but you decide internally, I'm going above. He, of course, didn't use those terms; he didn't have those terms. But yeah. It's empowering, to say the least. Yeah. And it's this difference between being a victim and victim consciousness, or sort of attaching to that as a consciousness. Like, you can be a victim and be like, I refuse to participate in this sort of consciousness or this energetic field. So both things can be true simultaneously. It's interesting, too, like, to go back
to the conversation in the book about checklists and Atul Gawande, and sort of this idea of affixing as much certainty as we can, and how helpful that is, right? Like, there's no reason to be reinventing scripts on the fly when we have basic checklists that we can follow. But then also to recognize that these systems... you can mechanize them as much as possible, but it's still alive, and there are still all of these other human forces at play. And so it has to be co-creative.
You can't just routinize it without thinking about it or participating with it. It won't do it by itself. The checklist won't ensure the safe flight or the safe surgery by itself. It takes conscious engagement with the human being. The human mind has to engage with it consciously to make it work. And, you know, it's so interesting, because when checklists and protocols first came into medicine, you know, the physicians were like, go away, you're trying to dumb down medicine, you're trying to take away my autonomy as a clinician, as a professional. And fortunately, that thinking was able to be changed to, no, in fact, this is just an incredibly helpful support tool, so that your big brain doesn't have to be tied up trying to remember all the items on the list. Those are right there. It's free to make judgments. It's free to notice things that you might otherwise miss. It's free to do its highly educated, highly important job. So you had to get people to reframe what the checklist was, from the boss to the servant. Right. And then to imagine it as sort of, like, an iterative process that could potentially be continually improved and/or rethought or reengineered. It doesn't mean that you can, like, unhook your mind; if anything, it's just, like, giving you the train tracks. But it's interesting to think about that in the context of... I mean, I don't even know how to bring it to AI, but as a writer, I think about AI, and I'm like, I wonder, does this
reduce the need for so many copywriters, or some of the more... not perfunctory, because I love a copy editor. I love someone who really understands grammar. They're, like, my favorite brains and favorite people. But being a writer... it's like everyone's favorite activity. I worked at a magazine called Time Out New York, and most of the letters to the editor were, like, pointing out typos. People love to do that sort of check.

There's something very satisfying about it. Just a little victory, right? So it shows you how good we are at error detection, right? Yeah. But yes, you know, I'm wondering about that too, right? Because how is this possibly a good servant, right? Is AI possibly good for a writer? Or is it a, okay, park your brain at the door and let it do your work for you, and it'll be, you know, good enough? And is it just gonna, like, escalate factual errors? It's sort of only as good as the consciousness that creates it, right? Or the information that feeds it. Even though now, like, AI can code... like, programmed AI can, like, program itself. I don't quite know. It breaks my brain. But do you think about it as either escalating failure points or being a process to find failure? I don't know. Do you have any idea? AI breaks my brain. Okay, I know, it breaks mine too. And so we won't even try to speculate. But there's no question that uncertainty has just gone wildly up. We talked a bit about the negativity bias, but just to go back there for a minute: do you think, I mean, you write about Daniel Kahneman and this idea that
we're so attuned to that, it's like threat reduction, right, for our survival. Is that enough? Do you think that explains the negativity bias and why we're so scared? I think so. You know, I think that's probably a philosophical question, but it makes sense to me to say that we would have a negativity bias because the risk of, you know, real harm outweighs the upside of potential gain. Especially in prehistoric times, if you could lose your life by, you know, not noticing
some threatening creature or situation, you'd want to be very tuned in to those kinds of situations so you could stay alive and, you know, reproduce, and then we're all here. Whereas, you know, the upside maybe would have been smaller and less life-changing. You found a nice peach or you didn't. Yeah. No, I think that makes sense. But it's interesting to think about that as sort of one of our primary programs that we're running. We have a survival bias that biases us to be just kind of threat-sensitive, and maybe less able to just be, like, joy-sensitive. Yeah. Have you met anyone in all of your work, as you've gathered these stories from so many different fields... have you met, even if it's only in pages, someone who has sort of successfully overridden that tendency? I think, yes. I don't think it's the case that anyone, well, certainly anyone I've met or studied, would be sort of 100 percent, let me use the word, enlightened, you know, learning-oriented all the time. But I have met people who are just more consistently able to kind of catch and correct their thinking errors, if you will. I think it's fair to say... I didn't know him super well, but I think it's fair to say Maxie Maultsby was one of them. An African-American psychiatrist who studied under Albert Ellis, studied cognitive behavior therapy, then sort of rebranded and tweaked it into something he called rational behavior therapy. The main idea was, it wasn't for, you know, people who are really struggling and who are in a clinical situation with a psychiatrist, but for sort of all of us, in our more day-to-day lives, who are guilty of unhealthy, unhelpful thinking about the various things that happen in our lives. And he believed we could do better. Now, I have to say, I have met very few people, with the possible exception of Chris Argyris, who were as rational as Maxie, you know. They almost had the capacity to just be dispassionate about, like, things that go wrong. Yeah. Right. And they can't have been born that way, right? They had to have trained themselves and then gotten good at it, you know, the way an elite athlete is just
so much better than a normal athlete at doing what they do, because they practice. And it was called, was it RBT? RBT, Rational Behavior Therapy. Yeah. And can you explain, like, how it interrupts that pattern? I think you gave the example, the story of the guy who was learning how to play bridge. Yes, he's a high school student who's, you know, a very good student and a good athlete and so on. It's the Minneapolis area, it's cold winters, and I guess his friends were liking the game of bridge, and they asked Jeffrey to, you know, join them. And of course, there's that challenge of trying to learn something new, especially something hard like the game of bridge. He's not good at it right away, and he was really frustrated by what he called his mistakes, which technically were mistakes. And so he was going to quit, because it just wasn't fun, and he was certainly not making it fun for his friends either. And then he just happened to take a course at school on this rational behavior therapy, not because he was connecting it to the bridge game, but just because it sounded interesting. And it taught him that, you know, when you're frustrated, when you encounter a problem or a mistake or a challenge, you have to kind of pause and challenge your automatic thinking about the situation. Like, this is really bad, I made a mistake, you know, I'm stupid, or, it shouldn't have happened. And say, well, wait a minute, let's challenge that thinking, which is very spontaneous and, you know, almost just happening. It's not like you're generating it deliberately. But anyway, pause and challenge that thinking and say, wait, this is a brand-new game that's incredibly sophisticated. I've only just started learning. There's no reason on earth to imagine I should be an expert at it yet. Maybe, you know, maybe the mistakes that I make are just bits
from which I have to learn to get better. And, you know, everything I'm saying right now is so obvious, right? But we don't spontaneously react to our shortcomings and mistakes that way. He was just applying the lessons from the course to the bridge situation, and it changed how he thought about it. He reframed the thing, it's a sort of growth mindset, from evidence that I'm no good to evidence that this is in fact hard and I'm a learner, a beginner. And that, of course, made it more fun and made the missteps just useful data from which to learn. And gradually he got better at it, enjoyed it more, his friends enjoyed him more. And so, you know, it's a simple example, but I'm sure, you know, all our listeners have stories or situations in their life where, you know, you get frustrated, you want to give up, you just
make that conclusion that you're just not good at this, when in fact you're a beginner. Yeah. No, you're a failure. Meanwhile, no, I'm learning. I mean, it is true. You said the word fun. And I think, even just thinking about the different contexts in which you discuss failure in the book, from the automotive assembly line to Google X, it's like, when there's an element... and we talked about this idea of co-creation... when you can get enough distance to be like, this is fun, this is a game, this is a learning opportunity, it's iterative, like, what's going to happen? That's fun. Right. Rather than checking the boxes to ensure a certain outcome: I'm in threat, I am scared.
It's a completely different mindset. Yeah, that's true. Well, thank you for your book and all of your work. You can put almost anything into this context. It's obviously kept you busy for a career, right? Yes, it has. And can I just say, thank you for reading it so thoughtfully. I mean, what a joy it is. Of course. And you're saying, oh, but you wrote this, and you said that, and there's this story, and remind us of that one. I'm like, that's what a writer longs for, as you know: a good reader. Well, no, my pleasure.

I always love a book of case studies, particularly when it spans industries. And she tells a lot of great stories from her career,
including the story of the creation of Veuve Clicquot, the champagne, and the number of failures endured while it was coming to market. She really can't overstate, nor can I, the importance of psychological safety, which I think is one of those ephemeral concepts that's essential to getting everyone out of a threat response. It really, really is. She writes about this: psychological safety, which means believing it's safe to speak up, is enormously important for feeling a sense of belonging. But belonging is more personal, while psychological safety is more collective; it is conceptualized in research studies as an emergent property of a group. And I think it is co-created by individuals and the groups to which they wish to belong.
So may we all think about the psychological safety that we're creating for each other, and therefore for ourselves. To me, it seems like one of those concepts that's contagious. A group has it collectively, or they don't. But may it be the positive type of contagion. If you liked today's episode, please rate and review and tell a friend. You can find show notes and full transcripts of the episodes at eliseloehnen.com. While there, please sign up for my Substack.
I send a short note every Wednesday about topics that are aligned with this show, and a deeper dive most Sundays. Or follow me on Instagram at Elise Loehnen. And finally, if you haven't already, please consider picking up a copy of my New York Times bestselling book, On Our Best Behavior: The Seven Deadly Sins and the Price Women Pay to Be Good, available wherever you get your books.
It's an exploration of how women have been conditioned for goodness, men for power, and all the ways we've been programmed to police ourselves and each other according to these cultural ideas of what it is to be a good woman. I'd also like to give a huge thank-you to my sponsors, who make the show possible. Please support them the way they support this podcast.
This has been a presentation of Cadence13 Studios. If you enjoyed this episode, please listen, rate, review, and follow Pulling the Thread, available now for free on Apple Podcasts, Spotify, Audacy, or wherever you get your podcasts. I want to give a shout-out to Phil Svitek, Lauren LaGrasso, Mary-Kate McDonough, Allie Brockman, and the entire Cadence13 team for producing these episodes, and to Valero Duvall for my key art. Take care of yourselves. I'll see you next week.