Podcast Introduction
00:00:06
Speaker
Welcome to Critical Matters, a sound podcast covering a broad range of topics related to the practice of intensive care medicine.
00:00:14
Speaker
Sound provides comprehensive critical care programs to hospitals across the country.
00:00:19
Speaker
To learn more about our programs and career opportunities, visit www.soundphysicians.com.
00:00:26
Speaker
And now, your host, Dr. Sergio Zanotti.
Learning from Failure
00:00:32
Speaker
To err is human, yet in healthcare and in critical care, we still have an unhealthy relationship with failure.
00:00:38
Speaker
In today's episode of the podcast, we will discuss the science behind learning to fail.
00:00:42
Speaker
In other words, failing well.
00:00:44
Speaker
We will discuss our flawed relationship to failure, how to better understand failure, and more importantly, how to move from a culture of blame, who did it, to a culture of learning what happened.
Guest Introduction: Amy Edmondson
00:00:55
Speaker
Our guest is Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School.
00:01:00
Speaker
For the last two and a half decades, she has studied the elements of high-performing teams in complex environments.
00:01:05
Speaker
She has coined the term psychological safety and has made critical insights into teaming, learning from failure, and innovation.
00:01:12
Speaker
She is the author of several books, multiple academic papers, and a regular contributor to the Harvard Business Review.
00:01:18
Speaker
Her latest book, Right Kind of Wrong: The Science of Failing Well, was released in September of this year in hardback.
00:01:24
Speaker
I am truly honored and grateful to have her back on the podcast.
00:01:27
Speaker
Amy, welcome back to Critical Matters.
00:01:29
Speaker
Thank you so much for having me back.
Failure Terminology in Healthcare
00:01:31
Speaker
So I would like to start with a rapid fire of just some terms that are often thrown around when we talk about failure and just get your reactions.
00:01:42
Speaker
One of them that we've heard over and over again in the business world and in the startup world is fail often and fail fast.
00:01:51
Speaker
My reaction is that that is too indiscriminate.
00:01:57
Speaker
That is appropriate advice for some contexts, but not for most contexts.
00:02:04
Speaker
Failure is not an option.
00:02:07
Speaker
Again, failure is not an option is an interesting one, because I believe it's trying to convey the idea that we absolutely must do our very best because of what's at stake.
00:02:20
Speaker
But it can inadvertently drive truth telling underground.
00:02:25
Speaker
So it can backfire.
00:02:27
Speaker
So I think whenever we want to use that term, failure is not an option, which I think is our aspiration in very high stakes situations.
00:02:35
Speaker
We want to make sure we're crystal clear that we're saying: because we don't want to fail, we need to hear from you.
00:02:44
Speaker
And last, we don't fail.
00:02:45
Speaker
We are winners; we don't fail.
00:02:51
Speaker
Again, lovely sentiment, and it needs to be a bit more nuanced.
00:02:56
Speaker
There are times when we fail.
00:02:59
Speaker
There are times when we fail badly, and many of those, as human beings, are preventable.
00:03:07
Speaker
So, yes, we should learn from all failures, and we should be quite discriminating about the kinds of failures that
00:03:17
Speaker
we're proud to have experienced and those which we are really going to try very hard to never have happen again.
00:03:26
Speaker
And before we go into our troubled relationship with failure, especially in healthcare, as a physician, I mean, obviously we deal with failure on a daily basis, yet we still deal with it in a very, I believe, unhealthy way.
00:03:41
Speaker
Could you define failure and how is it different from an error?
00:03:45
Speaker
Well, failure is a larger, broader term.
00:03:47
Speaker
Failure is an undesired result.
00:03:51
Speaker
An error is a deviation from a known practice, a known process.
00:03:58
Speaker
I often think of the easy way to talk about this.
00:04:02
Speaker
There is a recipe, but we messed it up, didn't use it properly.
00:04:08
Speaker
So a mistake, the concept of a mistake,
00:04:12
Speaker
only exists when there is knowledge about how to get the result you want.
00:04:17
Speaker
So many failures are caused by mistakes, but many failures are not at all related to mistakes.
00:04:26
Speaker
They are undesired results in new territory.
00:04:30
Speaker
And I think that that's a very important distinction to get us started, especially when we're talking to healthcare providers, because we have been taught that all failures are wrong.
00:04:41
Speaker
And we talk about failure and error and mistake almost in the same context and are unable to talk about it with maybe a better framework, which I hope that you will provide us today.
00:04:53
Speaker
So why don't we go ahead, Amy?
00:04:55
Speaker
Just my initial reaction, my immediate reaction, to that wonderful point about health care is that I think it's easy to lose track of the historical perspective.
00:05:07
Speaker
The remarkable work that you do today in medicine and health care
00:05:15
Speaker
all exists because of a willingness to experiment, again, I think almost always in cautious, thoughtful ways throughout the history of medicine.
00:05:24
Speaker
If people had not been willing to experiment in the early days, for example, of cardiac surgery in the 50s, we would not have that medical miracle today.
Human Aversion to Failure
00:05:37
Speaker
Now, can we talk about our troubled relationship with failure and maybe start with our aversion to failure?
00:05:44
Speaker
How does that impact our learning?
00:05:47
Speaker
You know, our aversion to failure as human beings is largely our aversion to our own failures, right?
00:05:53
Speaker
It turns out that we have less aversion to other people's failures.
00:05:57
Speaker
In fact, sometimes they give us an unhealthy
00:06:01
Speaker
sort of sense of relief or worse.
00:06:04
Speaker
But our aversion to failure, I believe, is quite instinctive.
00:06:08
Speaker
It's an emotional reaction because we want to be successful.
00:06:13
Speaker
We don't want to fail.
00:06:14
Speaker
We want to look good, not bad.
00:06:17
Speaker
And so I think it's both somewhat automatic, but also well-learned through our childhood and our socialization that lead us to
00:06:29
Speaker
really want to distance ourselves from failure.
00:06:33
Speaker
And what do we commonly get wrong about failure, Amy?
00:06:36
Speaker
One of the things I think you clearly outlined already is that failure is not always bad.
00:06:42
Speaker
That's one of them.
00:06:43
Speaker
But what else do we get wrong about failure?
00:06:46
Speaker
Well, I think what we get wrong about failure is a sense that it has to be either or.
00:06:52
Speaker
Either we're embracing the fail fast rhetoric or we're saying, no, failure is off limits.
00:06:59
Speaker
Failure is not an option.
00:07:01
Speaker
And you sort of have to take sides.
00:07:03
Speaker
And then some people will say, well, there's a balance.
00:07:06
Speaker
I don't think it's a balance.
00:07:08
Speaker
I think it's a
00:07:10
Speaker
very specific, you know, context-specific set of principles so that when you are in, say, a scientific laboratory, if you're experiencing no failures, you are not going to publish very well as a scientist.
00:07:25
Speaker
In other words, if you're on the leading edge of any field, you are experiencing very likely far more failures than successes because your experiments haven't been done before by anyone in the world.
00:07:38
Speaker
You might have a very good hypothesis and Mother Nature doesn't agree with you.
00:07:42
Speaker
So in that context, that would be a context where you must be failing or you won't be succeeding.
00:07:49
Speaker
Similarly, let's say you're an elite gymnast.
00:07:52
Speaker
If you're only doing safe moves that you can do in your sleep, you're not going to make it to the Olympics.
00:07:58
Speaker
If you're trying harder and harder things, which involves failing along the way to getting them just right,
00:08:04
Speaker
you will have a better chance of ultimate success.
00:08:08
Speaker
So we have to be thoughtful about failure.
00:08:10
Speaker
We have to say, well, under what conditions is failure truly on the path toward success?
00:08:17
Speaker
And under what conditions are we going to do our very best individually and as groups to prevent them?
00:08:27
Speaker
And I think another aspect of failure that people might not appreciate is that it's not that easy to learn from failure.
00:08:35
Speaker
We think it is, but I think we'll talk about it more.
00:08:38
Speaker
It's a bit complicated.
00:08:42
Speaker
It's, you know, I think oftentimes our learnings are quite superficial.
00:08:47
Speaker
It's sort of, okay, I don't want to do that again.
00:08:50
Speaker
I just move forward and don't look back.
00:08:52
Speaker
We often don't take the time...
00:08:55
Speaker
to really look at what happened, what was my thinking, what were the actions, why were we wrong, and what does that tell us about what we should be doing going forward.
00:09:10
Speaker
So I don't mean to imply a lengthy, lengthy analysis, but a thoughtful one that is clear-eyed and willing to do a little bit of that hard work.
00:09:23
Speaker
In medicine, historically, we have examined failure through the very traditional and often toxic M&Ms, morbidity and mortality conferences.
00:09:34
Speaker
And I think some teams are moving in the right direction.
00:09:38
Speaker
But obviously, having trained before these ideas were part of our mindset and in a different culture,
00:09:46
Speaker
it seemed that we had that hero mentality that we would just put our head down and say, I will do better next time.
00:09:51
Speaker
But there was really no learning there.
00:09:53
Speaker
Could you talk a little bit about the failure and the blame culture?
00:09:56
Speaker
Yes, I do think, you know, the M&M is a good example of this.
00:10:02
Speaker
Imagine the following, right?
00:10:07
Speaker
How many failures in, let's say, your medical center, and I would say that to any listener, are caused by people engaging in what we might call blameworthy acts, where they really woke up and came to work impaired, or just mailed it in, did sloppy work on purpose, or worse, deliberately set out to sabotage a case, right?
00:10:34
Speaker
And most people's answers will be, well,
00:10:37
Speaker
I mean, when something like that happens, it's very, very rare. And I'll say, OK, I'm with you on that.
00:10:44
Speaker
And then I'll say, well, what percent of
00:10:47
Speaker
failures does your organization, your medical center treat as if they were caused by blameworthy acts?
00:10:53
Speaker
And then either, you know, a gasp or a laugh and people will say, well, most of them.
00:10:59
Speaker
And to me, the key to an effective M&M meeting is that we're making that thoughtful distinction.
00:11:05
Speaker
It's not a blame and shame meeting.
00:11:07
Speaker
It's a learning meeting.
00:11:09
Speaker
We recognize that tragically, sometimes things do go wrong.
00:11:13
Speaker
And our aspiration is simply to learn everything we can from it to prevent it from happening again.
00:11:20
Speaker
Not to accuse, blame, criticize, but to truly learn, as the scientists you all are, to really learn what happened and how to do better.
00:11:32
Speaker
And to realize that sometimes we have to take care of the clinicians as well.
00:11:36
Speaker
I mean, if you go through something like that, like, you know, a fatal event, especially, it can be quite traumatic.
00:11:47
Speaker
And so in our aspiration to prevent them again from happening, we have to take care of everybody involved as best we can.
00:12:00
Speaker
That is a great segue to talking about the types of failure.
00:12:04
Speaker
And then we can talk about that some of these types are, like you said, blameworthy and some are praiseworthy.
00:12:10
Speaker
But could you give us an overview of the three types of failure?
00:12:15
Speaker
So the three types of failure I identify are basic, complex and intelligent.
00:12:21
Speaker
And just to get out ahead of this, only intelligent is what I'll call the right kind of wrong, the praiseworthy kind.
Three Types of Failure
00:12:30
Speaker
And basic failures are single cause failures.
00:12:33
Speaker
They're caused by human error.
00:12:35
Speaker
They're in familiar territory.
00:12:37
Speaker
We had knowledge about how to achieve a result, whether it's a chocolate cake or a medical procedure.
00:12:46
Speaker
The knowledge exists and error intervened and led to a failure.
00:12:53
Speaker
Those are obviously not celebratory moments.
00:12:56
Speaker
Those are learning moments to be sure.
00:13:00
Speaker
We can prevent all of those.
00:13:02
Speaker
And not because we can prevent all human error; I'm not saying that.
00:13:08
Speaker
But in medicine, usually there's at least a space between an error and an outcome.
00:13:17
Speaker
So when we're at our best, we can catch and correct errors before anyone is harmed.
00:13:23
Speaker
We can choose not to deliver the erroneous dose that was just written.
00:13:28
Speaker
We can quickly give an antidote if it gets that far.
00:13:36
Speaker
So that's basic failure.
00:13:37
Speaker
Single cause, bad mistake leads to a failure.
00:13:41
Speaker
Complex failures are, of course, the real beast in medicine and healthcare, which are the Swiss cheese kind of failure.
00:13:48
Speaker
And I define them as multi-causal:
00:13:52
Speaker
A handful of factors line up in just the wrong way to create the failure.
00:13:57
Speaker
And any one of those factors on their own would not lead to the bad outcome, right?
00:14:03
Speaker
It's the perfect storm.
00:14:03
Speaker
They had to come together in just the wrong way to make the failure, to allow the failure to slip through.
00:14:10
Speaker
Those, too, are preventable.
00:14:12
Speaker
And quite often, and to me quite tragically, many complex failures could have been prevented had people felt psychologically safe enough to speak up when they were
00:14:22
Speaker
you know, not convinced that something was wrong, but just curious about whether something was wrong.
00:14:28
Speaker
When they glimpse what I would call an ambiguous threat or an ambiguous moment and they just are afraid to ask for help or to question a dose or what have you.
00:14:42
Speaker
So complex failures are multi-causal.
00:14:44
Speaker
That makes them pernicious, but it also gives us many, many opportunities to catch and correct.
00:14:50
Speaker
And then finally, intelligent failures are the still undesired results of novel forays into new territory.
00:14:58
Speaker
So technically, they're experiments, whether it's an experiment in your personal life or an experiment in a laboratory.
00:15:04
Speaker
Intelligent failure is the outcome that happens when your hypothesis was wrong in new territory.
00:15:16
Speaker
And I think that this framework is extremely important for our clinicians to understand because it gives them a better way of talking about failures among the team.
00:15:28
Speaker
And what I wanted just to comment from the clinical perspective, Amy, is that the names basic, complex, and intelligent
00:15:37
Speaker
are really based on the processes that are involved in terms of thinking and behaviors and not necessarily on the outcome or the stakes.
00:15:48
Speaker
So an example of a basic failure I can think of would be that we amputated the wrong limb.
00:15:53
Speaker
That is devastating for a patient, but it's a basic failure that should be prevented.
00:15:59
Speaker
I'll just interrupt and say yes.
00:16:02
Speaker
Basic doesn't mean small.
00:16:04
Speaker
Let's be clear about that, right?
00:16:05
Speaker
A basic failure can be very large and consequential indeed, but it's just that single cause.
00:16:11
Speaker
And complex failure, on the other hand, is what I live, I think, every day in the ICU, right?
00:16:17
Speaker
Because it's a complex environment where every action has so many parts to it.
00:16:23
Speaker
And a lot of complex failures that we experience in the ICU might be, for example, that they were about to give the wrong type of blood, but they did a check and recognized it.
00:16:33
Speaker
So they didn't give the wrong type of blood.
00:16:35
Speaker
So no harm to the patient, but still was a failure of the process, right?
00:16:42
Speaker
And the beauty of it is that so many of our process failures
00:16:46
Speaker
can be caught and corrected.
00:16:48
Speaker
And I think you're doing better at that nowadays.
00:16:51
Speaker
I think that like everything, right, talking about the problem is the first step.
00:16:56
Speaker
I think there are pockets of excellence in healthcare.
00:17:00
Speaker
But I think that if you were to take the average of the thousands of hospitals that we have in the United States, there's still a lot of room for improvement.
Blame Culture in Healthcare
00:17:09
Speaker
I mean, better doesn't mean best.
00:17:12
Speaker
And intelligent failure, I just want to...
00:17:16
Speaker
make the point that our audience in general, I think, is a very clinically oriented audience, Amy.
00:17:22
Speaker
I think it's very easy in the context of research to understand intelligent failure or in the context of development of new products.
00:17:31
Speaker
However, I would say that a great example of intelligent failure is COVID.
00:17:36
Speaker
Some teams organized the delivery of care with small pilots and were learning very fast and adapting very fast because they were psychologically safe teams where everybody could say, why don't we do this this way?
00:17:49
Speaker
Why don't we do this the other way?
00:17:51
Speaker
And I think that the delivery of care is a great example of how we can be smarter about intelligent failures.
00:18:00
Speaker
That's actually a beautiful example in my view because it was new territory for everybody.
00:18:06
Speaker
And these don't have to be formal experiments like in a laboratory, but there was a lot of experimenting and, oh, we discovered we could make an ICU in a tent.
00:18:17
Speaker
I mean, there was a lot of iteration that went on and the more thoughtful you were in that iteration, the better off you were in making progress in that new territory.
00:18:29
Speaker
The other thing I wanted to ask you, Amy, is I've seen you and I've seen in some of your previous papers, and I'm sure also in the book you talk about it, on a spectrum of this is blameworthy to this is praiseworthy, right?
00:18:44
Speaker
There is a context that goes, and I think it's important to recognize, even when somebody does something wrong, commits an error, it's not always their fault.
00:18:54
Speaker
And especially when I talk to our clinical leaders, I think it's important to talk about that.
00:18:59
Speaker
Could you talk about that, but overlap it with the process knowledge spectrum?
00:19:05
Speaker
So the process knowledge spectrum basically points to the reality that in our processes, whether in healthcare or any other industry, there's a spectrum from
00:19:17
Speaker
highly routine work that we know we're good at.
00:19:21
Speaker
We know exactly how we want things to unfold.
00:19:24
Speaker
This is, of course, an automotive assembly line or, you know, I think about an area of medicine that is more routine than other aspects.
00:19:34
Speaker
And then we move to the right in the spectrum toward variable work, where we have a lot of knowledge about how to get the result we want, and there's a high need for customization or variability of arrival times.
00:19:48
Speaker
All sorts of complexity is introduced into our process by that.
00:19:53
Speaker
And if we keep going to the right, we get to what I call novel contexts, where the knowledge is very undeveloped.
00:20:00
Speaker
And so we have no choice but to
00:20:02
Speaker
experiment and see what works to develop the knowledge that may make some activity routine in the future, but it's not routine today.
00:20:12
Speaker
So basically, that's a spectrum from very, very low uncertainty to very, very high uncertainty.
00:20:19
Speaker
Now, I would argue in all three of those domains, you can have both praiseworthy and blameworthy actions.
00:20:30
Speaker
And I suppose I'm a little bit of a purist on blameworthy.
00:20:34
Speaker
I like to really only call things blameworthy where the person was deliberately intent on causing harm or at least intent on not really trying in a situation where their effort was needed.
00:20:53
Speaker
And then we move to the right.
00:20:56
Speaker
And let's say the root cause of a failure is that someone didn't pay attention.
00:21:02
Speaker
There's an airline accident in the book that happened 40 years ago, Air Florida Flight 90, where just a single human error, not putting the anti-ice on, led to this catastrophic fatal accident.
00:21:17
Speaker
So, you know, that not paying attention
00:21:21
Speaker
isn't always blameworthy, but it can be blameworthy if, I mean, it may even be the supervisor who puts someone in a position where they just wouldn't be able to be alert for that long a time that led to the outcome.
00:21:37
Speaker
Whereas praiseworthy is any attempt to try, you know, any attempt to try something new and being fully cognizant of what's at stake so that you don't unnecessarily create risk
00:21:52
Speaker
especially safety risks, can be thought of as praiseworthy.
00:21:57
Speaker
And I think a perfect example for healthcare, and just thinking of my own litany of errors: when I was a critical care fellow, after a 36-hour shift, I punctured myself with a needle, right?
00:22:15
Speaker
I mean, and that probably was...
00:22:20
Speaker
That's, I mean, you didn't follow protocol, but it's a needle stick.
00:22:24
Speaker
And yet the context, and we'll talk about that later, is very important, right?
00:22:30
Speaker
I mean, anybody who's fatigued, anybody who's overworked is more likely to make those errors, right?
00:22:40
Speaker
And I remember the shame of trying to get help for that.
00:22:44
Speaker
And I'm sure that this is a story that a lot of our listeners have experienced or know in different variations.
00:22:51
Speaker
And that's really why I think it's so important to talk about all this for health care.
00:22:57
Speaker
I mean, the human being was not designed to work for 36 hours straight and then be just as good at the end of that time as you were at the beginning of that time.
00:23:08
Speaker
So that is a context factor that matters greatly.
00:23:12
Speaker
And this is why I say we have to learn from failures thoughtfully, because if you don't take into consideration in your analysis,
00:23:20
Speaker
the contributing factors such as that, you will get faulty answers to your analysis.
Preventing Basic Errors
00:23:27
Speaker
So can we do a little bit of a deeper dive into the types of failure?
00:23:31
Speaker
So we talked about basic errors, which I think in the context of clinical medicine are well recognized today, right?
00:23:40
Speaker
Transfusing the wrong type of blood.
00:23:43
Speaker
and dosing somebody with 10x of a narcotic, right, which still happens, amputating the wrong limb.
00:23:51
Speaker
These are all examples of basic failures which could have deadly consequences.
00:23:58
Speaker
And I think that, like everything in life, prevention, right, is worth, what do they say, a pound of cure.
00:24:08
Speaker
Could you talk about some of the strategies for preventing basic errors?
00:24:14
Speaker
You know, I often feel embarrassed to talk about this because it's so basic.
00:24:20
Speaker
And I'm going to be saying things that your listeners already know.
00:24:25
Speaker
But the fundamental preventions for basic failures start with training.
00:24:30
Speaker
And so we don't want to shortchange training, whether we're in medicine or in manufacturing.
00:24:38
Speaker
We want to make sure people have what they need
00:24:41
Speaker
in being trained in procedures so that they have a good shot at doing them well.
00:24:47
Speaker
Failure proofing, right?
00:24:48
Speaker
You rarely leave instruments inside a patient anymore because you often have the instruments in nests that make it crystal clear.
00:24:57
Speaker
If something's missing, it's got to be somewhere, and you'd better find it before you sew up, as maybe a simple example of failure proofing.
00:25:05
Speaker
But you can think of all sorts of failure-proofing techniques and
00:25:09
Speaker
mechanisms that you build into your practice or build into your life.
00:25:16
Speaker
Sharing: psychological safety to help people speak up quickly, and blameless reporting of things that go wrong so that we can collect those data and be better informed about where the risks are, is another strategy for preventing as many basic failures as possible.
00:25:37
Speaker
And I guess those are the top ones that come to mind.
00:25:45
Speaker
But you can see how these aren't what I'll call rocket science.
00:25:49
Speaker
These are very basic, very fundamental
00:25:52
Speaker
examples of good practice.
00:25:58
Speaker
It's important to talk about these and think about them because a lot of people know these things, but it doesn't mean that it happens every time for every patient.
00:26:09
Speaker
And even in academic centers, there are still basic failures going on.
00:26:14
Speaker
But one of the areas that I wanted to dig a little bit deeper is checklists or codifying prevention because...
00:26:22
Speaker
It seems that since Atul Gawande's wonderful book and Peter Pronovost, I mean, really, I think kind of a revolutionary study with central lines, something that we do every day in the ICU, checklists are everywhere in health care.
00:26:40
Speaker
And what fascinated me, Amy, reading the book, was the transcript of, I think, the Air Florida airplane that you mentioned, right?
00:26:50
Speaker
Can you just talk a little bit about that?
00:26:52
Speaker
So, you know, first of all, I erred, in a sense: checklists should have been at the very top of my list of basic failure prevention.
00:27:02
Speaker
And yet, I do this deliberately in the book.
00:27:06
Speaker
I include this Air Florida flight in 1982 that crashed, I'll say, despite the checklist, because the checklist was used in a highly mindless,
00:27:20
Speaker
routine, almost autopilot, forgive the parallel, manner.
00:27:28
Speaker
So the first officer said anti-ice and then the captain said off.
00:27:35
Speaker
And the correct answer, it was a January wintry, icy day.
00:27:40
Speaker
The correct answer is, of course, on.
00:27:43
Speaker
And that led to this horrific crash.
00:27:45
Speaker
So the takeaway, as you all know well, is that checklists are a very powerful, you know, profound tool even.
00:27:55
Speaker
And they must be used while awake, right?
00:27:58
Speaker
They must be used deliberately, thoughtfully.
00:28:01
Speaker
If we don't take them seriously, if we don't see them as the useful tool that they are, and we just sort of go through the motions, as it were, then they won't do their job.
00:28:15
Speaker
And in terms of complex failures, we mentioned that these are also very, very prevalent in healthcare, especially in a setting like the one our listeners work in, which is the ICU.
Complex Failures in High-Stakes Settings
00:28:26
Speaker
It's not only high complexity, but high stakes on a daily basis.
00:28:30
Speaker
You did mention the Swiss cheese theory a little bit earlier.
00:28:34
Speaker
Could you just explain that a little bit more for some of our listeners who may not be as familiar?
00:28:40
Speaker
So Jim Reason, who is an
00:28:43
Speaker
error theorist
00:28:47
Speaker
in the UK, is the one who came up with this metaphor.
00:28:52
Speaker
And I like to say, if you think about the holes in Swiss cheese, any hole in your cheese is not something that contributes to your nutrition.
00:29:01
Speaker
So it's a kind of error, those little air bubbles.
00:29:06
Speaker
And he used this metaphor to say, it's fine for your cheese to have holes, no harm done.
00:29:11
Speaker
But when the holes line up,
00:29:14
Speaker
they create a tunnel that lets the failure flow through.
00:29:19
Speaker
So the idea is many of our processes are slightly flawed in a variety of ways.
00:29:25
Speaker
There's deviations here, there's deviations there.
00:29:28
Speaker
But every now and then the deviations all line up in such a way that allows the failure to just slip on right through.
00:29:36
Speaker
And why this is a useful metaphor is that it points out that
00:29:43
Speaker
these kinds of failures aren't the norm, of course, but they are preventable if we're really paying attention, because all you have to do is catch one bubble.
00:29:56
Speaker
The tunnel won't work to allow the failure through if there's even one of the factors that gets noticed and altered before the failure happens.
00:30:07
Speaker
So I think it's empowering.
00:30:08
Speaker
I think it's meant to say, speak up.
00:30:13
Speaker
Even if you have very low confidence that you've spotted a concern, speak up anyway.
00:30:20
Speaker
Your voice is welcome here.
00:30:23
Speaker
And I think that in healthcare, a lot of our listeners may have been part of a root cause analysis of some sentinel event or some complex failure.
00:30:32
Speaker
And as you go backwards, so for example, a child received an overdose of a narcotic and went into respiratory distress.
00:30:41
Speaker
If you go backwards, all the way to the point where the medication was prescribed, you might find...
00:30:48
Speaker
multiple opportunities for intervention that would have avoided that happening, right? I think you alluded to that; I think the term that you use in your research is the recovery window. And I think that's important because, in retrospect, it's easy to figure out, right? But why doesn't it happen in real time?
00:31:08
Speaker
And that's where, like you said, speaking up, psychological safety and paying attention is really important.
00:31:14
Speaker
I think a lot of times also people really, whether it's imposter syndrome or common knowledge effect, they assume that this doesn't feel right, but it's okay.
00:31:24
Speaker
Somebody else must know better than me.
00:31:27
Speaker
And that's a big problem.
00:31:29
Speaker
They assume, oh, I probably just didn't read up on the latest literature or I'm new here or, or, or.
00:31:37
Speaker
And they make the assumption, which is fine in your social life.
00:31:42
Speaker
I don't think I'll speak up about that right here, right now.
00:31:44
Speaker
But in health care, and I would say in work environments more generally, we have to flip that script and say, no, no, no.
00:31:52
Speaker
Even when in very real doubt, we need to hear from you.
00:31:59
Speaker
And finally, we have intelligent failures.
Value of Intelligent Failures
00:32:01
Speaker
And one of the things I want to make sure our listeners understand is that we can always learn from failure.
00:32:09
Speaker
That doesn't make it the right kind of wrong, right?
00:32:12
Speaker
Maybe that's failing well; we can talk about that later. But intelligent failures specifically have a very precise definition and setting.
00:32:22
Speaker
Could you talk a little bit about that in the context of healthcare?
00:32:29
Speaker
I think there are really four questions to ask.
00:32:33
Speaker
Is this new territory or is there existing available knowledge that I can look up and use?
00:32:40
Speaker
And do you glimpse an opportunity?
00:32:45
Speaker
So it's opportunity driven.
00:32:46
Speaker
There's a goal, taking care of patients, a new treatment, or even just getting better at a particular procedure, that you've got a goal in mind that makes it worth the
00:32:59
Speaker
modest experimentation that's about to happen.
00:33:02
Speaker
Have you done your homework?
00:33:04
Speaker
Have you read the literature or talked to experts or found out what you could about what is known before you go off into this unknown territory to try something new?
00:33:14
Speaker
And then finally, and so importantly in health care especially, the experiment.
00:33:20
Speaker
should be as small as possible.
00:33:22
Speaker
You want to mitigate the risks that are inherent in new territory by keeping your experiments just large enough to learn, but not so large that you're actually wasting resources or putting undesirable risks in there.
00:33:44
Speaker
And I think the pilot, and the size of your pilot, for example, is very valuable.
00:33:50
Speaker
I often think of, I guess, in the tech world or in the startup world, we're talking about the minimum viable product, right?
00:33:59
Speaker
How much do you need to invest in effort, time, and money to get an idea if it's going to work or not, right?
00:34:05
Speaker
And this mistake is very common in healthcare delivery.
00:34:08
Speaker
People say, oh, we got to change this, and they change it across a large scale, and then it doesn't work.
00:34:13
Speaker
Right. That would be, I think, a great example of clinicians using intelligent failure to improve the way they run their office or the way they would run their ICUs. And that is outside the context of research, which a lot of our physicians and APPs who listen to us don't do. Yes, that's right. And I mean, I think this size issue is quite subtle and quite important.
00:34:37
Speaker
I love the term minimum viable product, or, you know, minimum viable process.
00:34:42
Speaker
You must use your brain to design the experiment such that it's just big enough to be valid to learn from.
00:34:56
Speaker
But no bigger, so as not to create waste.
00:35:00
Speaker
So if it's a new procedure in your office, test it out with a particular subset of patients or test it out on Mondays or whatever the appropriate move is.
00:35:12
Speaker
But don't roll things out before they're ready for prime time.
00:35:20
Speaker
So the way I process all this, Amy, is that failing well is understanding the different types of failure.
00:35:30
Speaker
It's putting every effort we have to prevent basic and complex failures.
00:35:36
Speaker
And when those occur, it's trying to truly learn from them so we can improve our processes.
00:35:41
Speaker
And in the right context, it's applying intelligent failure.
00:35:45
Speaker
Now, your area of expertise is the science behind that.
00:35:49
Speaker
Can we talk a little bit about that?
00:35:52
Speaker
Well, first of all, I just want to give you an A plus because that's exactly right.
00:35:56
Speaker
And I don't think everybody appreciates that.
00:35:59
Speaker
Maybe they're out there reading the sort of short digital articles on this topic, and they decide that failing well is all about fail fast, break things.
00:36:09
Speaker
But to me, what you just said is exactly right.
00:36:11
Speaker
Failing well is about preventing as many
00:36:16
Speaker
bad failures, basic and complex, as humanly possible.
00:36:19
Speaker
And it's about increasing our willingness to engage in smart experimentation to make progress in new territory and welcoming the still undesired, but welcoming the failures that come our way.
00:36:33
Speaker
So that's exactly right.
00:36:35
Speaker
So being able to distinguish, but also being really good preventers and then really good experimenters.
00:36:43
Speaker
So one of the words that I hear a lot these days when we talk about failure in healthcare is high reliability organizations or HROs.
00:36:52
Speaker
Could you talk a little bit about those?
00:36:54
Speaker
And is that just a fancy name for a psychologically safe environment?
00:37:01
Speaker
And I think it's a fancy name for a psychologically safe environment, but that's not all it is.
00:37:06
Speaker
So I think HROs are,
00:37:09
Speaker
High reliability organizations are profoundly psychologically safe environments because one of the principles is that people at any level of a hierarchy will speak up immediately and forcefully if they see something wrong.
00:37:24
Speaker
So these are organizations that are really, really good at catching and correcting deviations.
00:37:29
Speaker
They understand that deviations will happen.
00:37:32
Speaker
And so they're not, I don't think they would ever use the term failure is not an option because what they understand is failure is
00:37:39
Speaker
always a possibility.
00:37:41
Speaker
More than the rest of us, people in HROs truly appreciate that failure is always a possibility.
00:37:48
Speaker
So they don't want them to happen if they're preventable.
00:37:51
Speaker
So they're, you know, acutely sensitive to subtle changes and committed to speaking up about them.
00:38:01
Speaker
So what it has in addition to psychological safety, which is that permission for voice, is it has just a high level of sensitivity, that people have a mindset that says, yep, things could go wrong.
00:38:12
Speaker
And that they have a high level of just, what's the word I'm looking for?
00:38:19
Speaker
Karl Weick, who has written about this, calls it heedful interrelating:
00:38:27
Speaker
aware of what each other is doing.
00:38:29
Speaker
It's not just, okay, let me do my job, you do your job and all will be well.
00:38:32
Speaker
It's sort of, there's a lot of very thoughtful interactions.
00:38:38
Speaker
And so I think HROs really appreciate that failures in their context, whether it's nuclear power or an aircraft carrier, are so consequential that they really, really, really are committed to them not happening.
00:38:53
Speaker
And I think that interrelated awareness in healthcare is important because, A, we're a team, but more significant, I think, we live in silos.
00:39:04
Speaker
And I think historically in healthcare, you know, long, long ago when you had the horse and buggy doctors and, you know, you essentially had clinicians with their shingles and they couldn't do much for you, but you know, they were good people doing their best with the knowledge that was available at the time.
00:39:23
Speaker
As medicine has gotten more and more complex, with more and more subspecialties and interdisciplinarity, our mindsets haven't always caught up, right?
00:39:37
Speaker
So there's still this idea, I'm a professional, I'm an expert, I should have what it takes and know what to do to care for my patients.
00:39:45
Speaker
And that's fine and good.
00:39:47
Speaker
And also one needs to be aware that
00:39:51
Speaker
in order to care for my patients, especially the hospitalized ones, I am utterly dependent on so many others, you know, at the bedside and in other parts of the organization, where back-office things are happening, where drugs are being put into cartridges, and all of the rest.
00:40:10
Speaker
So it's just a profoundly interdependent system, and
00:40:17
Speaker
our mental models haven't always kept up with that interdependence.
00:40:23
Speaker
The other thing I wanted to ask you about, Amy, is the power of shame versus embracing vulnerability.
00:40:33
Speaker
Yeah, so to me, they're very different things, but I can see how people would
00:40:38
Speaker
confuse them or sort of think of them as highly related.
00:40:42
Speaker
So embracing vulnerability to me is simply embracing a fact, right?
00:40:48
Speaker
Each and every one of us is vulnerable.
00:40:51
Speaker
We're vulnerable because we can't see the future.
00:40:53
Speaker
We're vulnerable because we're fallible human beings: we will make mistakes, and our systems will have little breakdowns.
00:41:02
Speaker
So we're vulnerable by fact.
00:41:06
Speaker
The only real question mark, our only decision, is whether we're willing to acknowledge that and use that fact to shape our thinking and our behavior.
00:41:18
Speaker
So that's just, to me, it's a strength.
00:41:20
Speaker
Like vulnerability means I am strong enough to know that I'm vulnerable.
00:41:27
Speaker
Whereas shame is to believe, really quite erroneously,
00:41:32
Speaker
that vulnerability is somehow wrong and indicates that I'm not a good person or I'm not a worthy person.
00:41:42
Speaker
And I think as a leader, it's important to be vulnerable at the right times.
00:41:47
Speaker
You also want to inspire confidence in your team.
00:41:50
Speaker
But when somebody makes a mistake, saying, we failed, we made a mistake, let's move forward and try to learn from this, right?
00:41:58
Speaker
It invites others to say the same thing.
00:42:02
Speaker
I mean, first of all, if you're a leader in healthcare, you must go first.
00:42:06
Speaker
You must demonstrate the example of honesty, of speaking up, of acknowledging when you've come up short.
00:42:12
Speaker
It's just you must or else no one else will.
00:42:16
Speaker
But also, I think you can both acknowledge your individual vulnerability and acknowledge the vulnerability of, say, our ICU, because those are just facts,
00:42:30
Speaker
while also expressing extreme confidence in our ability to deliver great care.
00:42:37
Speaker
When we are at our best, when we are working together, when we are honest, when we are asking for help, when we're in over our heads, we can really do almost anything.
00:42:47
Speaker
So it's not, you know, to acknowledge your individual vulnerability is not to give up on the strength of your teams or the strength of your organizations.
00:43:00
Speaker
I read that becoming is better than being, which I think is another way of saying learning over knowing.
00:43:07
Speaker
And can you talk a little bit about, in this pursuit of learning from all failures, how we can reframe failure, and how we can play to win versus not to lose?
00:43:20
Speaker
Well, I think we reframe failure as a necessary or inevitable part of life, especially a full life, especially a life of achievement, even, that I have not met anyone who's achieved a lot, who hasn't experienced lots of failures along the way.
00:43:37
Speaker
And so the reframe has to be from failure is bad and shameful, and I don't want to have it, or I don't want to admit it if I do have it, to:
00:43:47
Speaker
Failure is on the path to success, and working together, at our very best, we can prevent the preventable ones, the basic and complex ones, and we can embrace
00:44:01
Speaker
the intelligent ones in new territory.
00:44:04
Speaker
So that is the reframe.
00:44:06
Speaker
And I believe that as I describe it that way, you begin to realize this is kind of a team sport.
00:44:12
Speaker
And I don't think you can do any teaming whatsoever if you're not willing to choose learning over knowing.
00:44:19
Speaker
And that has to be a kind of explicit override because our default state is to believe we know, you know, I believe I'm right, you're wrong, you know, when we disagree.
00:44:29
Speaker
And I have to kind of
00:44:30
Speaker
train myself to say, Oh, wait a minute, you know, maybe I'm missing something.
00:44:36
Speaker
And, and that's not a shameful thing.
00:44:38
Speaker
That's like just a fact of life.
00:44:40
Speaker
And now I'm curious.
00:44:41
Speaker
Now I want to know what you see that I miss, because it's going to help me.
00:44:47
Speaker
And what about in the book, and some of your research, you talk about stop challenge choose?
00:44:53
Speaker
How do we incorporate that?
00:44:55
Speaker
Well, stop, challenge, choose is a little cognitive habit that I suggest our listeners practice.
Decision-Making in Uncertainty
00:45:02
Speaker
And that is the sort of the self-discipline to pause, right?
00:45:07
Speaker
If there's any uncertainty at all, which is often the case, just pause for a moment, check what you're thinking, and then challenge it:
00:45:16
Speaker
Like, I wonder what I'm missing.
00:45:18
Speaker
And then choose the path of learning.
00:45:20
Speaker
And that can be in a disagreement with your partner.
00:45:24
Speaker
That can be in a, you know, a clinical intervention you're about to make.
00:45:30
Speaker
Just pause, take a close look at what you're thinking and why.
00:45:35
Speaker
And then, you know, to see whether there could be any gaps that are important here.
00:45:41
Speaker
And then choose a learning path.
00:45:44
Speaker
path forward. The last thing I wanted to ask you as we start closing, Amy, is, obviously, great people in bad systems will underperform, and we don't think enough about systems in healthcare. Any comments on how we incorporate that into everything we discussed?
00:46:07
Speaker
Well, you know, in the book, I use the example of Children's Minnesota just because I have studied it.
00:46:15
Speaker
I studied it a while ago closely as part of the patient safety research.
00:46:20
Speaker
I use it as an example of what I think any organization, any healthcare organization, can do without a ton of cost or
00:46:33
Speaker
new equipment or anything like that.
00:46:35
Speaker
It's just a kind of a mindful recognition of the way the different elements of a system, you know, your staffing systems, your training systems, your technology, your mindsets, your culture, your reporting systems, and all the rest, how they
00:46:56
Speaker
interact in ways that create more than the sum of the parts.
00:46:59
Speaker
I think we have a tendency, certainly in management, we have a tendency to look at elements and to analyze the heck out of elements.
00:47:06
Speaker
And we're less likely to look at how different elements of the system relate to each other and create unintended consequences or create mutually reinforcing strengths that we want to make sure we design on purpose.
00:47:23
Speaker
We covered, I think, a lot of ground.
00:47:26
Speaker
And as we move from understanding and theory to action, I think Herbert Spencer said long ago that the whole point of education is not knowledge, but action, right?
00:47:39
Speaker
Any tips and pearls for our ICUs and moving towards failing better?
00:47:44
Speaker
Well, I think the most important action in the ICU is the action of speaking up.
00:47:49
Speaker
It is the action of asking for help.
00:47:51
Speaker
It's the action of
00:47:53
Speaker
asking good questions.
00:47:55
Speaker
If you're a clinician leader in an ICU environment, your best action is to pause and say, ah, what am I missing?
00:48:03
Speaker
Ask people, what did you see last night?
00:48:05
Speaker
What thoughts do you have?
00:48:07
Speaker
Maybe it'll only be one out of 10 times that you hear something that you didn't expect, but that one will be well worth it.
00:48:15
Speaker
So the crucial action, I would argue, is the action of inquiry.
00:48:20
Speaker
invite people to participate and listen to understand, right?
00:48:26
Speaker
So Amy, you've been on the podcast before.
00:48:28
Speaker
We'd like to close with a couple of questions unrelated, I guess, to the topic.
00:48:32
Speaker
Although in this case, they will be related to the topic.
00:48:35
Speaker
The first question is, any book or books you would recommend for our listeners, after they're done with Right Kind of Wrong, to expand their journey in learning about failing well?
00:48:46
Speaker
Yes, I'm going to recommend Adapt by Tim Harford. He's a beautiful writer, and it takes on some of these issues in more of the management space.
00:48:58
Speaker
And then in the sort of psychological and sociological space, there's a beautiful book called Being Wrong by Kathryn Schulz, which goes more deeply into our sense of shame and discomfort with being wrong, and how, again, that's something we have to learn to welcome.
00:49:16
Speaker
And we'll definitely link those in the show notes.
00:49:18
Speaker
And also we'll link our previous conversation on psychological safety, which I think, as you said, is the underpinning for all of what we discussed today.
00:49:28
Speaker
And the last question is, as we close, could you share with us your favorite failure?
00:49:34
Speaker
Well, my favorite failure, which is in the book, concerns Ray Dalio, who was a tremendously successful person in finance.
00:49:42
Speaker
His investment business, Bridgewater Associates, has been one of the most successful companies in history, and he's the entrepreneur who started it.
00:49:51
Speaker
But seven years into
00:49:54
Speaker
that company's existence, Dalio had made an absolutely wrong bet in new territory about the economy and lost everything he had.
00:50:05
Speaker
Now, I think our listeners will notice that is not an intelligent failure.
00:50:09
Speaker
Everything about it was fine, except the size, right?
00:50:12
Speaker
It was new territory.
00:50:13
Speaker
He had a hypothesis.
00:50:16
Speaker
And yet that experiment was much too big for uncertain territory.
00:50:22
Speaker
So the reason it's my favorite failure is not just because it misses being intelligent, but even more because he came away from it a changed man.
00:50:31
Speaker
He says, in retrospect, that failure was the best thing that ever happened to me.
00:50:37
Speaker
It taught me to temper my arrogance, my confidence, and replace it with a kind of humility that says,
00:50:45
Speaker
Instead of I'm right, I wonder why I think I'm right.
00:50:49
Speaker
That is the shift that I think great clinicians can and should make.
00:50:53
Speaker
It doesn't involve being insecure.
00:50:56
Speaker
It just involves being strong enough to say, I wonder why I think I'm right.
00:51:01
Speaker
And then you get curious.
00:51:04
Speaker
And so I think it was a profound insight for him and I think for others as well.
00:51:09
Speaker
And I think it speaks to the idea that it's not about having answers, but having great questions.
00:51:19
Speaker
Amy, always a pleasure.
00:51:21
Speaker
Thank you for putting out such wonderful work, both at the research level and in framing it, with your books, in a way that everybody can understand.
00:51:32
Speaker
Like we were speaking beforehand, I think this is critical for all clinicians, especially in the ICU.
00:51:39
Speaker
And I look forward to more in the future.
00:51:43
Speaker
And again, thank you so much for sharing your time and expertise with us.
00:51:47
Speaker
Well, thank you for the work that you do.
00:51:49
Speaker
It really matters.
00:51:51
Speaker
Thank you for listening to Critical Matters, a sound podcast.
00:51:55
Speaker
Make sure to subscribe to Critical Matters on Apple or Google Podcasts and share with your network.
00:52:01
Speaker
Sound's transforming the way critical care is provided in hospitals across the country.
00:52:05
Speaker
To learn more, visit www.soundphysicians.com.