Introduction to Critical Matters
00:00:06
Speaker
Welcome to Critical Matters, a Sound podcast covering a broad range of topics related to the practice of intensive care medicine.
00:00:14
Speaker
Sound provides comprehensive critical care programs to hospitals across the country.
00:00:19
Speaker
To learn more about our programs and career opportunities, visit www.soundphysicians.com.
00:00:26
Speaker
And now your host, Dr. Sergio Zanotti.
Judgment in COVID-19 Pandemic
00:00:33
Speaker
The plan for today is to
00:00:35
Speaker
cover three areas.
00:00:36
Speaker
I want to talk a little bit about judgment during COVID-19, talk about uncertain situations and uncertain clinical but also operational circumstances, about how we have behaved during this pandemic and how some of those behaviors have impacted our judgment, our way of thinking, our relationships, and our patient care, and try to maybe look through that lens
00:01:03
Speaker
to understand a little bit about how we can become better decision makers at the bedside, but also when we're trying to organize and lead programs.
Heuristics and Biases in Clinical Settings
00:01:13
Speaker
So we're going to start with the why, which is kind of the pandemic as an example of what happens under uncertain conditions.
00:01:20
Speaker
We're going to dive a little bit into understanding the science behind the way we think, the cognitive process, that's the how.
00:01:28
Speaker
And what's very important, and I think this came up in other conversations,
00:01:33
Speaker
is to really understand that when we talk about heuristics, when we talk about biases, they are present 100% and universally in every thinking human being.
00:01:43
Speaker
So clearly, these are processes that are inherent to the way we think and understanding them, I think, can help us make better decisions.
00:01:51
Speaker
And finally, I'll share some thoughts on how we can become better decision makers
00:01:57
Speaker
And again, applying that both at the bedside, but also at our leadership roles as we try to introduce change for our programs.
Decision vs. Choice in Medicine
00:02:09
Speaker
And the first thing I wanted to do before we dive into COVID is just make a distinction between what's a choice and what's a decision.
00:02:18
Speaker
From a semantic standpoint and from an academic standpoint, they're not the same, but I think it also informs the conversation we have
00:02:25
Speaker
today in terms of what we're really talking about.
00:02:28
Speaker
So a choice is when you have multiple options and what you need to do is implement a value judgment.
00:02:34
Speaker
So a lot of the circumstances might be known and you're comparing, let's say apples to apples, and at the end of the day, you have to make a value judgment to make that choice.
00:02:45
Speaker
What we're talking about today is situations that require not only a value judgment, but also a probabilistic judgment
00:02:53
Speaker
which means that there is an element of uncertainty.
00:02:56
Speaker
And that is really a decision.
00:02:59
Speaker
And decisions apply to the bedside a lot of times.
00:03:02
Speaker
Decisions apply to when we try to create change and create value at our programs.
00:03:08
Speaker
And there's been no other time where uncertainty has been more relevant and probably more present in our decision-making than the last 12 months with the COVID pandemic, where a lot of that uncertainty that we encounter on a daily basis in life
00:03:22
Speaker
was really magnified and amplified to a scale that we have not seen before.
Infodemic and Misinformation
00:03:28
Speaker
So now we are at almost 140 million cases worldwide and almost 3 million deaths worldwide, and it's something that a lot of our teams are still dealing with.
00:03:38
Speaker
But I felt that it would be a good prism through which to see why we think the way we think, why we make certain decisions, why we behave sometimes at the bedside in the ways that we've seen.
00:03:51
Speaker
And one of the things
00:03:52
Speaker
that has been very unique about this particular pandemic.
00:03:56
Speaker
Pandemics are obviously not unique to our time; they have existed throughout the history of mankind, and bigger pandemics have befallen us.
00:04:05
Speaker
But something unique about this pandemic has been the infodemic that has been associated with it.
00:04:10
Speaker
And that has really created very polarized positions
00:04:15
Speaker
both in society, but also in the scientific community about many, many items related to this pandemic.
00:04:23
Speaker
Information spreads very, very rapidly.
00:04:25
Speaker
Today in 2021 with social media, with the connectivity people have, information spreads like a virus.
00:04:33
Speaker
And the problem with that is that misinformation and disinformation, especially when it's exciting, can spread very, very fast, even faster than accurate information.
00:04:42
Speaker
And that can be deadly.
00:04:43
Speaker
And there's no question that every day there's some new exciting piece of news spreading so, so quickly, from the beginning of the pandemic until yesterday.
00:04:53
Speaker
And I'm sure there is today as well; I've been traveling today and have been busy.
00:04:56
Speaker
I have not looked at the news yet, but I'm sure that if I look, there's something new about a vaccine or something new about a potential treatment or a new surge or something related to the pandemic.
00:05:06
Speaker
So just in terms of the spread of information, it can really spread almost at the same rate as the virus itself.
00:05:10
Speaker
And we've seen that.
00:05:12
Speaker
definitely with this COVID-19 pandemic.
00:05:15
Speaker
There's information, which is information that is good, that is faithful, that is helpful.
00:05:20
Speaker
There is misinformation, which is information that, even though it might be well intended, is misleading.
00:05:28
Speaker
And there is disinformation, which is information that is not only misleading and incorrect, but is created with the intent to misguide and mislead.
00:05:38
Speaker
And there's plenty of that as well in the COVID-19 pandemic and in the current environment.
00:05:45
Speaker
When you look at studies that have examined the themes, the vast majority of misinformation has been around miracle cures.
00:05:56
Speaker
And a big percentage of it has been around the new world order or the deep state.
00:06:03
Speaker
And then that the COVID-19 pandemic was actually a hoax.
00:06:07
Speaker
So a lot of these themes are obviously being updated.
00:06:09
Speaker
Now, I'm sure vaccines are also part of that.
00:06:13
Speaker
But just to give you an idea of some of the topics that people have shared and what has been classified as misinformation or blatantly false information that has really misled public opinion in this pandemic.
00:06:28
Speaker
However, the infodemic does not only apply to society.
Cognitive Biases in Scientific Interpretations
00:06:34
Speaker
It really applies to the scientific community as well.
00:06:37
Speaker
And you can see on the left a graph that shows the first six months of the pandemic and the rapidly, almost exponential growth of total COVID-19 publications.
00:06:49
Speaker
If you search PubMed with COVID-19 as your query, you will find that since 2020 there have been over 124,000 peer-reviewed, PubMed-indexed publications on COVID-19.
00:07:06
Speaker
That is an overwhelming amount of scientific information that clearly creates a tremendous struggle for clinicians who are dealing with a novel disease and an overwhelming number of patients.
00:07:21
Speaker
And I believe that we'll see that a lot of that information has really created some very interesting dynamics.
00:07:29
Speaker
So obviously, especially when you look at the ICU, why is it that some people
00:07:34
Speaker
have been so adamant that ivermectin works, and other people so adamant that ivermectin does not work?
00:07:40
Speaker
People have taken positions regarding therapies and even when evidence has emerged that suggests that those therapies may not work, they have really not changed their behavior.
00:07:51
Speaker
Or the converse, people are very adamant that something does not work very well, let's say steroids, and then even when there's evidence to suggest that it does work, they still want more evidence.
00:08:02
Speaker
And how is it possible that we have such polarizing views about the same piece of scientific publication so people can read the same article and have very different views in terms of what it really means?
00:08:15
Speaker
And that obviously has presented a tremendous challenge for our clinicians, but also I think it illustrates a lot of the cognitive processes that we have embedded in our wiring that are very, very hard to overcome.
00:08:30
Speaker
And what I hope to do today
00:08:32
Speaker
is to illustrate some of those processes so that we can recognize them and then apply maybe some techniques that will make us better decision makers both at the bedside and outside when we're trying to promote value and change.
Evaluating Decision-Making Processes
00:08:47
Speaker
So let's start with a couple of questions.
00:08:49
Speaker
So this is the Super Bowl a couple of years ago.
00:08:52
Speaker
New England was winning 28 to 24.
00:08:55
Speaker
Seattle had the ball at the one-yard line, basically very close to a touchdown.
00:09:02
Speaker
They had 26 seconds left in the fourth quarter and second and goal.
00:09:08
Speaker
And Pete Carroll calls a passing play. The pass is intercepted at the goal line.
00:09:15
Speaker
Tom Brady wins yet another ring.
00:09:17
Speaker
And on Monday, every single person who writes about football says that was a bad decision.
00:09:22
Speaker
We're not going to analyze that yet.
00:09:24
Speaker
I'm just going to take that now to the clinical arena.
00:09:27
Speaker
You have a patient with COVID-19 on Vapotherm.
00:09:31
Speaker
You decide to keep the patient on Vapotherm with escalating levels of oxygen; you delay intubation, or you think that the patient's okay; you start doing some proning, you give them some medications; and then eventually the patient starts to have more issues and you decide to finally intubate the patient. You intubate the patient, the patient codes, and the patient dies.
00:09:54
Speaker
Was that a good or a bad decision?
00:09:56
Speaker
So what's interesting is that in both of these situations that we've either experienced as spectators or experienced as protagonists in the ICU, the main thing that people will utilize to decide if this was a good or bad decision is the outcome.
00:10:11
Speaker
And that is the first thing that I want to talk about very briefly in terms of evaluating the quality of decisions based on outcomes.
00:10:19
Speaker
And if there's anything that you learned today, I would like you to learn that outcomes are not the best
00:10:25
Speaker
indicators of the quality of a decision and its process.
00:10:29
Speaker
And one of the reasons is because there is something called the outcome bias, which is judging a decision based on the outcome rather than on how exactly that decision was made in the moment.
00:10:41
Speaker
Just because you won a lot of money in Vegas doesn't mean that gambling your money was a smart decision.
00:10:47
Speaker
On the other hand, there's a lot to learn about decision making from professional poker players.
00:10:52
Speaker
And they recognize the element of luck
00:10:55
Speaker
And they never evaluate how they played a hand based on the outcome of that hand, at least the professionals who are really successful.
00:11:04
Speaker
They really think about how they were thinking in probabilistic terms.
00:11:08
Speaker
And this is very important because if you do the right thing but have a bad outcome, which is very common in the ICU, and you base the quality of your decision on the outcome,
00:11:20
Speaker
you will not do that same thing next time in a similar situation, and that might be detrimental to your patient.
00:11:25
Speaker
On the other hand, if you get lucky and you have a good outcome, that reinforces that same process, which could be harmful for your patient.
00:11:34
Speaker
So when you think about it, process always trumps outcome. If you have a good process and you have a good outcome, over time that's what's going to happen long-term, and that's inevitable.
00:11:48
Speaker
The more your process improves,
00:11:50
Speaker
the more likely you're going to have good outcomes.
00:11:52
Speaker
If you have a good process and a bad outcome, that's really related to luck.
00:11:57
Speaker
And that's a short-term effect.
00:11:58
Speaker
But if you recognize that it was a good process and you improve your process and continue to apply it over time, you will move toward the inevitability of good outcomes.
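As a rough numerical illustration of this process-versus-outcome point (the probabilities below are hypothetical, not from the talk), a decision with a favorable success rate can still lose in a short run, but the quality of the process shows through over many repetitions:

```python
import random

def simulate(p_success, trials, seed=0):
    """Fraction of good outcomes when the same decision is repeated `trials` times."""
    rng = random.Random(seed)
    good = sum(1 for _ in range(trials) if rng.random() < p_success)
    return good / trials

# A "good process": the decision is favorable 70% of the time (hypothetical number).
p = 0.70

# Short run: a handful of patients or poker hands; luck dominates the result.
print([round(simulate(p, 5, seed=s), 2) for s in range(5)])   # e.g. values ranging from 0.4 to 1.0

# Long run: repeated many times, the outcome converges toward the quality of the process.
print(round(simulate(p, 10_000), 3))                          # close to 0.70
```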
00:12:08
Speaker
On the other hand, if you have a very bad process or no process, which happens a lot of times when we talk about quality improvement or clinical thinking, and you have a good outcome, that's pure luck.
00:12:18
Speaker
And it's short term.
00:12:19
Speaker
And if you think that that's a good process, or that what you did was something you should continue to do just based on the outcome, then over time, long term, you will inevitably have
Adopting a Scientific Mindset in Medicine
00:12:29
Speaker
a greater share of bad outcomes.
00:12:31
Speaker
So one of the first things that we need to learn is that we tend to give tremendous weight to outcomes and evaluating decisions and evaluating behaviors.
00:12:41
Speaker
And we have to be careful because especially when we're dealing with a disease like COVID-19,
00:12:46
Speaker
Some patients will have a bad outcome.
00:12:48
Speaker
It doesn't mean that the process or the thought process was wrong.
00:12:52
Speaker
And on the other hand, just because we were lucky and had a good outcome doesn't mean that that's the right process.
00:12:57
Speaker
And we have to be able to evaluate our decisions based on other metrics.
00:13:02
Speaker
So some of the metrics that have been suggested for decisions in general include: are we using an appropriate frame?
00:13:09
Speaker
Are we framing the decision or the question we're trying to make in terms of the problem and what we need to achieve in the right way?
00:13:16
Speaker
Are we being creative about the alternatives?
00:13:18
Speaker
Are we trying to expand the options as opposed to narrow the options?
00:13:23
Speaker
Is the information that we're utilizing meaningful and reliable?
00:13:28
Speaker
And we'll talk a lot about bias because we all have biases.
00:13:31
Speaker
And if you don't check and debias yourself, those can mislead you systematically.
00:13:37
Speaker
Is there a clarity about the desired outcomes?
00:13:39
Speaker
Do we understand what are the acceptable trade-offs we're trying to make as we try to navigate this decision?
00:13:45
Speaker
Is there solid reasoning and sound logic? That, I think, is the most important thing.
00:13:49
Speaker
Just because you believe something works, and it then happens to work, doesn't mean that was a good decision.
00:13:54
Speaker
As physicians, we'll see that we wanna have a process that we can continue to improve so that in a situation like COVID, we continue to learn, but also with any decision that we make within the realm of our profession, that we can identify what works, what doesn't work, what is a good process, what's a bad process and keep refining that.
00:14:14
Speaker
Is there commitment to action as we make decisions?
00:14:17
Speaker
Did all the stakeholders take the necessary steps to achieve effective action?
00:14:21
Speaker
And that really applies to group decisions or decisions that affect obviously a team, which a lot of times are very relevant to what we do in the ICU.
00:14:32
Speaker
When we're thinking, most of the time, unfortunately, and this is something that I saw play out during COVID in social media, in the chat groups that teams had, in team meetings,
00:14:44
Speaker
in discussions in the hospital.
00:14:45
Speaker
One of the problems that we have, I think, in general as decision makers and as thinking human beings is that we usually are thinking along the lines of one of these three characters.
00:14:58
Speaker
Either we are in the preacher mode, in which case we believe we know the truth, we're absolutely sure we know the truth, and we're trying to evangelize that truth and share that idea with others.
00:15:10
Speaker
Or we're in the prosecutor mode,
00:15:12
Speaker
in which our only intent is to bring down the merits of somebody else's idea.
00:15:17
Speaker
So for example, if I believe that Ivermectin doesn't work, all I would do as a prosecutor mode is kind of poke holes to my colleague who says that Ivermectin actually works for COVID.
00:15:29
Speaker
And the problem with the preacher and the prosecutor mode is that we're really not open to rethinking our positions and nobody's infallible, no matter how smart you are.
00:15:38
Speaker
And that means that that puts you in a very risky position
00:15:40
Speaker
of being very rigid with the way you think and not being a very good learner, which is ultimately the only way that you become a more efficient decision maker.
00:15:49
Speaker
The third role is a politician who's basically in a mode where they're trying to play to a public or share what they think the public wants to hear.
00:15:59
Speaker
And obviously, that can be when we're trying to sway a group in one way or the other,
00:16:04
Speaker
And in COVID, it might be applied to what are the changes that we might need to do in a hospital, in an ICU.
00:16:10
Speaker
And again, when you're in politician mode, you're not so concerned with learning and asking questions, but more on appeasing people and trying to tell them or selling them something that you believe they want to hear or somebody else wants to hear.
00:16:25
Speaker
So all of these are present in every single one of us.
00:16:29
Speaker
If you really think about this, whether you're talking about treatments for COVID,
00:16:35
Speaker
what to do for a surge unit, differences in diversity, political issues, you probably revert to one of these.
00:16:44
Speaker
And these are not the best place to be as a clinician.
00:16:48
Speaker
What you really want to be is a scientist.
Daniel Kahneman’s Cognitive Systems Theory
00:16:50
Speaker
And a scientist starts with that circle in the middle, recognizing that the things they know they know are the smallest circle.
00:17:02
Speaker
Then there are the things that they know, which are
00:17:04
Speaker
a little bit larger.
00:17:05
Speaker
There's things they think they know.
00:17:08
Speaker
But the most important thing of being a true scientist is that the things you do not know is the biggest circle by far.
00:17:16
Speaker
And that really leads to one of the critical, critical aspects of being a better decision maker and being a better learner, which is intellectual humility.
00:17:27
Speaker
If you are humble from that perspective and recognize that no matter how much you know,
00:17:32
Speaker
there is still significantly more that you don't know, that leads to doubt.
00:17:38
Speaker
Doubt can lead to curiosity, which is about asking questions and not giving answers.
00:17:43
Speaker
And finally, with the right questions, we can lead to discovery of what is better, what moves things forward, and what makes us better.
00:17:52
Speaker
So the goal as decision makers, especially under uncertainty, is not to behave like a preacher, not to behave
00:18:02
Speaker
like a politician, not to be a prosecutor, but when we're debating ideas, when we are trying to learn, when we're trying to make big decisions, the goal is to be like scientists and ask questions, doubt, be humble.
00:18:17
Speaker
That is what I want you to think about today.
00:18:20
Speaker
And in order to get to that point, what we'll do is we'll start by looking at the cognitive process in terms of the how, how we think, how we all think,
00:18:30
Speaker
recognize some of the limitations that we have in the way we're wired.
00:18:34
Speaker
And then finally talk about the what or what are the steps that you can take to make better decisions at the bedside and outside of that arena.
00:18:43
Speaker
So as a rapid fire exercise, I just want you to think about this as quickly as you can.
00:18:48
Speaker
A bat and a ball cost $1.10 in total.
00:18:51
Speaker
The bat cost $1 more than the ball.
00:18:53
Speaker
How much does the ball cost?
00:18:55
Speaker
And probably the first thing that came to mind is 10 cents.
00:19:01
Speaker
If you think for a little bit, right: if the ball is 10 cents and the bat costs $1 more, the bat is $1.10, so the total is $1.20.
00:19:08
Speaker
So actually, the right answer is 5 cents.
00:19:10
Speaker
And the bat cost $1 more, so that's $1.05 for a total of $1.10.
00:19:14
Speaker
So if you try to answer that question as quickly as possible, you first fall on 10 cents.
00:19:29
Speaker
And then when you stop and think about it, you can kind of recognize that you were wrong.
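For readers who want the arithmetic spelled out, here is a short worked version of the bat-and-ball algebra (added for clarity; not part of the original audio):

```latex
% Let x be the price of the ball in dollars; the bat then costs x + 1.
\begin{aligned}
  x + (x + 1) &= 1.10 \\
  2x &= 0.10 \\
  x &= 0.05
\end{aligned}
% So the ball costs $0.05 and the bat $1.05, totaling $1.10,
% whereas the intuitive answer of $0.10 would make the total $1.20.
```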
00:19:35
Speaker
And that is because it's been demonstrated, and this is actually from Daniel Kahneman's research, and he popularized this in a phenomenal book, Thinking, Fast and Slow, highly recommended, where he talks about two systems.
00:19:51
Speaker
System one, which is intuition and instinct, and 95% of the time we're operating in system one.
00:19:58
Speaker
It's unconscious, fast, associative, automatic pilot.
00:20:02
Speaker
Now, an example of system one might be driving, and obviously a Formula One driver probably has much better intuition and can navigate much more difficult circumstances without activating system two than somebody who's just learning to drive.
00:20:15
Speaker
System two is rational thinking.
00:20:18
Speaker
So it's only 5% of the time; it takes effort, it's slow, it's logical thinking, it's lazy, and it's indecisive.
00:20:25
Speaker
That's where doubt is generated.
00:20:26
Speaker
And a lot of times,
00:20:28
Speaker
We need system one to survive.
00:20:30
Speaker
We have so many inputs coming at us just to navigate daily life and to get from point A to point B. If you use system two for everything, you would be paralyzed by analysis and you couldn't do anything.
00:20:41
Speaker
However, when important decisions are influenced by system one, we can be led astray.
00:20:47
Speaker
And that is what we need to recognize.
00:20:49
Speaker
What is system one?
00:20:50
Speaker
What is system two?
00:20:51
Speaker
And when do we have to engage system two to really check our biases and think or rethink what we're about to decide?
00:20:59
Speaker
There's a pyramid of decision approaches that has been popularized in many decision science forums and publications, which really starts, at the bottom of the pyramid, with intuitive judgments, things that we can immediately decide with system one.
00:21:14
Speaker
And as you gain expertise in certain areas,
00:21:18
Speaker
you might be able to make more decisions based on intuition.
00:21:21
Speaker
But the problem is that when the situations change, that intuition might lead you astray.
00:21:26
Speaker
You can also create rules and shortcuts.
00:21:28
Speaker
An example might be that you basically look at a patient and decide whether they go to the ICU, and you do it on intuition, on how the patient looks, or you might have some rules and shortcuts.
00:21:38
Speaker
If the patient has these conditions, they should go to the ICU.
00:21:41
Speaker
If they have these vital signs, they're probably okay to go to another floor.
00:21:45
Speaker
You can do importance weighing, which is trying to assimilate a little bit more information before making decisions. And then finally, for really big decisions, what you want to do is value analysis and really take the time to think about what's the right thing to do. So the same thing applies here in terms of caring for patients: there might be different levels of this decision pyramid that we use, and not everything needs to go to value analysis. But for difficult decisions
00:22:09
Speaker
where uncertainty is clearly a significant portion of the possible outcome, I think that we need to engage system two a little bit more in terms of making sure that we are checking our biases and don't go just with our gut feeling.
Heuristics and Cognitive Biases
00:22:25
Speaker
I want to talk a little bit about judgment, heuristics, and biases, which obviously are an inherent part to our decision making.
00:22:33
Speaker
and are very important for us to not only recognize but understand what they are and how they apply to a clinical decision but also other decisions related to critical care.
00:22:43
Speaker
So judgments are the human ability to infer, estimate, and predict the characteristics of unknown events.
00:22:54
Speaker
Judgments are automatic assumptions that individuals might have about aspects of a decision.
00:22:59
Speaker
So you have value judgments, right?
00:23:01
Speaker
So for example, I think
00:23:04
Speaker
I think that ivermectin is a good drug.
00:23:07
Speaker
And you have probabilistic judgments.
00:23:09
Speaker
I think that it is very likely that this patient will die.
00:23:12
Speaker
Those are both judgments that we make all the time when we're making decisions.
00:23:18
Speaker
And recognizing them as judgments is important because we then have to explore a little bit what informs that judgment.
00:23:24
Speaker
Is it reliable data?
00:23:26
Speaker
Is it a heuristic?
00:23:29
Speaker
So what is a heuristic?
00:23:30
Speaker
A heuristic is a mental shortcut that allows people to solve problems and make judgments quickly and efficiently.
00:23:36
Speaker
And a heuristic was involved when I showed you the ball and the bat problem.
00:23:40
Speaker
You immediately made an association and a quick shortcut.
00:23:45
Speaker
And the problem is that many times when we use these heuristics, they can be embedded with systematic biases.
00:23:51
Speaker
And that's what's known as a cognitive bias, which is a systematic error in thinking.
00:23:56
Speaker
that occurs when people are processing and interpreting information in the world around them and affects decisions and judgments that they make.
00:24:04
Speaker
So we all need heuristics to navigate the world.
00:24:07
Speaker
The world's too complex.
00:24:09
Speaker
There's too many inputs at every given time.
00:24:12
Speaker
And without these shortcuts, it would be impossible for us to function and do all the things we do.
00:24:17
Speaker
However, because of these heuristics, there's a price.
00:24:20
Speaker
And that price is that there's a possibility of cognitive bias.
00:24:24
Speaker
Now, bias sometimes can be
00:24:26
Speaker
very minimal and not have a great impact, but bias also can have significant impacts on a lot of our decisions at the bedside and elsewhere.
00:24:36
Speaker
This is an infographic that I found fascinating.
00:24:39
Speaker
You can check it out and explore it.
00:24:42
Speaker
But basically, it's an infographic that shows you, if not all, at least a great majority of the described cognitive biases.
00:24:51
Speaker
And as you can see from this slide, there are a lot of them.
00:24:55
Speaker
Not only are there a whole bunch of biases, but the reality is that, without any doubt, 100% of the people listening to this presentation today are full of cognitive biases.
00:25:12
Speaker
They're part of our life every day.
00:25:14
Speaker
And recognizing some of the more common ones when they might actually cause a problem is what we're trying to do as we rethink and become more scientists
00:25:24
Speaker
than preachers, politicians, or prosecutors.
00:25:29
Speaker
So these are four of the most common biases that are evident and available in daily life every single moment.
00:25:39
Speaker
There's the anchoring bias, which really is a tendency we have to give more value to information that is presented to us initially or more vividly.
00:25:51
Speaker
So the first thing we hear has more weight than what we think later.
00:25:55
Speaker
And this has been studied, simple experiment.
00:25:58
Speaker
If you give a group of people problem A and you ask them to estimate the result very quickly without doing the actual math versus giving people option B and asking them to estimate the result, consistently over and over again, people in group B will have a higher estimate than people in group A. Why?
00:26:21
Speaker
Because you're anchoring to that first number one.
00:26:25
Speaker
So you're thinking that the result will be lower.
00:26:27
Speaker
You're anchoring to that first number eight.
00:26:30
Speaker
You think the result will be higher.
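The experiment being described matches Tversky and Kahneman's classic sequential-multiplication task; the slide itself isn't in the audio, so the specific problems and the reported estimates below come from the published experiment rather than from the talk:

```python
from math import prod

# Estimate each product in a few seconds, without doing the arithmetic.
problem_a = [1, 2, 3, 4, 5, 6, 7, 8]   # ascending: the anchor is the small leading numbers
problem_b = [8, 7, 6, 5, 4, 3, 2, 1]   # descending: the anchor is the large leading numbers

print(prod(problem_a), prod(problem_b))   # 40320 40320 -- the true product is identical
# In the published experiment, median quick estimates were roughly 512 for the ascending
# version and 2,250 for the descending version: both far too low, and ordered by the anchor.
```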
00:26:32
Speaker
You see anchoring in negotiations.
00:26:35
Speaker
The first number that is thrown out there is the number around which the negotiation is probably going to evolve.
00:26:42
Speaker
You see anchoring, for example, in retail.
00:26:46
Speaker
You might see something for $100 and you're not interested in buying it.
00:26:49
Speaker
Then you see that it was $200, same item, but now it's slashed at 50% at $100.
00:26:55
Speaker
Same amount of money, but because the anchor was different, you're much more likely to buy the discounted $100 item than the same item at a regular price of $100.
00:27:04
Speaker
And this has been studied over and over again, and we are prone to anchoring.
00:27:09
Speaker
How does it impact clinical care?
00:27:11
Speaker
If I get a call from the ED and they say, blah, blah, blah, patient with COVID-19 pneumonia, blah, blah, blah, then when they come up and maybe they have hypotension or they have a
00:27:24
Speaker
some other problem, I'm anchored to the COVID-19 diagnosis and I might not be able to think that this might be just something much more common, like ischemic heart disease.
00:27:33
Speaker
So again, think about this all the time in terms of anchoring, the anchoring bias.
00:27:39
Speaker
Availability bias is really, really prevalent.
00:27:43
Speaker
Again, we recall or give more credence to what is more vividly available to us.
00:27:49
Speaker
So that usually is what's more dramatic.
00:27:52
Speaker
So for example, and I'll show some examples, but if you were to go and see the movie Jaws when you were young, you would be terrified of getting into the water because of sharks, yet you're much more likely to die from drowning than from a shark attack.
00:28:07
Speaker
But it's very easy to think of and recall that image of a shark attacking somebody.
00:28:13
Speaker
And again, what's covered in the news becomes available to everybody and it doesn't represent what's happening in the real world.
00:28:21
Speaker
Whatever is more available to us, like a terrorist attack, gives us much more concern than getting in a car and having a car accident, which is a much more likely cause of death.
00:28:31
Speaker
So availability also happens all the time in the clinical context.
00:28:36
Speaker
And I think one of the greatest examples that I have talked with people about is when we try to gauge the mortality of COVID-19 in our ICUs.
00:28:43
Speaker
People remember the last patient they coded or the week when a whole bunch of people died.
00:28:48
Speaker
And that is the image that comes up when you think of how many people died.
00:28:51
Speaker
And really the sense was that all these people were dying.
00:28:54
Speaker
Now, clearly there's been a lot of people who have died from COVID.
00:28:58
Speaker
But when you look at the percent of people who have died from COVID in the ICUs, it's much lower than what people really felt it was.
00:29:05
Speaker
And that again has to do with the way we're wired and that availability bias.
00:29:09
Speaker
Overconfidence bias is very common.
00:29:12
Speaker
We tend to overestimate our own ability to do something, to successfully extubate a patient, to successfully start a program, to make some changes in the ICU for better flow.
00:29:26
Speaker
And it happens all the time.
00:29:27
Speaker
And we have to find ways to temper that overconfidence because it happens in the business world, in the quality world, but also in the clinical world.
00:29:36
Speaker
And it's something that is, again, very, very prevalent in all arenas.
00:29:42
Speaker
Confirmation bias is the fourth bias that I think is very important.
00:29:45
Speaker
And really it's about believing what confirms what you already believe.
00:29:50
Speaker
So for example, if I really believed that hydroxychloroquine worked, as soon as I saw any article, any study, even one with 20 patients, that suggested that it did work, I gave a lot more weight to that study than somebody who believed it didn't work would.
00:30:08
Speaker
On the other hand, when a paper came out
00:30:12
Speaker
that showed that in a large database, it doesn't work.
00:30:16
Speaker
Those who believed it didn't work gave a lot of credence to that paper, which was shortly thereafter retracted as fraudulent.
00:30:25
Speaker
So that gave, again, ammunition to those who believed it worked.
00:30:28
Speaker
And the fact that we tend to just gravitate towards what we already believe is the same reason why, when you look at a specific study, some people think it's a positive study and some people think it's a negative study.
00:30:41
Speaker
And the reality is that the data is the same, but what's different is how that confirmation bias is already playing into the way of thinking.
00:30:49
Speaker
And this has been something that we have seen all along.
00:30:52
Speaker
People who believe that a certain drug works, if they give it and the patient does okay, they think it works great.
00:30:59
Speaker
People who believe it doesn't work, they give it and the patient died, that's confirmation for them that this is a waste of time.
00:31:06
Speaker
And the reality is that neither of those two situations could really tell us
00:31:10
Speaker
if that worked or didn't work.
00:31:17
Speaker
So this is a great example.
00:31:19
Speaker
Let's identify the bias.
00:31:21
Speaker
I did six experiments, but only one supports our hypothesis.
00:31:25
Speaker
Well, let's only include that experiment in the grant proposal.
00:31:30
Speaker
What about the other five?
00:31:32
Speaker
So that's typical confirmation bias.
00:31:35
Speaker
It's exactly what we see when people quote a study that supports their view
00:31:39
Speaker
as opposed to really looking at what is available, what is out there, and weighing those different studies to really try to come to a conclusion, regardless of whether it confirms or not what we believe.
00:31:51
Speaker
Another very recent example, we've heard all over the news about the J&J vaccine and possible blood clots.
00:31:58
Speaker
Now, when you look at the risk of those blood clots, if they were caused by the J&J vaccine, it would be about six cases in almost 7 million doses, less than one in a million.
00:32:09
Speaker
That is lower than the risk of a clot from birth control pills, lower than from cigarette smoking, and much lower than from COVID infection itself.
00:32:16
Speaker
Yet because of this availability bias, people now are going to be very hesitant to get the J&J vaccine.
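To put the quoted figures in perspective, here is a quick back-of-the-envelope calculation; the six cases and roughly 6.8 million doses are the numbers reported around the April 2021 pause, and the contraceptive comparison is an order-of-magnitude figure added here for scale, not from the talk:

```python
# Reported CVST cases after the J&J vaccine at the time of the April 2021 pause.
cases = 6
doses = 6_800_000  # approximate number of doses administered at that point

per_million = cases / doses * 1_000_000
print(f"{per_million:.2f} cases per million doses")   # roughly 0.9 per million

# For scale (order of magnitude only): venous clot risk on combined oral contraceptives
# is commonly cited at several hundred events per million users per year.
```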
Noise and Bias in Decision-Making
00:32:25
Speaker
So again, what we present to people, what's dramatic, what's immediately available, conditions the way we think about decisions, regardless of what the numbers or the data would support.
00:32:43
Speaker
Another important aspect, as we navigate through how we think, how we make judgments, and how they're informed by heuristics and biases, is: what are the sources of error in decisions?
00:32:55
Speaker
And Kahneman and his team really talked about two important sources of error, noise and bias.
00:33:00
Speaker
Noise is made up of mistakes that happen because basically we're not good at something, don't have the skills, or just made a mistake.
00:33:06
Speaker
And biases are systematic errors
00:33:10
Speaker
that actually are introduced to the decision because of the way we process information, because of the way we think.
00:33:16
Speaker
And a great way of thinking of the difference between noise and bias is to think of an archer.
00:33:22
Speaker
So the archer in example A, on the left, is very accurate.
00:33:28
Speaker
He can hit the bullseye four out of four times.
00:33:33
Speaker
Archer B is noisy.
00:33:35
Speaker
He's all over the place, but there seems to be no pattern.
00:33:37
Speaker
and he did not hit the bullseye at any point; he's all over the place.
00:33:42
Speaker
Archer C seems to not be able to hit the bullseye, but is much more consistent in a certain area, and that's what bias is.
00:33:50
Speaker
It leads you in one direction, but it's still not accurate.
00:33:55
Speaker
And then finally, in the example in D, you have both a noisy and biased archer.
00:34:03
Speaker
So that's the difference between noise and bias.
00:34:05
Speaker
And obviously, you improve noise,
00:34:08
Speaker
or rather you decrease noise, by developing a better process or getting better information, by learning, by knowing.
00:34:16
Speaker
That's how you decrease noise.
00:34:17
Speaker
But in order to decrease bias, you need to improve your process and how you think about things.
00:34:22
Speaker
So we'll talk a little bit more about that.
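As a minimal sketch of the archer analogy (all numbers hypothetical), bias can be modeled as a fixed offset applied to every shot, while noise is random scatter around the aim point; reducing each one requires a different fix:

```python
import random

def shoot(n, bias, noise, seed=0):
    """Simulate n shots at a bullseye located at (0, 0).

    bias  -- a systematic (x, y) offset added to every shot, same direction each time
    noise -- standard deviation of the random scatter around the aim point
    """
    rng = random.Random(seed)
    return [(bias[0] + rng.gauss(0, noise), bias[1] + rng.gauss(0, noise)) for _ in range(n)]

accurate         = shoot(4, bias=(0, 0),  noise=0.1)  # Archer A: tight cluster on the bullseye
noisy            = shoot(4, bias=(0, 0),  noise=2.0)  # Archer B: scattered, no pattern
biased           = shoot(4, bias=(2, -1), noise=0.1)  # Archer C: consistent, but consistently off
noisy_and_biased = shoot(4, bias=(2, -1), noise=2.0)  # Archer D: both scattered and shifted

# Decreasing noise means tightening the scatter (better skill, better information);
# decreasing bias means removing the systematic offset (a better, debiased process).
```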
Framework for Better Decisions in Critical Care
00:34:29
Speaker
So the last portion of our talk is about a framework for making better decisions.
00:34:34
Speaker
And when I think about decisions, obviously as clinicians,
00:34:38
Speaker
I do believe that the bedside is important.
00:34:40
Speaker
It's very critical, especially with what we went through in the last 12 months, where we have to make decisions without all the good information, but we're also trying to learn.
00:34:49
Speaker
But it also applies to what we are asked to do today as intensivists, which is to create value.
00:34:55
Speaker
In order to create value, we will have to make decisions about change, about new programs, about how to change the way we do things, about innovation.
00:35:04
Speaker
And when we make those decisions,
00:35:06
Speaker
there's always going to be an element of probability.
00:35:09
Speaker
There's uncertainty.
00:35:10
Speaker
So having a process for improving our decision-making, I believe is critical in our journey, not only to become better physicians and be fulfilled with our work and provide our patients with the best care, but also in our journey as creators of value for critically ill patients.
00:35:30
Speaker
The first thing that I want to share with you is
00:35:33
Speaker
a little framework that I believe is very valuable to think like a decision scientist.
00:35:40
Speaker
So we talked about being more of a scientist than a preacher, a politician, or a prosecutor.
00:35:46
Speaker
But the reality is you also want to follow decision science and have a couple of steps.
00:35:52
Speaker
And for those of you who know me, three steps seems like the perfect number.
00:35:57
Speaker
So I would say the three steps are: define, identify, and mitigate.
00:36:03
Speaker
So we'll see what each one of these means and how they can influence you in making better decisions.
00:36:08
Speaker
The first step is to define the decision problem.
00:36:11
Speaker
So you wanna state your key decision problem.
00:36:13
Speaker
And I think that a big mistake that we make all the time is that we fall into these either-or fallacies, where it's either I do this or that, right?
00:36:23
Speaker
And actually, that's a very narrow approach.
00:36:27
Speaker
So what you wanna do is you wanna broaden your approach and really start by defining the goals
00:36:32
Speaker
and the values that you're trying to pursue.
00:36:34
Speaker
So with patient care, a lot of times, obviously it's the best patient care, but there's also other values that might be important in terms of transparency, being very clear, ethically fair, cost efficient.
00:36:49
Speaker
So you want to define your decision problems.
00:36:51
Speaker
What are you trying to achieve?
00:36:53
Speaker
Not from should we do this or that, right?
00:36:56
Speaker
But in terms of what's our main goal?
00:36:59
Speaker
What are some of the values that we want to achieve?
00:37:02
Speaker
And how can we get there?
00:37:04
Speaker
There's also a lot that's been written about identifying the optimal process goal.
00:37:08
Speaker
And the process goal means what is going to drive your approach to decision making.
00:37:14
Speaker
So for example, you can either maximize or minimize.
00:37:21
Speaker
You can maximize accuracy, which is very important for getting the right diagnosis, getting the right treatment.
00:37:26
Speaker
You can maximize transparency, which might be very important in terms of sharing
00:37:31
Speaker
some data with a team, or sharing information with patients.
00:37:36
Speaker
You want to minimize effort.
00:37:38
Speaker
So sometimes you want to simplify some of the steps that we do in the hospital, or you want to minimize emotional strain, which means that it's very easy to make that decision.
00:37:47
Speaker
So, for example, minimizing emotional strain and minimizing effort could mean that you set a rule: if this happens, this is what we do as a team.
00:37:58
Speaker
If this happens, this is what I do as an individual.
00:38:00
Speaker
And for some type of decisions, that works very well.
00:38:03
Speaker
For other types of decision, that's not the best process.
00:38:07
Speaker
And you might want to have a process to maximize accuracy.
00:38:10
Speaker
And when trying to maximize accuracy, it's very important not only to get as much information as possible, but also to make sure that you follow other steps to debias your view, to identify your goals, and to move to the level where you can really do a value analysis of
00:38:29
Speaker
what's most likely to give you the best outcome.
00:38:34
Speaker
The second step is to identify.
00:38:36
Speaker
And the two things that you need to identify are judgments and biases.
00:38:39
Speaker
So if you are thinking to yourself, or if somebody's talking about something with you as part of a team, you have to identify what are some of the judgments that they're making.
00:38:48
Speaker
So for example, "it will take three months to fully recruit for this program" is a judgment, right?
00:38:55
Speaker
That might be true or might not be true, but it's important to ask,
00:38:58
Speaker
Are there any biases that are informing that judgment or where does this come from?
00:39:03
Speaker
It might come from the fact that the last time they did it, it took three months, but you might not be accounting for the fact that it was a different location or a different situation in the pandemic.
00:39:11
Speaker
Another typical judgment might be about cost.
00:39:15
Speaker
We will need X amount of dollars for this project.
00:39:17
Speaker
Now, that might be based on a guess, might be based on wishful thinking, might be based on facts.
00:39:23
Speaker
We don't know, but it's a judgment.
00:39:25
Speaker
Judgment about milestones, right?
00:39:27
Speaker
If you're thinking of buying a house, some people think it's better to buy a house than to keep on renting.
00:39:32
Speaker
Or you could have judgments in terms of milestones about your career.
00:39:35
Speaker
I will have more opportunities if I complete an MBA as a physician.
00:39:39
Speaker
So these are all judgments that might be informed by good information, but we have to be careful because a lot of times if we don't probe a little bit deeper, we might not recognize what are some of the biases that are informing or driving those judgments.
00:39:54
Speaker
And ultimately, that is what we want to do.
00:39:56
Speaker
We want to identify judgments and biases.
00:39:59
Speaker
So for example, we're talking about mortality in the ICU or mortality from COVID, recognize that availability bias.
00:40:07
Speaker
If we're negotiating or we're working on numbers, recognize that anchoring bias.
00:40:11
Speaker
If somebody sends you a patient with a label and you see something that's out of place, recognize that anchor and say, maybe it's narrowing my view of this patient because I'm anchored to that initial diagnosis.
00:40:23
Speaker
Same thing with overconfidence.
00:40:25
Speaker
If I think that I can achieve this in three months, what is that based on?
00:40:29
Speaker
Is that realistic?
00:40:31
Speaker
Poke some holes, create some doubt.
00:40:33
Speaker
That's why we want to identify our judgments.
00:40:35
Speaker
And then with confirmation bias is the same thing.
00:40:37
Speaker
If you already think that you're going to like somebody and then they say something positive, that confirms your bias and you kind of start narrowing your opinion about what could be negative about that person or that situation, that treatment, that decision, whatever it is.
00:40:53
Speaker
Step two is to identify our judgments and biases.
00:40:56
Speaker
And why do we do that?
00:40:57
Speaker
Well, before I go there, I'll just share some other, I think, important biases that I think are relevant to our world in terms of trying to create value in the ICU.
00:41:08
Speaker
You have the halo effect.
00:41:09
Speaker
So the positive impressions of people or brands in one specific area influence additional positive feelings in a different area.
00:41:15
Speaker
So if you trust somebody as being very knowledgeable in some area, they might have an opinion
00:41:21
Speaker
that perhaps is or is not valid, but that other people appreciate and value more because of that halo effect.
00:41:29
Speaker
The sunk cost fallacy is really very, very prevalent.
00:41:34
Speaker
It is the desire to follow through on a task or project when we feel we have already invested a lot of resources in it, regardless of cost and benefit.
00:41:42
Speaker
When we're trying to get something or do something, we have invested a lot of time and effort in it.
00:41:49
Speaker
It's very hard to really think of the net present value of that intervention or that program because that sunk cost fallacy drives us to keep doing it, keep doing it, even though the best option might be to stop and move on.
00:42:05
Speaker
Affect-related bias, using emotional perception to judge risk.
00:42:09
Speaker
So if we feel positive about something, we will consider it lower risk.
00:42:14
Speaker
If we feel negative about something, we might consider the risk to be higher.
00:42:18
Speaker
And this happens a lot of times, it has happened with COVID, for example, right?
00:42:23
Speaker
We perceived the risk of getting COVID to be super high at the beginning.
00:42:28
Speaker
And the way we felt about a procedure like intubation in terms of risk was very different from the way perhaps people feel now, with PPE and with vaccines.
00:42:36
Speaker
But I think that we have to still be very careful.
00:42:40
Speaker
Status quo bias, very common.
00:42:44
Speaker
When given the choice between actively making a change and leaving things in the default state, people tend to stick to the default state.
00:42:50
Speaker
And that is true for protocols, right?
00:42:53
Speaker
So if anything is pre-checked, it's more likely to happen.
00:42:56
Speaker
It's also true when we're trying to make change.
00:42:59
Speaker
It's true with many, many, many areas in our life.
00:43:02
Speaker
And even though we all recognize that change is prevalent in medicine and accelerating post-COVID,
00:43:11
Speaker
none of us really like change, and we like the status quo.
00:43:14
Speaker
Groupthink, the bandwagon effect, right?
00:43:17
Speaker
As people hear that more and more people have an opinion, they forego their individual evaluation and favor that majority opinion.
00:43:24
Speaker
And we've seen that groupthink with COVID.
00:43:26
Speaker
People said, we've got to do this because that's what they're doing in Italy, at the beginning.
00:43:30
Speaker
And everybody started kind of jumping on that bandwagon.
00:43:32
Speaker
You have to be very careful.
00:43:34
Speaker
You have to evaluate things for yourself and really try to look at your biases, other people's biases,
00:43:40
Speaker
but also the available information.
00:43:44
Speaker
And then the sunflower bias is a tendency of employees to follow the ideas or opinions of a person in power.
00:43:50
Speaker
So if you are a leader, you have to be very careful with that, because the last thing you want is people always agreeing with what you say.
00:43:56
Speaker
If people are not questioning what your ideas are, it could be a problem.
00:44:00
Speaker
And the higher the person is in the leadership ladder, the more likely that sunflower bias is to be prevalent when they're making decisions with people around them.
00:44:12
Speaker
So the final portion of this is to mitigate bias.
00:44:16
Speaker
And we have the example here of the archer.
00:44:18
Speaker
You have a biased archer.
00:44:20
Speaker
And there's basically two ways of mitigating bias.
00:44:24
Speaker
You can move the archer, which is changing the individual.
00:44:28
Speaker
So that's the debiasing technique.
00:44:30
Speaker
And most of what we're talking about today is going to be that.
00:44:34
Speaker
Or you could move the target and bring it down a little bit to the right, right?
00:44:39
Speaker
And that would change the environment.
00:44:40
Speaker
That's choice architecture.
00:44:42
Speaker
Choice architecture is what we call nudges, and that's a topic for a whole other talk.
00:44:48
Speaker
But just to give it very basically, two examples, one in medicine, one out of medicine.
00:44:52
Speaker
You can change the choice architecture.
00:44:56
Speaker
You can change the choice architecture, for example, in a cafeteria: if you want people to eat healthy food, you can put the healthy food in front of them or make it very accessible, and that would by default make more people choose the healthy option.
00:45:13
Speaker
Another way of doing choice architecture would be in a protocol.
00:45:17
Speaker
If you want people to get 30 mls per kilogram for sepsis, you make that pre-checked.
00:45:21
Speaker
So that by itself will increase the number of people who get that bolus.
00:45:26
Speaker
And that works because it plays into the status quo bias and other biases that really lead the default mode to be the mode that moves forward.
00:45:35
Speaker
But let's talk more about the debiasing techniques.
00:45:38
Speaker
The debiasing techniques are about moving the individual, or the archer.
00:45:43
Speaker
So for the anchoring bias, where an individual anchors on the wrong factor or does not sufficiently adjust from that anchor, what you need to do is question the anchor.
00:45:53
Speaker
If it's initial diagnosis, question the diagnosis.
00:45:57
Speaker
Re-anchor, create a broader diagnosis, or consider the opposite.
00:46:01
Speaker
So if somebody is anchored on one extreme, consider the opposite.
00:46:05
Speaker
Availability bias, when you recognize that,
00:46:08
Speaker
your estimates of the probability of an event are based on ease of recall or frequency, you should question the source.
00:46:14
Speaker
Ask for additional examples.
00:46:16
Speaker
Use checklists, right?
00:46:17
Speaker
So that's a great example.
00:46:18
Speaker
For example, when we talk about mortality or the risk of something, really try to look at what are other examples.
00:46:26
Speaker
Look at the source.
00:46:27
Speaker
Use a checklist to avoid going through that.
00:46:30
Speaker
And it's another way of de-biasing ourselves in terms of making decisions just based on that availability.
00:46:36
Speaker
Confirmation bias.
00:46:37
Speaker
That is selectively searching for evidence that confirms our beliefs; the remedy is to consider the opposite.
00:46:42
Speaker
So if you were a big believer that steroids would not work, you should argue to yourself in what situations would it work?
00:46:49
Speaker
What would you need to change your mind?
00:46:51
Speaker
How could they work?
00:46:53
Speaker
And when you consider the opposite, it opens the possibilities, but also it de-biases you by forcing you to think, well, what would it take for me to change my mind?
00:47:02
Speaker
So at the beginning of the pandemic, we had talked about steroids.
00:47:05
Speaker
And I had said that probably based on the literature from SARS and from other viral pneumonias, that we should not be giving steroids to everybody.
00:47:13
Speaker
Then we kind of moved to giving steroids to the sickest people with ARDS based on one study.
00:47:17
Speaker
And then obviously we had several studies in COVID that seemed to indicate that perhaps a broader approach with steroids to those requiring oxygen might improve mortality.
00:47:26
Speaker
So again, I mean, you move if you are able to consider the opposite and reduce that ambiguity by basically saying, what would it take for me to change?
00:47:35
Speaker
And if you have evidence and studies that suggest that your bias was incorrect, that should be something that helps you change.
00:47:44
Speaker
Another de-biasing technique is to take an outside view.
00:47:48
Speaker
So a lot of times we're arguing about something within our group and maybe thinking, well, what would the patient think about what we're talking about?
00:47:56
Speaker
What would the family think about what we're talking about, the changes that we're trying to do?
00:48:00
Speaker
So take an outside view.
00:48:01
Speaker
that kind of maybe extracts some of those biases and allows you to look at it from a different angle.
00:48:07
Speaker
And finally, the overconfidence bias, which is the overestimation of the probability of an event, of your skills, or of your ability to impact future outcomes, taking credit for past outcomes, or neglecting chance.
00:48:18
Speaker
This is very, very common among all of us.
00:48:21
Speaker
So when you're very confident, think of how outside people would look at it.
00:48:27
Speaker
Consider the opposite. If
00:48:30
Speaker
you're very confident that this is the right thing to do,
00:48:33
Speaker
maybe you should consider how it could be that the opposite would be the right thing to do.
00:48:38
Speaker
Use algorithmic assistance.
00:48:40
Speaker
So we have found that experts who have to make decisions are usually better at making those decisions when they're aided by an algorithm.
00:48:49
Speaker
So in some situations, using algorithms to enhance your decision, an expert's decision, might be very helpful.
00:48:56
Speaker
And that definitely applies to medicine.
00:48:59
Speaker
As you're trying to make big changes and maybe big decisions, conducting a pre-mortem might be a great exercise, which I'll share with you very briefly; it's basically the opposite of a post-mortem.
00:49:11
Speaker
So in a post-mortem or in a root cause analysis, something went bad and we try to figure out why.
00:49:16
Speaker
In the pre-mortem, you haven't made that decision yet.
00:49:20
Speaker
So you start by imagining the worst possible outcome.
00:49:23
Speaker
How could this project fail?
00:49:24
Speaker
What were the warning signs?
00:49:26
Speaker
What are our biggest regrets?
00:49:28
Speaker
You come up as a group with the worst things that could happen with this change, and you make it as extensive as possible.
Pre-Mortem Exercise for Risk Mitigation
00:49:33
Speaker
Once you have the worst possible scenarios, you perform a risk analysis of which one of these is more likely to happen and what the possible consequences would be.
00:49:43
Speaker
And then what you do as a group is you review and revise, and you use the information that you've created with the pre-mortem to improve your project plan, maybe change the options, or maybe
00:49:55
Speaker
put steps in place to mitigate the likelihood of those bad outcomes.
00:50:01
Speaker
So this is actually a great exercise to do as a team.
00:50:04
Speaker
It doesn't take a lot of effort.
00:50:05
Speaker
And if you really have a safe team and people can really become very creative, you can really paint a very dim picture of that future decision, but that can be very useful in terms of identifying what is more likely and what are the things that we can do to prevent it.
00:50:21
Speaker
And this you can do also for decisions at the bedside sometimes
00:50:25
Speaker
or decisions that involve your program.
00:50:29
Speaker
Finally, just to close, a couple additional things.
00:50:32
Speaker
We had talked in previous webinars about this idea of sensible medicine, which is basically medicine with good judgment.
00:50:39
Speaker
And you have to kind of combine the skepticism of new evidence with the pace of knowledge translation that we're seeing in COVID and navigate between being a hawk and doing all the new things without evidence
00:50:51
Speaker
and being a nihilist, which means that I want perfect evidence for everything.
00:50:55
Speaker
So obviously we have to make decisions at the bedside and elsewhere without all the good information.
00:51:01
Speaker
So practicing sensible medicine or sensible judgment is ultimately where we want to be.
Sensible Medicine and Timely Decisions
00:51:08
Speaker
And for that, you think again, think like a decision scientist, apply this framework, define, identify, and mitigate.
00:51:15
Speaker
Number two, elevate usual care.
00:51:18
Speaker
If anything, we have learned from this pandemic that the things that are usual care that are evidence proven should be done.
00:51:24
Speaker
We should keep improving them.
00:51:25
Speaker
They do make a difference.
00:51:27
Speaker
Get back to the basics.
00:51:29
Speaker
Number three and final, focus on high quality evidence.
00:51:32
Speaker
So not all evidence is the same.
00:51:34
Speaker
I think that we should recognize that.
00:51:36
Speaker
We should question the evidence, but we should always be focusing on the highest level of evidence that we have available to make those decisions or have that inform us as we move forward.
00:51:47
Speaker
So we talked about COVID-19 and how judgment under these uncertain conditions has been expressed both in society, through the infodemic, but also in clinicians, how sometimes people really quickly get into either that preacher, that prosecutor, or that politician mode, but we really want to get
00:52:12
Speaker
to a scientist mode when we're making decisions that are important, either for our patients or for our programs.
00:52:18
Speaker
We talked about the cognitive process and how we have those two systems that are functioning all the time.
00:52:25
Speaker
And sometimes we need to give system two a little bit more room to interact with system one.
00:52:31
Speaker
We need to recognize that there's a lot of shortcuts we utilize that are helpful in navigating the world, but that can sometimes lead to cognitive biases as we make decisions that might be more relevant.
00:52:42
Speaker
We talked about how we can make better decisions by defining our problems a little bit better, in a broader context, by identifying what process and what outcome we're trying to get, and by identifying biases that are inherent to the decisions we're making and trying to mitigate them by thinking again, thinking outside of the box, taking the opposite view, questioning, creating doubt.
Mindset of Doubt and Rethinking
00:53:09
Speaker
And when you pair that with
00:53:11
Speaker
elevating usual care, and pair it with trying to use the best evidence, I think that we won't be able to make every decision perfectly, but I do believe that our decision making will be significantly better and we will be much closer to being scientists.
00:53:27
Speaker
So my take home message for today is to think again.
00:53:33
Speaker
You shouldn't be paralyzed by doubt, but you should always be rethinking what you learn, because that's the only way to really grow
00:53:40
Speaker
and become better decision makers and have better judgment.
00:53:44
Speaker
And finally, just want to thank everybody for their time, for all the work they've done at the bedside, and for what's been, I think, a very productive and wonderful week of learning from each other, sharing stories, trying to think again and rethink some of our positions so that we can create the greatest value for our patients and really make a difference for those patients under our care.
00:54:08
Speaker
So I'll stop there and see if there's any questions in the chat.
00:54:13
Speaker
And if not, I want to wish everybody a great weekend and look forward to engaging with everybody again soon.
00:54:23
Speaker
Thank you for listening to Critical Matters, a Sound podcast.
00:54:26
Speaker
Make sure to subscribe to Critical Matters on Apple or Google Podcasts and share with your network.
00:54:32
Speaker
Sound's transforming the way critical care is provided in hospitals across the country.
00:54:37
Speaker
To learn more, visit www.soundphysicians.com.