
Bringing Clarity to Psychiatric Chaos

Doorknob Comments

In today’s episode of Doorknob Comments, Grant and Fara sit down with Dr. Glenn Saxe to discuss the development and impact of Trauma Systems Therapy (TST), a model that addresses both emotional regulation in traumatized children and the complexities of their social environments. Dr. Saxe highlights how TST’s open-source approach has allowed practitioners worldwide to innovate and adapt the model for diverse settings, from child welfare to refugee care. The conversation explores the importance of honoring intergenerational wisdom while adapting to new challenges like cyberbullying and the digital age. They also touch on the promise of causal data science to advance psychiatric research and the need to embrace complexity rather than oversimplify mental health solutions. 

We hope you enjoy. 

Resources and Links

Doorknob Comments

https://www.doorknobcomments.com/

Dr. Glenn Saxe

https://med.nyu.edu/faculty/glenn-saxe

https://med.nyu.edu/departments-institutes/child-adolescent-psychiatry/trauma-systems-therapy-training-center 

Dr. Fara White

https://www.farawhitemd.com/

Dr. Grant Brenner

https://www.granthbrennermd.com/

https://www.linkedin.com/in/grant-h-brenner-md-dfapa/

If you like our podcast, please leave a review! Thanks in advance!

Transcript

Understanding Complex Psychiatric Conditions

00:00:00
Speaker
If a study aspires to explain everything about, say, schizophrenia or PTSD with three variables, or a treatment that targets only two of them claims that it can have very large effects, we kind of know on its face, if we understand the world: how could that be?

Introduction of Hosts and Guests

00:00:25
Speaker
Hello, I'm Dr. Fara White. And I'm Dr. Grant Brenner. We're psychiatrists and therapists in private practice in New York. We started this podcast in 2019 to draw attention to a phenomenon called the doorknob comment.
00:00:38
Speaker
Doorknob comments are important things we all say from time to time just as we're leaving the office, sometimes literally hand on the doorknob. Doorknob comments happen not only during therapy, but also in everyday life. The point is that sometimes we aren't sure how to express the deeply meaningful things we're feeling, thinking, and experiencing.
00:00:56
Speaker
Maybe we're afraid to bring certain things out into the open or are on the fence about wanting to discuss them. Sometimes we know we've got something we're unsure about sharing and are keeping it to ourselves.

Exploring Doorknob Comments

00:01:06
Speaker
Sometimes we surprise ourselves by what comes out.
00:01:10
Speaker
Welcome to Doorknob Comments. I'm Grant Brenner. I'm here with my co-host, Fara White, and our guest today is Glenn Saxe. Dr. Saxe is a renowned psychiatrist, researcher, and leader in the field of traumatic stress and child psychiatry.
00:01:23
Speaker
He's best known for his work on the developmental impact of trauma and the treatment of children, adolescents, and families affected by traumatic events. Dr. Saxe is the founding director

Dr. Saxe and Trauma Systems Therapy

00:01:32
Speaker
of the Center for Child Trauma and Resilience at NYU Langone Health and previously served as chair of the Department of Child and Adolescent Psychiatry at the NYU School of Medicine.
00:01:42
Speaker
He's known for trauma systems therapy and his work in computational psychiatry. Dr. Saxe, you have so many accomplishments. Maybe you could fill us in on some more of the highlights by way of introduction and welcome.
00:01:55
Speaker
Well, thank you. It's great to be here. Thanks for inviting me. I think you did it pretty well. I guess the only thing I'd say is I'm very proud to have developed trauma systems therapy, which is a treatment used in a lot of different places for kids and families under a lot of adversity. I'm also really happy to focus on computational psychiatry. I think it's really the future, especially an area of it called causal data science. And
00:02:28
Speaker
recently, with my team, we were funded by NICHD to establish a center on causal data science for kids and families affected by maltreatment, called the CHAMP Center.
00:02:42
Speaker
What does CHAMP stand for? It's the Center for Causal Data Science on Child and Adolescent Maltreatment Prevention.
00:02:53
Speaker
Thank you.

Trauma Therapy at Boston Medical Center

00:02:54
Speaker
So... what was, you know, the first moment? Because I am really curious about how trauma systems therapy came to be.
00:03:06
Speaker
And could you talk to us a little bit about what you saw in the field before you developed it, and how you see it being used? Yeah. So, you know, I approach just about everything as a clinician.
00:03:23
Speaker
You know, I had good clinical training. I see the world in its complexity and what families are facing. Trauma systems therapy began to be developed around 1999, 2000. I was chief of psychiatry at Boston Medical Center, which is like the Bellevue Hospital of Boston.
00:03:45
Speaker
And it was clear that for kids and families, it wasn't post-traumatic, it was peri-traumatic. There were, you know, all kinds of adversities. You'd be doing your best with evidence-based treatment in an office, and they'd go home and come back in worse shape. And so with my team, we wanted to figure out how to do it
00:04:12
Speaker
differently, how to address the social environment in a rigorous way. And that was the core of this. And then we've been on a ride since that time. Our first book was published in '05. And it's being used a lot. And I'm really happy that people have picked up some of its core ideas.
00:04:36
Speaker
What are some of the core ideas? Well, the core idea is that everything's related to what we call a trauma system, hence trauma systems therapy, which is about two things.
00:04:50
Speaker
One, a traumatized child who can't regulate their emotional states. Two, they live within a social environment that isn't sufficiently able to help them with that regulation.
00:05:04
Speaker
And it

Innovation and Intellectual Property in Therapy

00:05:05
Speaker
gets way more complex, but it's built on top of those two things. In essence, what trauma sets up is the situation where kids get triggered from all kinds of events in their social environment.
00:05:22
Speaker
They get horribly dysregulated. People don't know why they're getting dysregulated, but there are patterns to it. And so you work to help understand and manage what stimulates the dysregulation, and then you help build the child's internal capacity to regulate when they're stimulated like that. That's the essence; it gets way more complex. You know, part of it is you have to be set up to understand that every kid and family is different.
00:05:58
Speaker
What dysregulates them is very different. And so you have to be ready to assess and understand. Terrific. I mean, it sounds like this should be, and could be, used almost anywhere, at any developmental stage.
00:06:14
Speaker
Is there something specific that you had in mind when you were creating it? Well, I created it for an inner-city hospital in Boston, and I thought it would be a mental health model, that sort of thing.
00:06:29
Speaker
But what surprised me, actually, is people began picking it up. People in the child welfare system thought this could be useful, then the residential system, systems related to substance abuse, refugees. So it kept being picked up. And one thing that people told me I was crazy to do when we developed this, but it was the best thing I ever did with it:
00:07:00
Speaker
I didn't claim intellectual property at all. And what that meant is there were core ideas that people could innovate on top of based on their knowledge of the world.
00:07:14
Speaker
I was very influenced by a book by Eric von Hippel called Democratizing Innovation. I met with him as we were designing this. He studied innovation at MIT, and his idea was the open-source movement.
00:07:33
Speaker
You have platforms of innovation, but then you give lead users the ability to make it work for them. And so TST became a platform of innovation.
00:07:46
Speaker
I mean, I wasn't working in residential care. But people adapted it. The idea, then, is that in the community, people freely reveal their innovations, which then get picked up by people who have similar problems.
00:08:04
Speaker
You know, it was picked up by a program in rural Kansas. What do I know about this health system in rural Kansas? All of that. But part of it is, you know, if you read our second book, it's radically different from our first book. Why?
00:08:20
Speaker
Because so many innovations got subsumed into TST. And, you know, I think there's a real problem with intellectual property

Open Source Models in Therapy Development

00:08:32
Speaker
around these sorts of things.
00:08:34
Speaker
What ends up happening is you get a treatment stuck in time, frozen whenever you developed it, with no ability to change, improve, or adapt to the world based on users' needs.
00:08:50
Speaker
It's like an open-source model. And I can imagine, you know, I've been playing with some AI platforms where you can tell it what you want, like you want a role-playing bot or you want an app.
00:09:04
Speaker
And then you can give it parameters that guide the underlying AI engine. You can select from a number of different AI engines. And you can load it up with specific data, so you can load a textbook or a bunch of textbooks or whatnot. And then, you know, you can operationalize it. You don't need to know how to code because it does the coding for you.
00:09:26
Speaker
And so that is kind of the ultimate flexible model. Like, we could upload trauma systems therapy and, you know, then people can access it. But that's innovative, because a lot of times in academia and in commercial interests, people try to maintain intellectual property ownership. Yeah. And it prevents the greatest number of people from being helped. So,
00:09:50
Speaker
Yeah, I mean, isn't emergence beautiful? What emerges out of the complexity is something you would never have predicted. Right, like locking down ownership of the intellectual property prevents complex systems from evolving naturally, right, or emerging.
00:10:11
Speaker
That was our experience, for sure. And, you know, it's been quite a ride, but we have people all over the country and in different countries who are using it, making it work for them.
00:10:24
Speaker
We have a monthly innovation community meeting where different people share their ideas and all of that. It's become quite a platform, and part of it is recognizing the limitations of expertise.
00:10:40
Speaker
I know some things, but there's a hell of a lot I don't know. And why would I expect to know everything for what people need in all kinds of systems, in all kinds of places?
00:10:52
Speaker
I mean,

Intergenerational Trauma and Wisdom

00:10:53
Speaker
you know, there's an arrogance to that. And, you know, opening things up, allowing people to make it work for them creates unexpected great things.
00:11:06
Speaker
I think that's a wonderful point. And sometimes when I go to do research or look at something, or I'd like to learn a little bit more about one particular modality... I don't really do manualized treatments in my practice, but I have noted, and I guess I don't want to specify, that there are waiting lists for therapists to be trained in certain types of therapy.
00:11:41
Speaker
And I do think about how much need there is out there. And I have thought to myself, wow, I wonder, you know, how ethical that is. I don't know. I'm not a researcher. I'm not developing anything. I'm just implementing things as much as I can, you know, to help as many people as I can, which is kind of a drop in the bucket compared to, you know, these big centers. But that is something that I've come up against, and I've thought a lot about it, but I haven't read the book that was so influential to you.
00:12:12
Speaker
So I'll check it out for sure. Yeah. I mean, part of it is, in the field, if we understand that human beings are really complex and that the environments humans inhabit are awfully complex, we have to make sure that whatever we do is true to that complexity,
00:12:35
Speaker
or we miss enormous things. So if a study aspires to explain everything about, say, schizophrenia or PTSD with three variables, or a treatment that targets only two of them claims that it can have very large effects, we kind of know on its face, if we understand the world: how could that be?
00:13:03
Speaker
Well, I think complexity is tricky for people. Albert Szent-Györgyi, who was a Nobel laureate, wrote a kind of manifesto that was published in 1970 called The Crazy Ape.
00:13:16
Speaker
And he was really talking about nuclear war and politics. I know we were introduced by a colleague, Fred Stoddard, who was very involved in anti-nuclear work. And Szent-Györgyi talks about how we face this modern world with what he calls a caveman's brain, which is problematic.
00:13:37
Speaker
And so, you know, a lot of your work has been to develop specific ways of capturing complexity and also sifting through the noise to find like what is the signal, like what really works.
00:13:51
Speaker
Yeah. And that process you described, of kind of, let's test two medications and see what happens, or let's test one factor, that gets really tricky. And I know you've done a lot of work in causal modeling.
00:14:05
Speaker
But before we get into that, Fara and I were talking earlier, and she had talked about asking you about resilience, in light of your work with trauma systems therapy.
00:14:18
Speaker
And I'm thinking about intergenerational transmission of trauma, how that plays out over the generations, which, you know, in some ways is more complex, but in some ways for me is simplifying.
00:14:29
Speaker
When you see a pattern that just clearly marches from generation to generation, it kind of comes into focus much more clearly sometimes than looking at, you know, one generation. So actually what I wanted to ask you about was kind of the flip side of that.
00:14:44
Speaker
And I don't think this is a real concept. It's me making something up: intergenerational transmission of wisdom, or healing. And I'm wondering how you think about that, what that would mean to you.
00:14:57
Speaker
Now, that's a great question. I think, you know, I've thought of that in a completely different context than psychotherapy, all of that. But I think that's, you know, the idea of wisdom and what gets transmitted over the ages.
00:15:18
Speaker
And the trash heap of history is filled with wisdom that fell away. But then there's wisdom that stayed over hundreds and thousands of years.
00:15:32
Speaker
And I think we

Building Resilience in Children

00:15:34
Speaker
ignore that at our own peril. It doesn't mean that just because something came to us over a long period of time, it's necessarily good.
00:15:45
Speaker
But we have to take the fact that it's lasted as something that's deeply meaningful. Yeah, and I think that's a good point.
00:15:56
Speaker
And I do wonder, because things have changed so quickly in today's world. I don't know why it's coming to mind, but this idea of bullying.
00:16:09
Speaker
Right. And I don't know if people have been able to trend it. I know it's a major risk factor for a lot of psychiatric disorders: depression, anxiety.
00:16:26
Speaker
And I wonder if, with the explosion of the Internet and social media, we're finding that we almost need to translate the wisdom of previous generations into something new. And I wonder if people have taken trauma systems therapy and applied it to some of those principles. I don't know.
00:16:52
Speaker
Yeah, I mean, there are a few things I'm thinking about. Obviously, there's risk to bullying, and we want to help people with that.
00:17:04
Speaker
But then there's also the expectation that childhood may go on with little stress, and we should protect kids from stress.
00:17:15
Speaker
And if a kid is feeling intimidated by another kid, we have to protect them from that. But there's something about helping kids learn how to manage problems, how to manage discomfort, all of that. And that was known throughout the ages. You know, the wisdom we may be talking about is the wisdom of managing the crap of life, which
00:17:49
Speaker
we can't eliminate no matter how hard we try. But when we eliminate what kids need to deal with and figure out,
00:18:03
Speaker
we rob kids of certain enormously important resiliency capacities. There's a concept of post-traumatic growth, which tends to occur with moderate levels of post-traumatic stress.
00:18:17
Speaker
And bullying, by definition, crosses some line. It's kind of a systematic set of attacks, which might be contrasted with the way kids just are. Not all kids are nice. And if you're overprotective, right, like the helicopter parent problem or the age of anxiety, then kids...
00:18:38
Speaker
you know, don't develop those muscles.

Historical Lessons and Modern Challenges

00:18:40
Speaker
It's like if you sit home all day and stare at a computer screen and you never get outside and run around, you won't develop physical fitness. So it's a tricky question.
00:18:50
Speaker
However, I would say, in terms of the wisdom of the ages, I don't know that any prior age has done a better job dealing with people's inhumanity toward one another. And I think in a very significant way, and this is getting a little bit far afield, but maybe it speaks to current events.
00:19:08
Speaker
The ethical dictum "might makes right" is still one of the leading things that guides the world. If you're better or stronger in some area, then you're likely to prevail.
00:19:22
Speaker
Yeah, again, just because something may come to us from the ages doesn't necessarily mean it's good. So again, I'm not supporting that, obviously. There's been a lot of harm caused by that.
00:19:39
Speaker
It's just, you know, we also have a culture that tends to be ahistorical, that feels like everything true, everything I need to know, is about now, in my very, very small circle compared to the circle of all people over all time, you know, and what might be learned. And that's what I worry about.
00:20:10
Speaker
Yeah, things are kind of on the surface, but there's a greater breadth of information, which I think is part of the problem. Like, there's a way where people nowadays know a little about a lot. Yeah.
00:20:21
Speaker
And then they think they really understand, but you don't know the history. Like you said, you don't know the story. You don't know how you arrived at that point. You don't know the depth behind it. And I think the meaning gets thinned out as well.
00:20:33
Speaker
Yeah. Yeah, I mean, wisdom from culture, from religion, all of that, about how people work together. Again, not everything's good. That's obvious. But we kind of flippantly dismiss old-fashioned ideas at our own peril.

Causal Data Science in Psychiatry

00:20:59
Speaker
So how does causal data science rescue us from these problems? And maybe we can start with the problem you've talked about, of the limitations of the current psychiatric diagnostic system, which is, you know, hotly debated on social media.
00:21:21
Speaker
You've got pros, cons, and everyone in the middle fighting it out. Yeah. Yeah. So, I mean, the first thing is: what is a cause?
00:21:33
Speaker
A cause is something that, if I intervene on it, will change something else. If I want to intervene to change something else, and it's not a cause of that something else, I could do it for 100 years and it won't change. Okay.
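To make that operational definition concrete, here is a minimal sketch in Python. The toy variables are illustrative assumptions, not anything from the episode: X causes Y, while M is merely associated with Y (because X produces both), so intervening on M does nothing.

```python
# A minimal sketch of the interventional definition of a cause.
# X, M, and Y are hypothetical toy variables, not from Dr. Saxe's work.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def simulate(do_X=None, do_M=None):
    """Toy structural model: X -> Y and X -> M; M has no effect on Y.
    Passing do_X or do_M overrides that variable, mimicking an intervention."""
    X = rng.normal(size=n) if do_X is None else np.full(n, float(do_X))
    M = X + rng.normal(size=n) if do_M is None else np.full(n, float(do_M))
    Y = 2 * X + rng.normal(size=n)
    return X, M, Y

X, M, Y = simulate()
print(np.corrcoef(M, Y)[0, 1])     # ~0.63: M is statistically associated with Y

_, _, Y_do_M = simulate(do_M=5.0)  # intervene on the non-cause M...
print(Y.mean(), Y_do_M.mean())     # ...and Y's mean doesn't budge (~0 both times)

_, _, Y_do_X = simulate(do_X=1.0)  # intervene on the true cause X...
print(Y_do_X.mean())               # ...and Y shifts to ~2
```

You could run the intervention on M "for 100 years" and Y would never move; only intervening on the cause changes the outcome.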
00:21:50
Speaker
I just want to point out that that's a little bit different, maybe, from the commonplace definition of a cause, which is more like an antecedent, something that happened in the past that set the situation up.
00:22:03
Speaker
But it's not the thing that is keeping it going now, whereas if you intervene, it will change it. Well, there are many, many things antecedent to other things that are statistically associated with those other things, but they're not causal of them.
00:22:20
Speaker
And if I intervene on any of those things, they're not going to change what I want to see change. Yep, there's a joke in physics, like, you know, what is the source of human happiness, or something? And the answer is always the laws of physics and the conditions of the universe at the Big Bang.
00:22:40
Speaker
But you're saying, like, you have to look at what's happening in the here and now. Well, not necessarily. I mean, you have to intervene on factors in the here and now, but things before can be causal of things now.
00:22:54
Speaker
Things before can be causal of the things that are causal: proximal and distal causation, all of that. But the bottom line is:
00:23:05
Speaker
Is it a cause or not? And if it's a cause, how big an effect is it going to be? Because there may be causes that, if you change them, will only change things a little bit. But let's stick with cause or no cause.
00:23:20
Speaker
So again, you're talking about diagnoses in psychiatry.
00:23:30
Speaker
It's either a cause or it's not. If I intervene on a non-cause, it's not going to change anything. So then let's bring it into the clinic. Okay. How do I know how I might want to intervene with a patient?
00:23:47
Speaker
It's because I use their clinical data to categorize them into a category of risk and probable response to an intervention, if I believe they're at risk.
00:24:02
Speaker
So the only way I could know if they would probably be responsive to an intervention is if I knew that intervention targeted the cause. Okay? So classification has to be about risk and about probable response to intervention. That's how it works in medicine.
00:24:23
Speaker
Can you give an example of a cause and effect? Like, something which is a cause, and then if you target it, it gets better, just for listeners? You know, something like, say, a heart attack: myocardial muscle damage, which is related to coronary artery narrowing, that sort of thing, leads to all kinds of morbidity and mortality problems, congestive heart failure, arrhythmias, death, all of that.
00:24:55
Speaker
So, causal, all of it. So can I, you know, reduce the narrowing, increase coronary blood flow? Can I do surgery? All of those things. That's simply true.
00:25:10
Speaker
Diagnosis in medicine is looking at causes you can intervene on. What about in psychiatry? I know you've done some work on PTSD, or you could talk about depression. Is there anything you can identify there that would help illustrate the principle?
00:25:28
Speaker
Well, part of it is this. Again, let's go with the DSM. So if I have a diagnostic system based on symptoms, and I don't know about causes,
00:25:44
Speaker
then I'm using that to classify people into categories of risk and probable response to intervention. But if I do an intervention based on that category, by definition, especially if we believe the disorder is multi-causal, they're going to be mixed. You know, I don't know for how many of them that cause is relevant.
00:26:12
Speaker
So I intervene with whatever I intervene on, and it's only going to change anything if it happens to change a cause amongst a set of people in that big category of people.
00:26:29
Speaker
You know, most believe that any major mental disorder has many, many causes. So think of what that entails.
00:26:42
Speaker
It entails, let's say, major depression. We categorize many, many, many people in one category. We've just admitted that category is vastly multi-causal.
00:26:57
Speaker
So what can we expect? Do you think that's part of why depression treatment isn't as robust? That's exactly why it's not robust.
00:27:08
Speaker
Right. Even therapy, which has a pretty strong effect, is kind of trial and error. You try different things. Or you look at newer treatments like TMS,
00:27:19
Speaker
which stimulates the frontal cortex of the brain. And it doesn't have a 100% response rate, but you see patients having 60 to 80% remission.
00:27:30
Speaker
And it's not using data science, I think, the way you do, but it suggests that we may have identified a cause

Clinical Practice and Research Translation

00:27:40
Speaker
when you see a consistent positive response to the same intervention.
00:27:44
Speaker
And in that case, the cause would be something like low activity in some brain network. So, right. That's exactly the question that you want to answer. I mean...
00:27:56
Speaker
Your main question is: how can I identify the people who would be responsive to that intervention? So then use the best information you possibly can. If there's a pattern of brain imaging findings that completely distinguishes that group from everyone else, that should be your diagnostic criteria.
00:28:20
Speaker
That's what the diagnosis is for. That would be like a future state with TMS. They're doing that with PTSD. There are different patterns. With depression, it's not as clear, but that gets into the complexity that you're talking about.
00:28:35
Speaker
Fara, how do you think about psychiatric diagnosis? I just want to ask, as a clinician on the ground, how you think about it. Because I think, as part of what Glenn is saying, you really have some critiques of the way we think about it now, because of the amount of harm it does, leaving people without treatments that could be better, right?
00:29:04
Speaker
Well, think of any medical field that has had a ton of progress. Think of breast cancer, think of cardiac disease, all of that.
00:29:18
Speaker
Think of the diagnostic criteria for those disorders in, say, I don't know, off the top of my head, 1980, okay? Now think of the diagnostic criteria today.
00:29:32
Speaker
It's radically different. Why? Because the fields have put their most promising causal findings in their diagnostic criteria. They kept updating and updating and updating it, because the only way
00:29:50
Speaker
an intervention can change an outcome is if it targets a cause. Okay. Now look at major depression. Look at schizophrenia. 1980, with DSM-III, and today.
00:30:03
Speaker
You know, maybe you could find a little difference. Look at the tens of thousands of articles published on the etiology, risk factors, protective factors, all of that.
00:30:20
Speaker
Where are they? You know, if we're touting their treatment potential, if we're touting their promise for patient care, why have we not decided they're so promising that we'd use them to classify patients?
00:30:40
Speaker
You know, I think that's mind-boggling. I agree. I don't necessarily know what we can do to change it. I think that's why we're so interested in talking to you and hearing about what you're doing.
00:30:58
Speaker
I do feel really fortunate that I'm not beholden to the DSM, that it doesn't have to define how I work. But you're right, it would be lovely to have.
00:31:12
Speaker
And I think, Grant, you do a lot more of this with scales in your practice, but I see diagnosis, causality, and those types of things as really dynamic. Yeah, a good clinician sees them as very dynamic. First, you want to personalize your intervention to those particular causal factors that would be most relevant for a patient. So that's one thing.
00:31:45
Speaker
And then, you know, it's what clinical skill is all about. It's about, in the moment, looking at what's causing someone to do certain things. A patient missed your last appointment; now they're back.
00:32:02
Speaker
So you're worried that maybe they've been upset with you, all of that. So you're thinking: what might I say? Will that make them more likely to engage, less likely to engage? Well, you're predicting under intervention conditions, because whatever you might say is an intervention.
00:32:25
Speaker
And that is the definition of causality: predicting under intervention conditions. And engagement might be a causal factor in therapy. Like, the therapeutic alliance is one of the few things that's been identified as being very important for outcomes, and whether or not you can engage the person meaningfully is likely to be causal.
00:32:48
Speaker
Absolutely. Absolutely. So a good clinician, look, this is the nature of expertise. We have models in our heads. I know you interviewed Karl Friston not long ago, and it's how human beings work. Any human being, but especially an expert therapist, has models in their head that guide their actions.
00:33:13
Speaker
An interpretation is an action. Anything you say could be an action. Your body movement, language, all of that are actions that have consequences. And you are, moment by moment,
00:33:27
Speaker
appraising, often unconsciously, sometimes consciously, the effects of your actions. You're making predictions, all of that. So that's good clinical care, good clinical skill.
00:33:42
Speaker
The main question is, how could science help us? So let's look under the hood a little bit. I've read some of your papers and I've read a lot of Karl's work. And we talked to Mike Levin recently, the developmental and synthetic biologist at Tufts.
00:34:01
Speaker
And it's very hard to translate it. You know, I have a little bit of math and physics background, not anything like folks like you and Karl and Mike, but, you know, enough to kind of follow the reasoning.
00:34:16
Speaker
But the kind of techniques that you're using seem to me to be very powerful mathematical and statistical tools for actually laying bare what is causal.
00:34:30
Speaker
Yeah. And so I'm wondering if we could try to talk about that a little bit without being too mystifying. Okay, so let's just acknowledge a certain reality. The way we were trained to infer causes is, for all practical purposes, impossible. We can't really do etiological experiments.
00:34:56
Speaker
We don't have... you know, if we could do them, our scientific literature would be full of these sorts of experiments on causal factors.
00:35:08
Speaker
We can't do it. So we may then say, well, since we can't do it, we can't know causes, and we then shouldn't have a diagnostic classification system based on causes, because it's kind of impossible. Right. But let's say it's also harder in psychiatry, because, to take your example of a heart attack, you could study a heart attack in a mouse, right? And it's going to translate to people in a lot of ways. But it's much harder to figure out human experience, right? You can't run those experiments directly.
00:35:42
Speaker
Exactly. So, you know, it's 100% what you just said. We have less access not only to human etiological experiments, but the animal models of human psychopathology are way less approximate than in many medical fields. So the field has a problem.
00:36:09
Speaker
We have observational data. Pick up any journal, about a brain region, about a gene, about a molecule: it's observational.
00:36:23
Speaker
It's kind of like astronomy.
00:36:26
Speaker
Yeah, astronomy, yeah. And so part of it, then, is: if this is the data we have, what are we going to do? Because if we don't know causes, we're kind of at sea about how to intervene.
00:36:46
Speaker
So the first thing to think about is why
00:36:51
Speaker
our literature, which is mainly observational, is untrustworthy. Here's why: because of what's called confounding, which is a common cause. Let's say I study a brain region or a molecule with a strong statistical association to some mental health outcome.
00:37:15
Speaker
Okay? And I feel, oh, this is great, all of that. Well, if there happened to be a common cause, some other variable that was causal of both, you would observe a statistical association which is meaningless for intervention; it's confounded. That's what confounding is.
00:37:40
Speaker
Like they say, correlation is not causation. And it's because of this. So let's say the field believes that mental disorders are multi-causal.
00:37:52
Speaker
Well, what the field believes, then, is that any of those causes that weren't a factor in the study may be confounding the factor that was in the study.
00:38:06
Speaker
Okay. So if you have a brain region that you're studying with an outcome, and another brain region causes both its activity and the outcome, or a gene causes both its activity and the outcome, or a social factor causes both, you have confounding.
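Here is a minimal sketch of that confounding structure in Python, with an illustrative common cause Z standing in for, say, an unmeasured gene; the variables are hypothetical, not from any study discussed in the episode.

```python
# A minimal sketch of confounding: Z causes both B and Y, B does nothing to Y,
# yet B and Y are correlated. All variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

Z = rng.normal(size=n)        # common cause (the confounder), e.g. an unmeasured gene
B = Z + rng.normal(size=n)    # the "brain region" variable we happen to study
Y = Z + rng.normal(size=n)    # the outcome; also driven by Z, untouched by B

print(np.corrcoef(B, Y)[0, 1])          # ~0.5: a real association, not a causal one

# Conditioning on the confounder removes the association: regress Z out of both.
B_res = B - np.polyfit(Z, B, 1)[0] * Z
Y_res = Y - np.polyfit(Z, Y, 1)[0] * Z
print(np.corrcoef(B_res, Y_res)[0, 1])  # ~0: given Z, B tells us nothing about Y
```

Intervening on B in this toy world would change Y not at all, even though the observational correlation is strong.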

Causal Data Science Techniques

00:38:26
Speaker
To safeguard against confounding, with the hypothesis-driven statistical methods we use, you're going to have to make a hypothesis about which of those causes could plausibly affect both the factor I'm studying and the outcome, so I can control for it.
00:38:50
Speaker
But that's really impossible. And so we need to look at other methods, and that's how I began my journey with causal data science methods.
00:39:05
Speaker
Fara, so we're talking about using very advanced methodologies that Glenn is going to tell us a little bit about, but that we're not really going to understand.
00:39:16
Speaker
I see pictures in my head of Markov chains and Bayes' theorem. You know, listeners are going to have to do a little bit of work if they want to understand it more deeply. But fundamentally,
00:39:29
Speaker
by looking at complex data sets and using some of these techniques that Glenn is using in his lab, you can identify what are actually the causal factors and screen out the noise of the confounding factors, right?
00:39:46
Speaker
And not all of those things are going to be intervention targets, right? One might be, like, a genetic factor that, unless you use CRISPR or something, you probably can't change. But another might be an environmental factor or a biological factor you can modify.
00:40:01
Speaker
I kind of want to ask you, Fara, because you're very good at sleuthing out causality. You know, you're highly intuitive. You do it quite well without, I think, using any algorithms that we've written down.
00:40:15
Speaker
I'm just curious where you are with this now. My patients call me witchy.
00:40:22
Speaker
We're, you know, onto something there. Yeah. And mostly, I think, what I'm trying to do... even though I'm not technically an analyst, right? But I've studied a lot of analytic theory.
00:40:38
Speaker
I wouldn't call myself a psychopharmacologist, but I've studied a lot of psychopharm and I use it every day. But I'm just trying to understand as much of the patient's experience as I possibly can. And what I love about this field is that even my own personal experience, as a child, as a mother, whatever else, I feel informs a lot of that, and things that I had
00:41:12
Speaker
really never thought about. So looking at how to treat patients through a certain lens has illuminated a lot of parts of my personal life, too. Sorry, go ahead. Yeah. And, you know, as a good clinician, you have a theory in your mind. It could be, you know, from your background growing up, integrated with psychoanalytic theory or whatever models you're using. But you're testing your theory all the time.
00:41:44
Speaker
And the evidence of whether you're onto something or not is accurate predictions under intervention conditions. So if you're going to be in the position of learning, then, you know, when something happens that you don't expect, you're going to wonder why, and think about what you did, and then do it a little differently. And so a good clinician is continually updating their model of the world based on their experience
00:42:18
Speaker
with their patients, and that's how it works. I think that's the foundation of everything: good clinical care, and our science serves good clinical care. So what would add value to you from the science? When was the last time you read a scientific article that changed what you did in practice?
00:42:49
Speaker
There are thousands and thousands of them. That is the bar for the value of a science.
00:43:00
Speaker
Yeah, I'm more worried about sort of the opposite: these little articles come out, like, this medication was used in 100 people. And then someone tries prescribing that medication, you know, maybe it will help.
00:43:14
Speaker
But it's not really clear whether it translates at all. So, okay, can you dip down just for a couple of minutes into the world of math, and talk about how you actually study systems? I was very impressed by the study of police trainees, who gets PTSD, as an example. Yeah. So, you know, it's very technical when you really get into causal data science, but there are some basic principles
00:43:51
Speaker
that just about anyone can gather, and I'll try and explain them to you as best I can. So first: what makes us so confident about randomized experiments?
00:44:04
Speaker
What makes us so confident? You know, it's that confounds are equally distributed between the groups, so we don't have to worry about them. They're randomly distributed.
00:44:17
Speaker
Okay. So then, if we manipulate a factor, we know the dependency of the outcome on that factor can be inferred without worrying about confounding, which would bias our estimates of effect; that's technically the word here. It cancels out the noise, kind of.
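As a minimal sketch of why randomization works, here is the same kind of confounded system in Python, once with treatment assigned by severity and once by coin flip; the scenario and numbers are illustrative assumptions, not from the episode.

```python
# A minimal sketch: randomization balances confounders across arms.
# Z is a hypothetical confounder (say, baseline severity); the treatment
# truly has zero effect on the outcome Y.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

Z = rng.normal(size=n)
true_effect = 0.0

# Observational world: sicker people (high Z) are more likely to be treated,
# and Z also worsens Y, so the naive comparison is confounded.
T_obs = (Z + rng.normal(size=n)) > 0
Y_obs = Z + true_effect * T_obs + rng.normal(size=n)
print(Y_obs[T_obs].mean() - Y_obs[~T_obs].mean())   # ~1.13: a spurious "effect"

# Randomized world: coin-flip assignment cuts the Z -> treatment arrow,
# so Z is equally distributed between the arms.
T_rct = rng.random(n) < 0.5
Y_rct = Z + true_effect * T_rct + rng.normal(size=n)
print(Y_rct[T_rct].mean() - Y_rct[~T_rct].mean())   # ~0: the true (null) effect
```

The randomized comparison recovers the true effect because the confounder is, as he says, equally distributed between the groups.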
00:44:40
Speaker
Yeah. So if we are able to estimate the effect without worrying about confounding, it gives us knowledge similar to experiments.
00:44:53
Speaker
And that's what these methods do, through what's called conditional independence. Okay. We usually think of conditional dependence: you know, what conditions make one variable depend on another variable.
00:45:09
Speaker
Conditional independence is: identify a set of factors that, given an outcome, would render everything else independent of that outcome.
00:45:21
Speaker
Okay? That's what we do when we statistically control for variables. It's kind of like dissecting out what you want, right? Right. Well, that's exactly what it is, and that's what causal inference is.
00:45:37
Speaker
Okay. That's what causal inference is. We come up with methods, like experiments, that allow us to exclude everything else so we can really estimate the effect of one thing. So there are strong methods to
00:45:58
Speaker
infer statistical conditional independence, very strong methods. And basically, you mentioned the Markov boundary. In essence, the Markov boundary is the direct causes, the direct effects, and the direct causes of the direct effects of any outcome of interest. Think of any outcome you want: direct causes, direct effects. It's mathematically proven that if you know the Markov boundary, everything else is independent of the outcome, given it.
00:46:35
Speaker
So, without getting into a lot of detail about how these methods do it, and the assumptions, and certain assumptions are testable, all of that:
00:46:47
Speaker
these algorithms allow you to learn the Markov boundary. And what can you do once you know that? It includes the direct causes.
00:46:58
Speaker
So you can work your way up. What are the causes of a variable in the Markov boundary that's directly causal? Then you can get at indirect causes, working your way up.
00:47:13
Speaker
So you can build up the variables; you can understand the causal structure. And then, if you've ruled out confounding... and there are methods where you can detect signatures of latent confounding from unmeasured variables.
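For readers who want the Markov boundary concretely, here is a minimal sketch in Python of a toy linear chain D → C → Y → E, where S is a second parent of E; the Markov boundary of Y is then {C, E, S} (direct cause, direct effect, and direct cause of the direct effect), and conditioning on it renders the distal cause D independent of Y. The graph and names are illustrative assumptions, not the published police-trainee model.

```python
# A minimal sketch of a Markov boundary in a toy linear system.
# Hypothetical graph: D -> C -> Y -> E, and S -> E.
# Markov boundary of Y = {C (direct cause), E (direct effect), S ("spouse")}.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

D = rng.normal(size=n)          # distal cause of Y (two steps up)
C = D + rng.normal(size=n)      # direct cause of Y
Y = C + rng.normal(size=n)      # the outcome of interest
S = rng.normal(size=n)          # the other direct cause of E
E = Y + S + rng.normal(size=n)  # direct effect of Y

def residualize(v, conditioning):
    """Linear least-squares residuals of v given a list of conditioning variables."""
    X = np.column_stack([np.ones(n)] + conditioning)
    beta, *_ = np.linalg.lstsq(X, v, rcond=None)
    return v - X @ beta

print(np.corrcoef(D, Y)[0, 1])   # ~0.58: the distal cause D predicts Y on its own

mb = [C, E, S]                   # condition on the Markov boundary...
print(np.corrcoef(residualize(D, mb), residualize(Y, mb))[0, 1])  # ...~0: D drops out
```

Causal discovery algorithms run this logic in reverse: from data, they search for the smallest conditioning set that renders everything else independent of the outcome, and that set contains the direct causes.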

Testing Causal Assumptions in Clinical Trials

00:47:29
Speaker
So if you've ruled out confounding from anything you've measured and anything you haven't, then your estimate of effect is unbiased.
00:47:41
Speaker
And, you know, maybe that's the essence of what it is. Now, again, if people are skeptical of this, we should all be skeptical.
00:47:55
Speaker
We have to compare it to what we're using now. What we're using now are the studies all over our journals, which are observational, which use hypothesis-driven, conventional statistical methods. And there we have not ruled out confounding, far, far from it.
00:48:18
Speaker
Yeah, for people who are interested, we'll put some links in the show notes that talk about the Markov boundary. It's really, really interesting to get into the details of it, the way these causal chains are set up, and how you can look at all the different... it kind of looks like a network with lots of different nodes. And you identify which of those nodes are the key players in the causal sequences.
00:48:49
Speaker
And it relates to a lot of cool stuff, like agent-based modeling. And there are AIs based on this stuff as well. And then here's the missing link:
00:49:01
Speaker
testing this in reality. So this gives you intervention targets you can be way more confident about. So now you can conduct clinical trials on those.
00:49:14
Speaker
I mean, that's in fact what we're doing with my CHAMP Center grant. We're building causal models, making decision support tools on those models, which tell you how to intervene given clinical data.
00:49:27
Speaker
And then doing clinical trials on that. So that's how you test it in the world, to see. And maybe things don't work out the way your model told you. Well, you can go back and try to figure out what was wrong. Maybe the measurement was wrong. Maybe your data set was wrong. All of that. But now you're in the world of science, actually.

Impact of Research on Real-World Interventions

00:49:52
Speaker
Right. And, you know, that's how it should be. So, in our literature now, what factors, from our correlations, are we most confident about taking into the world and testing with intervention? Can I ask you about the police trainee study quickly? Yeah.
00:50:19
Speaker
Okay, correct me if I'm mangling this really badly, but you, you know, looked at a lot of different factors and used data science, computational modeling, to look at what causes PTSD among police trainees.
00:50:33
Speaker
And there were two genetic risk factors related to stress genes. There was the intensity of their startle response, I believe. There was the presence of peritraumatic dissociation, which is...
00:50:47
Speaker
also one of the risk factors for the development of PTSD in observational studies. And there was one related to, I think, training parameters. Was that the fifth factor? Yeah, that's part of it. Yeah.
00:51:01
Speaker
Okay. So what I wanted to ask you about, I've been dying to ask you about this, is the trauma wasn't one of the causal factors. Yep. So here's a way of thinking of it.
00:51:13
Speaker
First of all, the model we published, you know, rules out all confounding. Okay, so you might have said, well, the trauma, but you may have also noticed that there were many other variables that were not in it.
00:51:30
Speaker
Many of them have been, you know, published in the literature, but again, they were confounded by other variables. The other thing is, causal chains can be more proximal or distal.
00:51:44
Speaker
We only looked, I think it was two steps up. So you have PTSD, you have the direct causes, that's the unmediated causes, and then you have the causes of those causes, right?
00:51:57
Speaker
So we only looked two levels up. So there were also many variables on more distal causal chains, which have very small effects. I mean, basically, through Markovian processes, if you know the more proximal one, then it renders everything else conditionally independent of the outcome, that sort of thing. So it's part of that. So, you know, trauma may have been more distal, but its effect is related to what it sets up more proximally.
00:52:36
Speaker
That makes sense. The other things depend on it. It makes sense clinically because, you know, a lot of people experience trauma, but a very small percentage of people actually get PTSD.
00:52:48
Speaker
Yeah. And so, let's say you knew trauma affected certain patterns of emotional dysregulation. So if you intervene on the emotional dysregulation, given the trauma,
00:53:05
Speaker
you know, that's what's needed. So that's a direct cause, all of that. So part of it is, this allows you to take an enormously complex world and begin to make sense of it. It's really about understanding the process that generated the data.
00:53:28
Speaker
That's what this is. It's about understanding whether your model corresponds to reality, because reality is causal. It's how one thing affects another.
00:53:40
Speaker
So we're trying to approximate reality with this. Right, and the DSM is not a great model. Good. We're coming to the end. Fara, any thoughts? It's not even a poor model.
00:53:55
Speaker
It's a non-model. Well, it's what they call a polythetic model, which is a model composed of lots of overlapping classifications. So it's extremely fuzzy and idiosyncratic.

Conclusion and Contact Information

00:54:09
Speaker
I think this has been terrific. Thank you so much for being here today. And how can people learn more about you and what you're doing?
00:54:23
Speaker
You know, I'm at NYU Langone Medical Center. People can look for the CHAMP Center or the Trauma Systems Therapy Training Center.
00:54:35
Speaker
Perhaps we could put a link in this for that. But I love talking about this, as you could probably tell. And I'd love to discuss it with people who might reach out. Yeah, I didn't get to ask you about slime molds, but maybe another time.
00:54:53
Speaker
Okay, that's for sure. Thanks for joining us. Okay, thank you for inviting me. Bye. Bye-bye. Remember, the Doorknob Comments podcast is not medical advice.
00:55:04
Speaker
If you are in need of professional assistance, please seek consultation without delay.