
S8 E1: Cory Clark, Adversarial collaboration and rebuilding trust in academia

S8 E1 · The Free Mind Podcast

Cory Clark is Executive Director of the Adversarial Collaboration Project at the University of Pennsylvania, where she is also Visiting Scholar in the Wharton School and the School of Arts and Sciences. The Adversarial Collaboration Project brings together scholars who have contrasting views on important scientific questions to work out their differences through rigorous collaborations. It is based on the idea that viewpoint diversity produces better science. We discuss this project, as well as Dr. Clark’s other work on trust in academia, nuances in gender bias, and more.

Transcript

Introduction to Free Mind Podcast

00:00:04
Speaker
Welcome back to the Free Mind podcast, where we explore topics in Western history, politics, philosophy, literature, and current events with a laser focus on seeking the truth and an adventurous disregard for ideological and academic fashions.

Adversarial Collaboration Project Introduction

00:00:19
Speaker
I'm Matt Burgess, an assistant professor of environmental studies and faculty fellow of the Benson Center for the Study of Western Civilization at the University of Colorado Boulder.
00:00:29
Speaker
My guest today is Cory Clark. Cory Clark is executive director of the Adversarial Collaboration Project at the University of Pennsylvania, where she is also a visiting scholar in the Wharton School and the School of Arts and Sciences. The Adversarial Collaboration Project brings together scholars who have contrasting views on important scientific questions to work out their differences through rigorous collaborations. It is based on the idea that viewpoint diversity produces better science, an idea we strongly believe in at the Benson Center.
00:00:58
Speaker
We discussed this project, as well as Dr. Clark's other work on trust in academia, nuances in gender bias, and more.

Role and Impact of Adversarial Collaboration

00:01:13
Speaker
Cory Clark, welcome to the Free Mind podcast. Thanks for having me. So you are the executive director of what I think is one of the coolest projects in academia right now, Penn's Adversarial Collaboration Project.
00:01:27
Speaker
So tell me, what is adversarial collaboration? Why is it necessary? And how is your project facilitating it?
00:01:36
Speaker
Yeah, so adversarial collaboration is essentially the idea that when scholars disagree with one another, they should get together and figure out what's going on, like figure out the nature of the disagreement. Are they actually disagreeing? And if so, how could they test which of their claims is closer to the truth, or maybe they're both a little bit right in different contexts. The term itself was coined by Danny Kahneman, so that's not my
00:01:59
Speaker
my term. But with Phil Tetlock, we're essentially trying to make this be an incredibly normal and typical part of science where
00:02:12
Speaker
When one scientist realizes they're publishing work that's contradicting another scientist's work, the first thing they should do is get together and work to figure out what's going on with their disagreement rather than just pumping contradictory information out into the scientific literature for years or decades, which is what currently happens.
00:02:33
Speaker
I think this has the potential to expedite scientific progress, because right now scholars just create tons of information on both sides of controversies and debates that contradicts one another and confuses people about what is true. And how can we design effective policy if we don't know what's true?
00:02:53
Speaker
If scholars can figure out why they're having this disagreement and what is really the truth between them, we can make scientific progress faster. And then my hope is also just that
00:03:04
Speaker
It can sort of change the culture of science, which right now is very competitive and people have a lot of enemies. And if you publish something and someone else contradicts what you publish, you hate that person. They hate you. And you create these little factions of scholars over time that dislike each other because they disagree empirically. I think if people could get together and become collaborators, it potentially could turn
00:03:29
Speaker
you know, enemy scientists into sparring partners who help us improve our science, rather than enemies that we're trying to destroy, or that are trying to destroy us so I have to destroy them first. So that's my hope. We're supporting adversarial collaborations ourselves, and I'm participating in a bunch of them myself.
00:03:51
Speaker
I've also written several articles and chapters and popular press pieces on why we need adversarial collaborations and trying to persuade scientists that it won't end their career if they participate in one of these. So yeah, making the persuasive case and then also trying to lead by example, I guess is how we're tackling that one. Is there an example that you facilitated through your center that
00:04:22
Speaker
is your favorite and that you're able to share, obviously. And maybe there are some projects in early stages that you can't talk about, but what's one that you're really excited about?

Examples and Challenges in Adversarial Collaboration

00:04:30
Speaker
Yeah. So one thing is they sort of exist on a continuum. Like some of them are much more like clearly adversarial than other ones. The ones that are less adversarial are a little bit easier. Um, but I don't, I wouldn't say I have a favorite. I have two that are,
00:04:46
Speaker
hopefully getting close to publishing. And in both cases, they really kind of showed that both sides are a little bit right and a little bit wrong, which I think is the way these are going to tend to go. One is looking at, for example, the "rigidity of the right" idea, the idea that conservatives are more cognitively rigid than
00:05:05
Speaker
you know, people on the left. But that one's been fun, because everyone's been really chill and mellow and getting along well. And people will even argue a little bit against their own side when we're having debates over email. So yeah, that's been a fun one, and hopefully it'll be... we'll see, it's going to go back under review soon. Nice. What's the other one? The other one is one where we're looking at accuracy nudges and
00:05:36
Speaker
whether they're effective across the political spectrum. And that one, too, hopefully will be published soonish; we'll see. What's an accuracy nudge, just for our listeners who don't know? Well, there are a bunch of different ones that are used in these studies, but it's essentially, are there ways we can get people to care about the accuracy of information they're sharing online, as a way to potentially deal with misinformation and people sharing misinformation? So would Twitter
00:06:05
Speaker
saying, are you sure you want to post this without reading it, would that be an example of an accuracy nudge? Yeah, I don't think that one was included, but that probably would be one. Yeah. Okay. Cool. I don't know if we know whether those are effective or not at stopping people from retweeting things they haven't read. It's interesting that you say that most of these collaborations tend to find that there's truth on both sides.
00:06:29
Speaker
That doesn't surprise me at all, right? A lot of scholarly debates are about emphasis as much as they're about facts. In fact, a lot of political debates are about emphasis as much as they're about facts. That's exactly what we see. It's, you know, okay, we all agree that this effect is there, that it's this size, but how important is it? How big of a deal is it? Is this worth studying more? Is this worth designing interventions to combat?
00:06:57
Speaker
I don't want to say it's all that, but a lot of it is not actually even a disagreement over the facts. It's a disagreement over the value of the fact. Yeah. Well, or on the emphasis. I mean, one of my favorite examples of an adversarial collaboration that predated your center
00:07:17
Speaker
was, I think, when Jonathan Haidt convened a group through Brookings and AEI to write a consensus report on poverty, and they brought liberals and conservatives together. And as I remember, the story was that the liberals convinced the conservatives that racism and racial inequality were important, and the conservatives convinced the liberals that family structure and fatherlessness were important.
00:07:44
Speaker
So again, totally an example of an emphasis thing, right? One of the issues
00:07:51
Speaker
that I'm really interested in right now is this debate about whether economic growth can be sustainable. On one side are people who say it's not biophysically possible to have growth forever. And on the other side are people who say that growth has been directly responsible for our material progress, which has been incredible, and indirectly responsible in some way for our moral progress, which has also been incredible. And those aren't mutually exclusive. Let's talk about that. Maybe I'll send you a proposal about that someday.
00:08:21
Speaker
I want to ask you about, to the extent that you can share, trying to recruit people into these adversarial collaborations and what incentives might be at play. Because I can imagine that there are some disincentives, right? For example,
00:08:37
Speaker
You're worried that your idea is wrong, right? Especially in some of these morally charged fields. It's not just that scientists are worried about their intellectual legacy not holding up, right? In many cases, they're also worried about their deep moral convictions being undermined. But then there's also an element of reputation and legacy. And it actually seems like there's a famous example
00:08:59
Speaker
of an adversarial collaboration in fisheries, which I won't bore you with the details of, but in a nutshell, it was really beneficial to the reputations of both sides to do the adversarial collaboration, because they had been fighting in the press, and the paper that they wrote together, which was published in Science, became maybe the canonical fisheries paper, and they both
00:09:25
Speaker
you know, it was a large team of authors, but there were two scientists in particular who were leading it and who had been adversaries, and both gained enormously reputationally. So how have you... In that one, though, was the outcome one of those kinds where they're both kind of right? Like, it wasn't that one of them was devastated and the other one was victorious? It was a little bit in between. So it was definitely that one side was kind of
00:09:53
Speaker
more right than the other side. But then there was also kind of an emphasis thing where, I guess it's hard without the specifics, but basically one of the papers that sparked the debate
00:10:11
Speaker
made a claim that held up about the fact that fisheries were having damaging effects on ecosystem services and different domains of the ocean. And then it made a claim kind of towards the end that fisheries were on track to all be collapsed by 2050. So the second part of that didn't hold up. And the second part of that was central to the part of the debate that was hot.
00:10:39
Speaker
Yeah. But the first part of that is important and did hold up. So I guess it's kind of in between; maybe it is somewhat like your examples. Anyway, tell me about your experience trying to get people involved in these things. So I've been fairly lucky so far in terms of recruiting people, but that might be because of the people that I've been recruiting, you know, people that I know and people that I personally think would be capable of participating in one of these, and that I think, you know,
00:11:10
Speaker
at least try to prioritize the truth over just advancing their own careers or a particular political agenda. I do think that these would be really hard for some people, especially some people who have really tied their reputation to a particular empirical claim, who have been working on that claim for, say, 20 or 30 years, and that's sort of why they're famous and why people respect them.
00:11:34
Speaker
Those people I wouldn't, for the most part, expect to be willing to participate in one of these, because, as you said, the risks can be way too high if it's a case where literally their entire career could be proven wrong.
00:11:49
Speaker
I don't think scholars would want to participate in that kind of exercise, but this is why I think, you know, it would be better if we can change the cultural norm toward adversarial collaborations: get in the young people, the people who still have
00:12:07
Speaker
careers ahead of them, where they can find something new or do something different. If I'm disagreeing with a scholar when I'm an early-career researcher and I can get that sorted out now, that actually probably will help my career, because I'm going to be doing better science. I'm going to... Right.
00:12:24
Speaker
not waste my time on a dead end if I am wrong. But if you're already at the end of a dead end, then I don't know. I'm not saying nobody would change their mind at that point, but I think most scholars would be really reluctant to do so. The other thing is, these really are more time-consuming, and that's a problem, because scientists want to publish as much as fast as possible.
00:12:53
Speaker
And these are not helpful for that. So it's like you're writing the paper with reviewer two, right? Exactly. It's not even just writing the paper, but making every single decision with reviewer two. And I think it's been true that we do end up developing higher-quality methods
00:13:13
Speaker
because of that process, where each side is sort of vetoing the other side's ideas: no, this is why that's biased; this is why that's a bias. You get rid of all of the things that seem skewed in one direction or another and come down to, okay, this is the only thing we can agree on that is a fair way to test this question. So you're getting better methods that way, but it's slower. I mean, even negotiating words that are included in the abstract. In general,
00:13:41
Speaker
I've almost never had that happen in a traditional collaboration, where everybody wants to frame things the exact same way. Whereas in adversarial collaborations, so far I've seen it's really quite normal for people to debate over, can we include this word here? Or can we change this word? So yeah, that's a problem. I'm trying to incentivize them:
00:14:06
Speaker
I've been providing funding to people. I've been providing my assistance

Reputation and Integrity in Scientific Collaborations

00:14:09
Speaker
to people. I do think that there's a reputational benefit to participating in them, because it creates the impression that you're a scholar who cares more about the truth than just protecting your career or pushing a political agenda. So hopefully that balances things out a little bit. And then my other hope is that
00:14:31
Speaker
if it is true that these produce better science and the scientific community comes to value them, then they'll also be evaluated more favorably in the peer review process. So, I don't know if you've ever conducted a meta-analysis, but it's a huge pain in the ass. It's very time consuming, and you might get
00:14:48
Speaker
one paper out of that much more laborious process. So I'm thinking, people are willing to do meta-analyses, and the reason they're willing to do them is because they get published in higher-tier journals. That's right. If the same is true of adversarial collaborations, then that should incentivize people too. And I hope it is true. And I think it might be true. Is that your hypothesis? That they probably produce better science?
00:15:16
Speaker
Well, I'm sure they produce better science. But, you know, much ink has been spilled on the various ways in which the incentives of the academy can be flawed. And yet I still hold out enough, I think, reasonable hope, you know, not kind of Pollyannaish hope, but accurate-prior hope, that there's enough value in adversarial collaborations
00:15:46
Speaker
that they do probably get published, and probably disproportionately in high-impact journals. And also, Reviewer 2 is probably friends with somebody on the other team, right? Yeah. Yeah. I've only been doing this for, how long, two and a half years, maybe? So I've only had a few papers
00:16:13
Speaker
that I've been working on start their way through the peer review process. But I've only seen positive things said about the fact that they're adversarial collaborations. I haven't seen anyone say, oh, we can't trust this, it's an adversarial collaboration. So I do think they're likely to be viewed more favorably. There could be a criticism, and this is one thing I've thought about:
00:16:34
Speaker
let's say the scientific community really does come to value these. Then it will incentivize scholars to pretend to be adversaries when they're really not, and just take on a project and be like, I predicted X and they predicted Y, look, we did an adversarial collaboration. But really, you know, they were just... I don't know if that's really a big deal if that did end up happening, but
00:16:55
Speaker
Is it possible that that even would be better than the status quo, right? Cause it would make people steelman, even if they don't genuinely believe
00:17:04
Speaker
the other side? Yeah. Even flipping a coin and randomly assigning each person, okay, you have to defend X and you have to defend not-X, go. I do think they probably would end up with fairer methods than what most scientists currently do, which is: how can I prove the thing true? And what methods can I design to prove this thing true that I already think is true?
00:17:30
Speaker
So yeah, I do think that probably still would be better, even if it was, you know, a little bit of a misleading frame for what they're doing. Here's another thing that I wonder if you've thought or worried about. People who are paying attention to these issues currently worry about concessions that are made in writing the paper, compared to what the authors actually think, either to pitch it for the journal,
00:17:56
Speaker
or to, you know, negotiate among co-authors, or, you know, serve some policy interest. I'm thinking about, you know, we did an episode earlier this semester with Roger Pielke Jr., who's written a lot about this in the context of COVID origins. Do you ever worry that the adversarial collaboration will work so well sociologically that, in a case where one side really is right on the facts, they kind of water that down to give the other side some dignity?
00:18:27
Speaker
That's a good question and yeah, I bet that
00:18:32
Speaker
I can definitely see how that would happen. Cause let's say you are doing an adversarial collaboration and you're actually getting along reasonably well, and you get the results back. And at this point now you're friends with your collaborator: they predicted X, you predicted Y, you see Y. Then you probably do have some sympathy for the other person, and you might be like, okay, well, we can say this thing. So that probably could happen. Although I guess, as you're
00:18:59
Speaker
kind of pointing out, I don't know how different that would be from what goes on now, cause certainly, right, I've had to say things in papers that I don't fully agree with, or frame things in ways that I wouldn't frame them, because of something the editor or a reviewer said. Right, so yeah. Yeah, that totally makes sense.
00:19:21
Speaker
So one last thing on this general topic of adversarial collaboration is funding. It seems like the incentives are somewhat there, but not entirely there, for academics to participate in this. And we talked about the incentives for and against. It seems like if funders got on board with this as a gold standard, and journals got on board with this as a gold standard,
00:19:47
Speaker
that would move the needle enormously. I mean, I'm thinking about how many NSF grants, you know, say they're going to do workshops in high schools now, right? Because of the broader impacts thing, right? And there's lots of instances where that's a great thing, but it seems like when funders decide collectively that they want a certain thing, academics usually do fall in line. Yeah. I think that would be huge. If a lot of the funders, like, we can look at government funding agencies, like,
00:20:17
Speaker
NSF or NIH or any of those that really are trying to solve specific problems and really therefore do care about the quality of the science that comes out of those projects. They should be more likely to support an adversarial collaboration than to support a scientist who they already know has a particular agenda and wants to find a particular outcome.
00:20:39
Speaker
Same goes for really any of the nonprofits that have particular problems that they're interested in solving. If their success at achieving their mission depends on having accurate information, then it would be in their interests to fund adversarial collaborations too. I think potentially one reason you don't see
00:21:04
Speaker
that explicitly mentioned in many calls for funding proposals is just because they're so rare and so few scholars actually do them. It might also deter some people from applying in the first place, so that's maybe a problem too. But if it was just something like, you know, we look favorably upon adversarial collaborations, that would nudge scientists to consider that approach and would help people adopt
00:21:28
Speaker
the model of adversarial collaboration, I think, quicker than what would happen if it's just me still trying to persuade people. Yeah. So let's talk about a specific adversarial collaboration that you didn't facilitate, but that is related to some other things that you've written about.

Gender Bias in Academia - Evolution and Impact

00:21:45
Speaker
So there's a paper that came out recently by Ceci, Williams, and Kahn, which was an adversarial collaboration on gender bias in academia.
00:21:54
Speaker
As I understand it, the gist of what they found was that women are disadvantaged compared to men on average in teaching evaluations and salaries. Men are disadvantaged compared to women in hiring, and it's more or less even elsewhere. First of all, is that a correct representation of that study?
00:22:18
Speaker
I haven't read it for a few months, since whenever it came out, but that sounds right to me. I think generally what they found was, so they looked across six domains, I believe. And whereas a lot of people would say, you know, academia just discriminates against women pretty much across the board, and that's a really popular finding to publish,
00:22:41
Speaker
the picture was much more nuanced, and the differences were much smaller. For example, I think there was a salary difference, but it was fairly small, especially in comparison to the numbers that people often throw around. And then same with the teaching evaluations. Those are complicated because, you know, if you're actually looking at real teaching evaluations, then there's potentially a real reason why there would be differences there. So you really want to be focusing on these
00:23:12
Speaker
experimentally manipulated like online teaching classes where the students think their teacher's one thing and not the other. So, you know, I think the findings were essentially just a little bit more complex and suggest that we can't just be making these broad claims that academia discriminates against women because it depends a lot on what kind of study you're looking at, where you're looking, when you're looking, and which domains we're talking about.
00:23:37
Speaker
So there was another big study that came out that was not just looking at academia. It was looking across all domains and found that several decades ago, there was a pro-male bias in male-dominated fields and a pro-female bias in hiring in female-dominated fields. And then the pro-male bias in male-dominated fields shrunk and reversed in around 2010.
00:24:02
Speaker
and has been maybe a small pro-female bias since then. And also, I'm a low-level coauthor on that one. And actually, the forecasting tournament we did in that one, which was asking scientists and everyday people to predict the results of the meta-analysis, kind of was born out of the Adversarial Collaboration Project, because we had Greg Mitchell, who was working with us on some other projects, kind of consulting on that part of
00:24:33
Speaker
that project. So it's not actually an adversarial collaboration, but it kind of was conceived that way. So there are three follow-up questions I want to ask you about the findings of these two studies. The first is, it strikes me that there have been
00:24:48
Speaker
huge changes in how academia focuses on biases, identity biases. So there's stories coming out about departments and universities basically exclusively doing diversity hiring after 2020. I've also heard lots of really bad stories about the sorts of bias that people think is still pervasive from 30 years ago and occasionally from more recently than that.
00:25:17
Speaker
So when you're doing this kind of a meta-analysis, obviously in the second paper you were explicitly looking at how things have changed over time. But if you've potentially had huge changes in the last five years, how do you think about getting to the bottom of that?
00:25:32
Speaker
What do you mean? Well, so for example, suppose that you had, you know, a bunch of studies from 2010 that said there's lots of subconscious bias that affects various groups. Right. And yet, you know, post-2020, there's probably been a lot of conscious bias going the other way. Yeah. How do you incorporate that into a meta-analysis? Yeah. So I think any time you're looking at
00:26:00
Speaker
something that wouldn't be a stable fact over time, and it's something that's a cultural issue, like something that you would expect to change as culture changes, then you would want to look at the impact of time on your outcome. And so in the case of the one on hiring bias, that's why they looked at the issue over time. But a bigger problem, not a bigger problem, but another problem is that
00:26:28
Speaker
there's a lot more science now than there was 20, 30, 40, 50 years ago, and a lot better science now than there was 20, 30, 40, 50 years ago. So it's also kind of hard to make those comparisons because you're comparing, you know, 20, 30, 50 studies that have been conducted in the past five years to three that were conducted in the seventies and were really crappy and had small sample sizes. So those things are hard to look at. I do think people often forget,
00:26:55
Speaker
that anytime you're looking at a scientific finding that is really just a potential cultural moment, to keep in mind that a study from 2010, if it's a statement about culture in 2010, is really not all that useful anymore. And when we're having these debates, we need to be thinking about, what does the latest science show? And if we don't know of any science that shows anything in the past three years, then someone needs to redo these studies. Right.
00:27:24
Speaker
which is what happened with the hiring bias. You can find plenty of studies that show discrimination against women. Those studies are out there. But the question is, how old are they? What domains are they in? And then also, you know, what has happened as a result of knowledge of those studies? So if the whole world thinks we're discriminating against women,
00:27:50
Speaker
they might start discriminating against men to balance things out, and then kind of forget that, over time, that's going to change the reality of discrimination. And then we find ourselves in the situation of, oh, we're slightly discriminating against men. Do we want to now flip things again to make up for the fact we're discriminating against men, or can we just get rid of gender as a criterion when we're deciding who to hire
00:28:14
Speaker
and treat people as people? What a shocking concept. It's a radical idea. Okay, my second follow-up question. I think you found in a study that you were a co-author on that experts, domain experts, were especially bad at anticipating your results. Is that right? So we found that behavioral scientists were slightly... oh, I should say both everyday people and behavioral scientists
00:28:41
Speaker
radically overestimated gender bias, and in both directions. For stereotypically male jobs, they thought bias in favor of men would be much larger: I think academics guessed over five times and everyday people over 13 times, whereas the effect size was something like 1.4 times, I think, as high. And similarly, for stereotypically female jobs,
00:29:07
Speaker
they drastically overestimated how much females are favored. So these actual gender biases are quite small. And in fact, they were always much smaller than scientists today thought they were.
00:29:20
Speaker
To an economist, that's not surprising, right? Because bias has a cost, right? Lots of people have written about this. Thomas Sowell wrote about this in his recent book. You know, if you're in a field that has a lot of direct
00:29:41
Speaker
reward feedback on merit, which I think some people could argue that academia could use more of, right, that creates a strong economic disincentive against any kind of discrimination, because any kind of discrimination that comes at a cost to merit has a cost to your bottom line, right? And so any field that's getting substantive material feedback on merit should have a strong disincentive against discrimination.
00:30:08
Speaker
But the question then is, why are behavioral scientists and everyday people so wrong about that? Why do they think things are so much more unfair than they really are? I don't know. I want to put part of this as a question to you, because it touches on some other stuff you've written about.
00:30:27
Speaker
My guess is that part of it, in terms of overestimating all types of bias, just has to do with the fact that people overestimate all types of bad stuff in general, right? Yeah, that's true. One of my favorite studies is the one that shows that people think their country is in bad shape, but on average, they think their neighborhood is in great shape. And it's like, if people weren't biased, those two would have to average out to be the same, right? Right. And I imagine you see some of that
00:30:57
Speaker
among academics. It's like, well, you know, academia is super biased; our department isn't so bad, but academia is horrible, right? But okay, besides, in science too, scientists are so incentivized to exaggerate the importance and significance of their findings. So
00:31:17
Speaker
anytime a scientist finds a teeny tiny bias against women, they're like, proof of sexism. And then we just kind of interpret it as, this is a really big deal, not, this is a teeny tiny effect size. So one other explanation for half of that misperception finding, one that's related to something you've written about before, which I think is interesting, is the idea that
00:31:42
Speaker
we might paradoxically overestimate biases against women because of traditional gender frames that frame women as unagentic and more easily victimized, and maybe also more in need of empathy and help when they're victimized, right? I think you've written about this with Bo Winegard. Tanya Reynolds has done some studies on this, I believe. Is that part of the story too? And can you elaborate on that?
00:32:12
Speaker
Yeah, so over the past few decades there have been quite a few studies. Even Alice Eagly, who I think is kind of sympathetic to the, you know, potential biases against women, is famous for the "women are wonderful" effect, which is just the idea that basically people
00:32:33
Speaker
think women are better than men and like women more than men. And this has been replicated across tons of different contexts. I think what you're talking about is what I have looked at, and also Steve Stewart-Williams has looked at similar things, where he has found, and I have found, that people prefer scientific findings that portray women positively, and more positively than men,
00:32:55
Speaker
relative to findings that portray men more positively than women. So they really want science to show that women are better than men, and they do not want science to show that men are better than women. And people kind of have a kind of pro-female bias where they care more about women, they care more about harm to women, they care less about harm to men, and women also get punished less harshly for the same kinds of things.
00:33:19
Speaker
So you have all these findings that point to a pro-female bias. And sometimes they're even framed as sexist against women. Like there was this one that found, I'm forgetting the names of the two authors of this paper, but they found that
00:33:37
Speaker
When a person finds out that a piece of writing was written by a woman, they increase their estimates of the quality of the essay or whatever it was, whatever the piece of writing was, and they frame this as like sugar-coated feedback to women. So this was harming women because we're giving women really positive evaluations.
00:33:56
Speaker
And Steve Stewart-Williams and some others have argued that this might be a sort of evolutionary thing, where we care more about the wellbeing of women and women's survival matters more. Male disposability, cheap sperm, expensive eggs, that kind of thing. Exactly. And I don't think that's wrong. I think that's probably part of it. And that probably explains things like why we have probably always cared more about harm to women and valued their lives more. You know, we send men
00:34:26
Speaker
to war and women stay home and the men protect the women. I think it explains stuff like that. I don't know that it explains these findings where people like science that shows women are better than men compared to science that shows men are better than women because I just don't think we would have seen this effect like in the 1950s or the 1920s. Like if a scientific article said women are smarter than men,
00:34:54
Speaker
in the 1950s. Would people have been biased in favor of that compared to science that says men are smarter than women? I mean, maybe, but I'm certainly not confident. Women were portrayed as like, you know, hysterical and irrational and like
00:35:09
Speaker
I don't know if they were framed as dumb, but they were framed as, like, incapable of culture, and that kind of hypothesis would also weigh against that finding showing up in the fifties. Right. Right. For the reasons that you just said. Which I think just points to, again, it's something cultural right now, that everybody wants everything good to be about women and everything bad to be about men. So that's really interesting. That's a really interesting nuance. Cause if you put that
00:35:38
Speaker
very recent culture aspect aside, the best explanation, or maybe the most complete explanation, I've seen for how you reconcile all the findings that find bias against men in some contexts and women in other contexts is, I think, from Tanya Reynolds: basically framing it in this probably Steve Stewart-Williams-like evolutionary context, basically saying that the unifying feature in gender bias is about agency, that basically people assign men more agency.
00:36:08
Speaker
And so that means if men do something bad, then they get more blame for it. But if they do something good, maybe in some cases they get more credit for it. And maybe there's more of a default expectation that they're going to be in very, very high-status roles that are all about having agency. Does that make sense to you as a framework for reconciling all these different findings that sometimes seem to be in conflict?
00:36:36
Speaker
Possibly. I mean, I think there's a relationship between our sort of moral concern for men and women and how we think about agency. I mean, that's using the Kurt Gray framework of dyadic morality: the more someone's an agent, the less they're a patient, and vice versa. But I'm not sure it has to be that complex, because it can be something really quite simple, which is just that we care more about harm to women. And that's as complex as it needs to be.
00:37:05
Speaker
Um, because that means we should want to punish women less harshly when they do bad things because we don't want to harm women because when you punish someone, you're harming them. Um, you're creating some kind of costs for them. That's putting them at risk. So if we just kind of have a general bias where men's lives are more disposable than women's, um, then I think you kind of can explain all of those sorts of findings, but
00:37:32
Speaker
again, you still fail to explain more recent biases that seem to favor women even for
00:37:39
Speaker
agentic-type things, where we think women are smarter, they're better leaders, they're more competent in almost everything. I mean, people even seem to have biases about wanting women to be as strong or stronger than men, and as fast or faster than men, which are all things you might associate with agency. But now we have this pro-women bias, and it just spreads across everything.
00:38:04
Speaker
Okay, one last question on this and then I want to move on to one more topic. So you somewhat critically referred to an example of people framing this bias as sexism against women. Permit me to annoy you by pulling that thread a little bit, right?
00:38:28
Speaker
I, you know, without getting into specifics, I know somebody quite well who, when she was in high school, was the head of the robotics club in her high school, you know, super into STEM. I'm from Canada, and so is this person I know.
00:38:48
Speaker
And then a Canadian woman, Donna Strickland, won the Nobel Prize in physics a few years ago. And so they had this big event for local high school students. I think it was on, you know, women in physics. And this person I know, I think, was expecting to go to something about how awesome women in physics are doing, including this woman who won the Nobel Prize. And what I heard was that the speakers
00:39:17
Speaker
were all about how horrible it is for women in physics and how it's going to be horrible to go into physics if you're a woman. As you can imagine, this person I know, who was a very talented woman in physics in high school, did not major in physics. I don't know if that's the only reason why, but I do worry, also if you look at some of the mental health statistics you see among young, particularly liberal, women,
00:39:41
Speaker
I do worry that some of this narrative of doom and gloom, even though it's intended to be helpful, right, intended to be anti-sexist, is harmful. What are your thoughts on that? So that's a little bit of a different issue than what I was talking about. I was talking about giving women basically sugar-coated feedback and saying that that could be harmful to women, which it certainly could be, but it's not clear to me that, on the whole, if people
00:40:11
Speaker
look at the work of women more favorably than the work of men, even if that is sugarcoating feedback to the woman, on the whole it could help her career. I don't know. I'm not sure. What you're talking about, I do think, is kind of interesting. It seems to be a contradiction to me, where people are really trying to, or at least they say they're really trying to, get women into STEM and promote women in STEM.
00:40:40
Speaker
But then they portray STEM as this horrible place for women. And I think what they think they're doing is that by shining a light on how horrible it is, then it will become better for women and then women will want to go into STEM. But I don't think they think about what you're pointing out, which is if you
00:40:59
Speaker
if you make it look like STEM is a terrible place for women, you're not going to make women want to go into STEM. You're going to make them think, why would I go there? I'm going to have to work twice as hard, and everyone's going to be sexist against me, and I'm not going to get what I deserve. Yeah, that doesn't sound like the kind of place that a woman would want to be. It's like subverting the message that you're trying to send. It reminds me of something I sometimes talk about in the context of climate change.
00:41:27
Speaker
In the context of American climate change activism, we say things like, America is this horrible backward country that's never done anything right. All we've done is screw over the developing world. But developing world, you should follow us into the climate sunset because we know what to do. Also, you should be able to live here because it's much better here than where you live.
00:41:52
Speaker
Right, yeah. And I always imagine, you know, the leaders of Saudi Arabia thinking, well, wait, why? But actually, to tie this back to what we were talking about earlier with the gender bias, it actually still could be strategically the right thing to do. Because if what you care about is, say, advancing women in physics, by accusing physics of being sexist against women you might be creating pro-female biases within physics by
00:42:22
Speaker
writing that narrative. And so when women do get into physics, they actually are treated better than men. Unfortunately, if that is actually happening, then the tale becomes a lie. But it's not clear to me. At the least, telling the world that physics is sexist against women actually could end up helping women in physics, even if it might also deter some women from wanting to participate.
00:42:49
Speaker
Yeah, and create emotional harms. That's an interesting nuance. I'm not sure if anyone's explicitly thinking about that. Sure, sure. Okay, so I want to talk now, shift gears a little bit.

Censorship and Risk in Academic Research

00:43:01
Speaker
You had a paper that came out in Proceedings of the National Academy of Sciences, which is a very prestigious interdisciplinary journal.
00:43:08
Speaker
for our listeners who don't know, looking at censorship of science, censorship by and of scientists and sometimes of themselves. I think the main takeaway, correct me if I'm wrong, was that although there are external pressures for scientists to be censored from both sides, Twitter mobs exist on both sides,
00:43:33
Speaker
The on-the-ground reality is that it's largely scientists doing it to themselves and each other. Is that correct? Yeah, I don't know if we could actually quantify it, but scientists are interacting with other scientists. Scientists kind of control outcomes for other scientists: peer reviewers are scientists, editors are scientists, people on hiring committees are scientists, people who are giving out awards are scientists. So the people that
00:43:59
Speaker
scientists are trying to impress are their peers and their peers hold the careers of their peers in their hands. And so it creates this, this kind of community where, you know, being ostracized is hugely costly. So you don't want to rock the boat. You don't want to piss people off. Um, and getting status is very rewarding. So what you want to do is you want to like show all of your peers that, you know, you're fighting the good fight and you're on the right side of whatever it is they're trying to do.
00:44:30
Speaker
So you get a lot of self-censorship. Scholars are afraid to speak openly about what they believe is empirically true about the world, which is arguably what their job is: to talk about what's empirically true about the world. Yeah, what a devastating indictment, right?
00:44:45
Speaker
Yeah, that they can't say what they think is true because they're afraid of judgment and ostracism from their peers and they're afraid of paying a huge professional cost if they speak openly about the truth. So that's going on. Then you've got a lot of like, we frame this as kind of like a pro-social form of
00:45:05
Speaker
censorship, but you get, you know, peers trying to persuade scholars. And I've had plenty of peers try to tell me not to study certain things and not pursue certain topics, because they're trying to give you friendly advice, like, oh, sure, you could do that, but your career will suffer, so don't. So you get a lot of that. And then you have the little bit more aggressive stuff, which is, you know, if you want to call them cancellation attacks: trying to get other scholars fired, trying to get papers retracted, trying to get
00:45:34
Speaker
people booted from panels and presentations and disinviting them from giving a talk. So all of that's happening too. But all of that, not all of it, a lot of that too is happening like in the name of like ethics or morality. So it's, we can't have this scholar talk at our university because the students will be traumatized. So it's about protecting other people perceived as vulnerable from the views of particular scientists
00:46:04
Speaker
And a lot of the time it's just scientists doing that to other scientists. So scientists are doing it to themselves and each other for a combination of moralistic and careerist reasons. Is that a good summary? Yeah, I think that's probably a good summary. Yeah. For, you know, protecting other people from harmful scientific findings, or at least findings they think are harmful, and then, yeah, protecting themselves and their
00:46:32
Speaker
colleagues and their students from pursuing findings that they think will be socially costly. So I'll preface this question by saying that obviously cancel culture is real, and I recommend the book The Canceling of the American Mind by Schlott and Lukianoff for anybody who doubts that, which I just read.
00:46:54
Speaker
But I also wonder if some of the incentives to toe a particular line, or to not rock the boat, are somewhat overestimated by academics, who are, I think, paradoxically selected for, often, type A personalities, which tend to correlate with risk aversion. And yet in the context of a right-skewed productivity distribution, there are rewards to risk, right? The way I put it to my own students
00:47:25
Speaker
is I say, you should all decide how controversial or non-controversial is your comfort zone. And I can't tell you what your comfort zone should be, but you should also know that if you want to do paradigm shifting research, you're going to have to break somebody's paradigm.
00:47:44
Speaker
that person is on average going to be powerful and upset, right? And if you successfully break a paradigm, you actually still are, I think, pretty handsomely rewarded for it. And so, to give a specific example that illustrates where I think people sometimes underestimate the incentives for challenging the narrative: there was a controversy in climate change recently,
00:48:13
Speaker
where a climate scientist published this paper on wildfire in Nature and then wrote an article in Bari Weiss's outfit, The Free Press, basically saying: I framed the design of the study and the narrative of it to fit the climate narrative, because I thought that would help it get into Nature. Now, of course, that Free Press article was a very bold thing to do, right? You can't accuse this person of being too risk-averse generally. And this is somebody I respect.
00:48:42
Speaker
But I remember one of the things that I was struck by, even though certainly I recognized some of the incentives that he was talking about, was that a good friend of mine had published a paper in Nature on the same day as his paper came out in Nature that went against a dominant climate narrative. And probably that was the reason it got published in Nature. She and her colleagues found that basically
00:49:10
Speaker
there wasn't good evidence that heatwaves had disrupted aquatic communities in North America, I think. And that was actually also an interesting example of an adversarial collaboration because some of our co-authors had been people who are well known for publishing papers that are more alarmist about, or maybe more alarmed is kind of the morally neutral way to put it, more alarmed about climate change and marine heatwaves.
00:49:38
Speaker
So can you speak to that? Like, do you think the career of somebody like Jonathan Haidt, right, is maybe an example of this? The stuff that he's probably most known for now is stuff that was a little bit boat-rocking, right? Yeah, well,
00:49:56
Speaker
For him specifically, I would say yes and no, but I take your point and I think maybe we want to draw a distinction between making a sort of scientific contribution that might be
00:50:14
Speaker
undermining a framework that had been generally accepted among a lot of your peers, I think that can be helpful to your career. I mean, making any kind of seemingly novel contribution, if it appears to be a high quality one, of course, is excellent for your career, even if
00:50:30
Speaker
you are going to make some enemies. But I would say a lot of the time those enemies are going to be perhaps few in number, because it's, you know, only a certain number of scholars that have their identities tied to any particular framework at a given point in time. What I'm talking about more is publishing controversial work that
00:50:55
Speaker
almost nobody in your discipline will like, or even that the public would hate. That is where I think the majority of the censorship is coming from. So in my discussions with psychology professors, they're really timid about race and gender and, you know, transgender identity and any of these hot-button issues where
00:51:22
Speaker
if you say the wrong thing, even if you're saying something you think is empirically true, you can lose your job. And not only can you lose your job, you can become such a pariah that you can never get another job ever again. And you can't even reenter society. So it's like, you're not pissing off 12 Harvard professors. You're pissing off tens of thousands of people, and all of the employers are now afraid to hire you because you're so stigmatized.
00:51:52
Speaker
And I'm not saying that, if every scientist today wrote down all of their empirical beliefs, all scientists have beliefs that would cause that level of... But because that's kind of in the back of people's minds, they're not saying even relatively more mundane things. Even some scholars would shy away from talking about the fact that
00:52:14
Speaker
hiring biases favor women now over men. I know plenty of peers who would not want to put their name on a paper that said that, because they would be booted from a very large community of people, and then they'd be labeled a certain kind of scholar. Let me dig into that a little bit. I think you're definitely right that there are some topics that make you instantly radioactive.
00:52:40
Speaker
So kind of my favorite illustrative example, because it's hypothetical and not real, is there was a paper that came out in either Science or Science Advances a few years ago that looked at different human populations and to what extent there was evidence of Neanderthal DNA. And they found that the only population for which there was not evidence of Neanderthal DNA was Africans.
00:53:07
Speaker
And I remember at the time thinking, would they have even submitted the paper if they had found the opposite, right? That paper, you know, came out to large fanfare, and the opposite finding would probably have been instantly radioactive. So certainly there are some. So I guess my first question, about those types of areas that are radioactive, is how big of a problem is that
00:53:28
Speaker
for science. And to kind of steelman the argument that it's not such a big problem, let's think about a public university doing open-source bioweapons research, right? I think almost nobody would argue that that's good, right? Even if not doing it slowed down our understanding of bioweapons. And so it seems like you could get almost everybody to agree on at least that case. And therefore there exists some line
00:53:57
Speaker
in everybody's mind beyond which some topics should exist. Now, I think as a scientist you should never put answers there, but I can philosophically argue that you can put topics there, you can put questions there. So I guess my question is, do you agree with that? And how big of a problem do you think the number of topics that we put there is? Cause I think a couple of these, certainly some stuff about race, gender, transgender, you know, definitely is there
00:54:25
Speaker
in terms of how it's treated. But I think there are people who have had successful careers questioning gender bias. The two papers that I've done as a faculty member that have the most citations by far are ones that challenge the use of hot climate scenarios. And there's lots of other examples, I think, where
00:54:45
Speaker
there's research that's controversial that kind of wins you some enemies but also some friends, and you can make a case in many of those examples, and I certainly would include my own career in this so far, knock on wood, that the friends I've won have been more beneficial on average than the enemies that I've won. So what's your take on that?
00:55:07
Speaker
So I think that is correct, and different scholars kind of choose different paths. So I talk about being in the zone of "fashionably controversial," which is where you're controversial enough that you have joined a community of controversial scholars, and you're getting some status and friends there. You get invited to their things, and that's cool. But you're not so controversial that you've been fully booted from the mainstream.
00:55:36
Speaker
Um, whatever the mainstream group of scholars you're part of, like in my case, I suppose it would be social psychologists. I don't think I've ever really said anything all that controversial, but I think I've been mostly booted from the mainstream social psychology world. Maybe not entirely. I still have a lot of friends there, but a lot of people dislike me too. But yeah, as you've said, I've also made a lot of friends and, uh, you know, people respect me in other circles and that's been good for my career. So there are, there are trade-offs.
00:56:05
Speaker
And as you said, different scholars might choose different strategies in that environment. How much does the existence of that culture negatively affect science? So I guess we would want to think about what the alternative is, what the other option is. If it were the case that scholars all critiqued each other's work, but they didn't try to ruin their careers and their lives and call them names
00:56:35
Speaker
and make them fear for their livelihoods, would that be better? I think it probably would be. You'd still have what's happening here, where you're more respected by some people and less respected by others, but it wouldn't intimidate what I would call the moderate center of people who might side with one side or the other but won't speak up. And so it kind of makes it impossible to assess where the scientific consensus is on an issue
00:57:05
Speaker
because 80% of scientists won't comment. So I think that's a problem. But yeah, I think you're right. You said something else that I wanted to comment on.
00:57:19
Speaker
Was there another part to your question? Oh, I guess it strikes me that there are topics that are controversial that win you friends and enemies, and maybe more friends, and there are topics that absolutely make you radioactive. And I guess what I'm saying is, I could make a reasonable case, or a case that I would find reasonable, on both sides of the question of
00:57:43
Speaker
is the line between those two drawn too far, one way or

Ethical Considerations and Trust in Science

00:57:49
Speaker
the other. Well, you also brought up the case of a research area that literally could cause the end of humanity. Yeah. There, I've actually tried to think about
00:58:07
Speaker
whether there is a place we can draw the line that makes sense and that we can agree to. And I think one potentially useful place is around designing technology, or applications of science, that can have huge, immediate, and certain impacts on human flourishing. So I would put, you know,
00:58:30
Speaker
gain-of-function research potentially in that category, depending on what you're trying to do. But when it comes to research that is just trying to explain what already exists in the world, and that isn't forwarding a new technology that is
00:58:46
Speaker
catastrophic, is that something we could all agree on? If all you're trying to do is provide an explanation for what is already happening out in the world, should scientists be allowed to do that, even if their explanation makes us cringe?
00:59:04
Speaker
Well, again, to steelman the other side of this argument, I don't think they would say the problem is that it would make us cringe. I think they would say the problem is that if you're studying why a certain pattern exists, and it's a pattern we wish didn't exist, widespread poverty in certain communities, for example, the worry is that certain findings, especially if they were not rigorously arrived at,
00:59:32
Speaker
would make us not want to try to solve that problem. There's a concern on the other side, which Roland Fryer has articulated: if you do bad science that makes people feel good about that problem and paints a kind of ideologically cohesive view of it, that also doesn't help solve the problem. And I think some of the things we're seeing with learning loss and crime may illustrate examples of that. But what's your response?
01:00:01
Speaker
To take a fashionably controversial problem, let's talk about gender bias and such. So if you say the reason there are not
01:00:14
Speaker
equal numbers of men and women among physicists is not that women are being discriminated against, but differences in abilities and interests, with men and women self-selecting into different areas that they like more or that are part of their comparative advantage, that's a conclusion a lot of people would dislike. It's a conclusion that's only trying to explain what is happening. The fear is, well, if we say that,
01:00:43
Speaker
then we won't try to make STEM 50-50. But there is a value assumption there, which is that we should be trying to make STEM 50-50. And so it becomes a political controversy, a disagreement of values, where some people think that in the ideal world, men and women are 50-50 everywhere, or at least everywhere desirable.
01:01:11
Speaker
And other people would say, no, it's perfectly fine if they're not 50-50 everywhere desirable.
01:01:17
Speaker
And then what do you do? I don't know; it's a moral question. I agree with the way you framed that issue, but I think by trying to be fashionably controversial, you've picked an issue where the stakes are too low for you really to be able to address my question. Because at worst, by describing, say,
01:01:46
Speaker
differences in preferences between men and women that would lead to non-50-50 outcomes, suppose we conclude that and we're wrong, and we miss an opportunity to address gender bias. At worst, you are shifting people around between different white-collar jobs, and maybe they have to join one slightly more than the other, but it's not catastrophic for society, right?
01:02:13
Speaker
Let me present my comparison, and then you can tell me if women in STEM is more devastating. My comparison example is: suppose you're talking about parts of the country where the fraction of kids not performing at grade level is 75% or higher, maybe even higher than 90% in some places.
01:02:40
Speaker
It is possible to imagine controversial research that leads people to the conclusion that we cannot stop that from happening. And that phenomenon, which people like Roland Fryer have suggested we can at least ameliorate, is devastating.
01:03:02
Speaker
That's wiping out entire communities from the workforce. Now, I think we do want to understand the cause of that phenomenon rigorously, so I'm not saying we should make it radioactive, but I do understand where people are coming from who say that we should study it carefully.
01:03:24
Speaker
Yeah, and I understand what you're saying. I think it's potentially even a little bit more complicated than that. Empirical conclusions can certainly be used to justify different policy attitudes or positions, but
01:03:51
Speaker
I don't know if they often, or ever, necessitate any course of action, and I don't think they tell us that we should stop caring about a problem that most people care about. I could be wrong about this, but I've looked at this a little bit. I have a paper called Harm Hypervigilance where I looked at people overestimating the potential harms of science, and what I'm thinking about here is
01:04:17
Speaker
that even with a type of empirical conclusion that might seem to justify huge inequalities somewhere, or that might say this is going to be a hard problem to address, or at least hard to address in the ways we've been trying, I don't think we necessarily have to say, okay, well now we don't care about this problem anymore and we shouldn't keep thinking of ways to fix it, even if the approach we'd been taking no longer seems promising, right?
01:04:47
Speaker
Are we forced to give up on a social concern depending on what the science says, or can we continue to have those concerns and then use science to decide where we put our money and our time to address the problem?
01:05:05
Speaker
I think that makes sense in the abstract. As a steelman, I think you could still argue about the specific case where the solution you would potentially move away from is helping people learn and build human capital. But we could have a whole podcast on this. I want to move on really quickly to
01:05:29
Speaker
I do want to say, I'm also a little bit sympathetic to your point, because sometimes when I talk about things related to gender bias, you'll get someone on Twitter saying women shouldn't be allowed to vote anymore. So someone is drawing that conclusion, right? But usually those people have no power and no one cares about them, so I'm not so worried about those extreme things happening. Right, right. Some people are.
01:05:53
Speaker
Okay. So the last thing I want to talk about: you had another paper where you found that if people saw scientists as politicized, as trying to serve an agenda more than find the truth, then even people who shared their politics trusted science less. Can you elaborate on that? Yeah. In the first study or two, we looked at 40 different organizations, institutions, and groups of professionals.
01:06:20
Speaker
And then in another, we looked at 30 different scientific disciplines. We did look at whether people perceive the people in an institution as leaning to the left or right. So, for example, psychologists are overwhelmingly left-leaning; people perceive them that way, and that is also literally true. But the key question was not that; it was whether people perceive political values as impacting the work that they do.
01:06:49
Speaker
Pretty much across the board, we see strong negative relationships: the more people perceive political values as influencing the work, the less they trust the organizations, institutions, groups of professionals, and scientific disciplines. And we did this experimentally as well, where we tested something similar to what the journal Nature did with its endorsement of Joe Biden.
01:07:15
Speaker
We did this with an organization, and we also made up a professional society called Economics Professors of America, which invited a political speaker to its annual conference. And when we have these organizations or groups of scientists get involved in politics in some way, or seem to make some sort of political endorsement,
01:07:40
Speaker
people trust them less, they support them less, they don't want to give them money. And this happens on both the left and the right. So even if Economics Professors of America invites a Democratic governor to come speak at their conference, even Democrats trust them less, and the same goes for people on the right. So we're doing follow-up research on this question, because if this is true,
01:08:10
Speaker
that literally nobody likes it when anyone gets involved in politics, and it's really socially costly, then why do people still do it? And why do organizations still do it all the time? I have a hypothesis for what the answer to that is. Good. My colleague Roger Pielke Jr. and I wrote a Heterodox Academy blog post, I think shortly before your paper came out,
01:08:32
Speaker
that basically argued that we shouldn't politicize science. Our hypothesis, based on conversations and things we've read, sorry, not written, so I'm pretty confident that I'm understanding the mindset of at least some of the people who are doing this,
01:08:48
Speaker
is that people think science has this trusted position in society, and so we should use that to advance what we think is the right political agenda. And I think your study speaks to that; I wish I had known about it and been able to cite it, actually,
01:09:06
Speaker
because we say that view is naive and that it takes the trust as given and not as conditional on us being honest brokers, honest arbiters and seekers of the facts. And then what we also argued was that
01:09:23
Speaker
not being honest arbiters of the facts is actually often harmful to the very political ends that people politicize science in service of, right? So, you know, the devastating effects of the defund-the-police movement on crime, of long-term lockdowns on learning loss, again especially in disadvantaged communities, and maybe even the effects on the mental health of liberals, particularly women,
01:09:46
Speaker
caused by the coddling-of-the-American-mind kind of stuff. We used those as examples. So, first of all, what do you think about that? Is that a cohesive explanation, or is it missing something, and what does your research suggest?
01:10:00
Speaker
Well, I don't know if it is true that, in practice, these political maneuvers are socially costly. A lot of organizations, institutions, and groups of professionals are really paying attention to
01:10:17
Speaker
trust toward their institution. If you look at a commercial organization, they're literally keeping track of data all the time. They should see it. I mean, you can give the Budweiser example, but most of the time it wouldn't be something that
01:10:33
Speaker
has that kind of national coverage. They should be paying attention: we did this, and this was the outcome, or we saw this outcome, what could possibly explain it? We know we ran that really political advertisement; maybe that's it. So I feel like I must be missing something, because if there are social costs, then people should be aware of those costs, and you would think they wouldn't do it anymore. But given that it seems to me like a lot of
01:11:02
Speaker
science organizations, like journals and professional societies, are if anything becoming more political over time, it would have to be, I think, the case that there's actually a benefit. My hypothesis for what the incentives are there is that it's short-term gain and long-term cost, and local:
01:11:26
Speaker
your local incentives in science are to be liked by your peers; your long-term incentive is to at least be trusted by the society that funds you, right? Right. Yeah, there's the iron law of institutions. Right, that's exactly right. So the iron law of institutions is essentially that people care more about their own status within their group
01:11:52
Speaker
than they do about the success of the group. If you're the leader of Coca-Cola, you care about being the big dog at Coca-Cola, but you don't necessarily care whether Coca-Cola is around 50 years from now, when you're dead and not affected by its success. I think that issue exists in an especially pronounced way in academia, because Coca-Cola and Bud Light get pretty instant feedback if they piss the public off.
01:12:19
Speaker
We've been pissing the public off in a huge way for 10 years, right? And we will probably continue to, and eventually that is going to come back to get us, but it's a much slower feedback loop. On the trust issue, it does seem like trust has been decreasing over time. Some of that might just be the fact that we're interacting with people on Twitter these days in ways we weren't 20 or 30 years ago. But
01:12:45
Speaker
for example, I was part of SPSP, not anymore, but that's a big professional society in social psychology. They've done various things, like considering moving the conference because of Roe v. Wade, because the conference was in Georgia. So they're clearly saying, we disagree with the Supreme Court decision. Some members, I believe, called on them to do that, so they're reacting to a subset. Do they get feedback
01:13:12
Speaker
and learn that the majority of their members don't really care, even though some members wanted to move the conference because Georgia might not allow abortion when the conference is held, because most don't see abortion as central to the mission? We're supposed to be doing social psychology or something. Yeah, I don't know. And then maybe the presidents serve these limited terms, so
01:13:38
Speaker
maybe they really don't think more than a year, or two or three, into the future. I think self-censorship and preference falsification are definitely part of it. I've definitely seen cases where leaders of academic units or institutions have made decisions that they thought were popular within their institution, but were not, because of a few loud voices.
01:14:01
Speaker
Okay, I have time to ask you one more question, and here's how I'll bring it all back together. You run this adversarial collaboration project. You've written about the challenges of politicization and trust in academia, and about political diversity or the lack thereof. How do you build an adversarial collaboration in an area where there's so little viewpoint diversity that you can't even find a person who would be on one side of it?
01:14:28
Speaker
Well, this actually ties everything we've been talking about together, because I think you don't necessarily need, for example, a full-blown conservative to participate in an adversarial collaboration. As you noted, scholars are incentivized to disagree with other scholars. So when there are famous models or frameworks or other scientists in the discipline,
01:14:54
Speaker
they get rewarded if they challenge those scholars and those frameworks. And so even if you can't find someone who's going to disagree wholeheartedly, across the board, with what the majority of people in a discipline think, you will get what I have called in the past one-issue renegades. These are people who will challenge just one thing.
01:15:16
Speaker
They agree 99%, but on this one thing, they're the person challenging everyone else. And so long as, for every single issue where there is still a debate happening, you have a handful of people willing to tackle that issue from the heterodox perspective, if you want to call it that, then you can do an adversarial collaboration. And in the context of the adversarial collaboration, you have a more level playing field, because it's just
01:15:43
Speaker
one or two people here and one or two people there. And I think you potentially have more power to change minds that way, even if you're in a very, very small minority. Well, that's a great, hopeful note to end on. We'll see if it's true in practice, but it is theoretically. Yeah. Well, Cory Clark, thanks so much for coming on the Free Mind podcast. I wish you all the best of luck with your Adversarial Collaboration Project, and I encourage our listeners
01:16:10
Speaker
to follow that project closely because I think it's going to produce some really exciting science in the next 10 years. Thank you very much and thanks for having me. This was fun. The Free Mind podcast is produced by the Benson Center for the Study of Western Civilization at the University of Colorado Boulder. You can email us feedback at freemind at colorado.edu or visit us online at colorado.edu slash center slash Benson.
01:16:37
Speaker
You can also find us on social media. Our Twitter, LinkedIn, and YouTube accounts are all at Benson Center. Our Instagram is at TheBensonCenter. And the Facebook is at Bruce D. Benson Center.