Introduction to Free Mind Podcast
00:00:08
Speaker
Welcome back to the Free Mind podcast, where we explore topics in Western history, politics, philosophy, literature, and current events with a laser focus on seeking the truth and an adventurous disregard for ideological and academic fashions. I'm Matt Burgess.
00:00:25
Speaker
Assistant Professor of Environmental Studies, and Faculty Fellow of the Benson Center for the Study of Western Civilization at the University of Colorado Boulder.
Guest Introduction: Diego Reinero
00:00:34
Speaker
My guest today is Diego Reinero. Diego Reinero is a MindCORE Postdoctoral Research Fellow in the Annenberg School for Communication at the University of Pennsylvania. He studies how people's moral and political views change through conversations and social networks.
00:00:50
Speaker
He has also done research that challenges the idea that the predominantly liberal political views of academics affect the quality of research and the range of results published in his field of social psychology.
Claims of Liberal Bias in Academia
00:01:02
Speaker
Jumping off of this research, we discussed to what extent academia actually has a liberal bias, and in what ways claims of liberal bias may be overstated. Diego Reinero, welcome to the Free Mind podcast. Hey, it's great to be here.
00:01:20
Speaker
Yeah, I really appreciate you coming on. I've been looking for a while to do an episode with somebody who will provide evidence-based arguments that challenge or at least complicate one of the
00:01:37
Speaker
what you might call axiomatic assumptions of the Benson Center. Maybe even an assumption so axiomatic as to be central to the Benson Center's origin story. And that assumption is that academia is a left monoculture
00:01:55
Speaker
especially at the faculty level, maybe especially at the grad student and administrator level, and that because it's a left monoculture, that biases the scholarship and teaching in some way.
00:02:08
Speaker
And so I reached out to you because we have a mutual acquaintance, or probably more than an acquaintance for you, Jay Van Bavel, and you and he have done some really interesting work that I think does try to challenge or complicate this narrative.
Viewpoint Diversity and Steel Manning
00:02:25
Speaker
And what's particularly interesting is that I believe one of your papers was written in direct response to a paper authored by Cory Clark, who was a guest on this podcast a few episodes ago talking about this topic. So it seems like a nice full circle. I'm all about what Heterodox Academy calls the HxA
00:02:44
Speaker
way: steel-manning, viewpoint diversity, constructive disagreement. And so I guess as a start, can you just tell me about your interest in this topic, and for our audience who are not academics and probably haven't read your study, what did you look at, what did you find, and what do you think it means?
Replication Crisis in Psychology
00:03:03
Speaker
Yeah, great. This is a great topic. So my interest in this, I think, stemmed from a couple different places. When I started working on this, there was a huge replication crisis blowing up in the field of psychology. And just for the lay audience, can you explain what you mean by replication? Yeah, exactly.
00:03:22
Speaker
Yeah, totally. So when scientists do their work and they do a study, they publish it. And then the question is, is that legitimate? Is that real? Can an independent group of scientists go in and try to do that same study a second time, and can they find the same results? If so, it seems like that work might replicate; you can do it again a second time.
00:03:44
Speaker
And if not, then that work might not replicate, and then it becomes a complicated question of, like: the first study that found some cool result, was that the truth? Or is the study that tried to do it again,
00:03:58
Speaker
that couldn't do it again, is that the truth? So it gets kind of into those muddy waters.
Political Slant and Research Findings
00:04:04
Speaker
But I got kind of drawn into this thinking about this through the lens of like, how do we do our science? What's good science? And, you know, what are potential flaws in the way that scientists do their work? And so kind of stemming from that,
00:04:21
Speaker
whole point in time, we started to think about, yeah, why is it that psychologists, you know, social psychologists, which is where my PhD is from, why is it that we're having these issues of replication come up? Why is it that our science seems sometimes robust and other times less so? We could have a whole other podcast about that, but
00:04:48
Speaker
one of the claims that was coming up around the same time was around this kind of liberal bias in academia, but more targeted, I think, at the social sciences, like psychology, sociology, economics, those kinds of things. Less economics, actually.
00:05:08
Speaker
In terms of the demographics, less economics. The faculty on average lean left in all fields, but econ and math and engineering are actually some of the closest to parity, only three or four to one, as opposed to like 10 or 50 or 100 to one.
00:05:24
Speaker
Right, exactly. If you compare like history versus, you know, business or economics or finance or something. So yeah, so we were thinking about this question and noticing that at the time there was a paper that had come out by Jonathan Haidt and Jarret Crawford and José Duarte and some others,
Research Replicability Study
00:05:52
Speaker
arguing that essentially psychology was politically biased. And even though it was focused on psychology, and I think social psychology in particular, it garnered a lot of attention
00:06:08
Speaker
in the kind of academic water cooler space, you know, Twitter, and lots of people were talking about it. And so we were thinking about the claims that were made in this. And we started thinking, you know, they spell out some pretty helpful arguments there, they present some nice descriptive data of how psychology has become more left-leaning over time, and how
00:06:29
Speaker
there are many more liberal-leaning psychologists than there are conservative-leaning psychologists, and how that ratio has only gotten more egregious in the past few decades. And so we started thinking about, well, what does this mean? And can we take the argument that they're making and try and get some actual data to test it?
00:06:46
Speaker
And I think one part of the argument that they were making was, again, that sort of the fact that you have this homogeneity, just a field full of liberal-leaning psychologists is bad. And bad for a few different reasons, one being that it kind of undercuts the robustness of the research being done. And they provide a few different examples, sort of anecdotal examples of how you might ask a survey in a leading way, or you might
00:07:15
Speaker
you know, name a finding a certain way. And so they go through a couple different examples, but we wanted to look at whether the political leanings of research were actually implicated in how robust and replicable that research was. So what we did was we collected about 200 papers in psychology that had
00:07:41
Speaker
replication attempts available, meaning we had 200 papers where some team of scientists had done some original research. And in addition, another group of people had tried to do that same study a second time and see if the results held up. So we had this kind of nice data set available. So we kind of pulled that all together. And then we had a group of people rate whether the
00:08:06
Speaker
title and abstract of the work was liberal leaning or conservative leaning or moderate in nature. And we had the folks doing this rating come from across the political spectrum. So we got some liberals to do some ratings, we got conservatives to do ratings, moderates, so forth. And what we were curious was, does the political slant of
00:08:27
Speaker
the research, the sort of findings that are coming out, is that related to whether or not this work is really robust and whether it replicates? And just to make really explicit why you might expect that: let's say you publish a paper and it has a claim, and the finding of it is something that you could see really supporting, like, a liberal narrative, right?
00:08:55
Speaker
that if you find that result and you submit it to a journal to get published and the journal sends it out for peer review, meaning they send it out and a few other expert scientists in the field look it over and give it some feedback and review and see if it's up to snuff or not to get published.
00:09:10
Speaker
That if you have a paper
Media Influence on Research Coverage
00:09:12
Speaker
that has this liberal-leaning kind of finding, that it might be treated more leniently by a group of reviewers who, again, as we talked about, might also be liberal, right? Because the field itself is sort of a very liberal field,
00:09:25
Speaker
or like the political leanings of the people in the field are liberal. And so you might say, hey, you know, it's going to be treated fairly leniently if the claim being put forth is sort of liberal friendly. And it might be treated a lot more harshly if it's a more sort of a conservative leaning finding or something.
00:09:43
Speaker
And so then the issue that could arise is, if you're treating findings that are more liberal-leaning more leniently, stuff that shouldn't get published, because it's not really robust, it's not really well done, just gets through and gets published in the scientific literature, just because it had
00:10:01
Speaker
sort of a claim in it that aligned with how you view the world. And so that's why you might hypothesize that you could end up with liberal-leaning papers being less likely to replicate: because of this kind of process where this bias could occur, people are more lenient with that kind of work, and shoddier work kind of gets through.
00:10:22
Speaker
So that's like one example of how that might play out. And so we collected all these papers. We had folks rate the political slant of these findings. And then we could test whether the political leanings were associated or not with replicability. What we found was that the political slant of the research was not associated with whether or not it replicated. In other words, we didn't find any evidence that that mattered.
00:10:45
Speaker
Whether a paper had more of a liberal-leaning finding or a more conservative-leaning finding or a moderate one did not seem to be very predictive of whether or not that paper replicates. What is predictive of that are basic scientific things like how many people were in your study, or how big of an effect did you see in your study? And those are the things, kind of obviously, that you might expect; those are the things that stand the test of time, the things that are related to whether or not this work replicates.
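To make the analysis described here concrete, the following is a minimal sketch with simulated data and hypothetical variable names, not the study's actual code or dataset: a logistic regression asking whether a rated political slant predicts replication once sample size and effect size are in the model.

```python
# Minimal sketch (simulated data, hypothetical variables; not the study's code).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # papers with replication attempts, as in the study described

slant = rng.uniform(-1, 1, n)      # rated slant: -1 conservative .. +1 liberal
log_n_orig = rng.normal(0, 1, n)   # standardized log sample size of original
effect = rng.normal(0, 1, n)       # standardized original effect size

# Simulate replication outcomes that depend on sample size and effect size
# but not on slant, mirroring the result described in the conversation.
logit = -0.2 + 0.8 * log_n_orig + 0.6 * effect + 0.0 * slant
replicated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([slant, log_n_orig, effect]))
fit = sm.Logit(replicated, X).fit(disp=0)
print(fit.summary(xname=["const", "slant", "log_n", "effect"]))
# Expected pattern: log_n and effect come out significant, slant near zero.
```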
00:11:15
Speaker
And so it made us think, like, well, maybe there are various different forms of political bias, or liberal bias, that can emerge. We don't seem to have great evidence here that this is one of them, at least playing out in this way. And the other kind of key thing I'll mention, and then I'll pause here, is
00:11:35
Speaker
I think one of the consequences of the conversation that was happening around that time, again, was that, as you sort of noted at the start here, the field is very liberal-leaning. And so there's this thought that the bias in the field is pretty egregious, right? Because you might just think, oh man, it's a field full of liberals, it's going to be totally overwhelmed by that groupthink. Everyone's thinking the same way.
00:11:59
Speaker
And one thing that we found in our data here that I was just describing was that when you looked at kind of just the distribution of papers that were left leaning or right leaning, you might expect that the distribution of the research like
00:12:16
Speaker
the findings that are getting published would really map on to the skew that you see in the field
Bias in Academia vs. Media
00:12:22
Speaker
itself, right? Like you were saying, it's 50 to one, you know, Democrats to Republicans in a certain field. So you might expect the research to be just as skewed. And we didn't actually find that. We found that there were slightly more liberal-leaning findings than conservative-leaning ones.
00:12:37
Speaker
But it wasn't nearly as egregious as folks were making it out to seem at the time, and it did not map onto just the numbers of people in the field. So all to say, even though there are many, many more liberals in the field of psychology, and in the other disciplines we talked about, it doesn't seem to always translate one-to-one into the research that they do, into the slant of their research. And so surely there is some degree of
00:13:07
Speaker
you know, we are all people and we all have our own interests, but it didn't seem to be as biased as it was made to seem at the time. So that was kind of one effort that we published a few years ago. Yeah. Well, first of all, you anticipated what my first follow-up question was going to be, which is exactly about the distribution of liberal versus conservative findings. What I love about this is that
00:13:34
Speaker
you know, it's easy to paint the whole issue with a broad brush and kind of say, oh, it's all bias, it's all bad. Or, you know, what are you talking about, there's nothing wrong. And what's cool about this is it seems like you've narrowed in on kind of a specific, measurable aspect of this and tested it in a rigorous way.
00:13:54
Speaker
I want to pull two threads here. I'll start by saying, your results about bias I find actually not that hard to believe at all. It's one of those things that maybe sounds right if you're having a water cooler conversation about, like, oh, there's so many liberals. But if you think about it, it's like, yeah, methodological rigor.
00:14:18
Speaker
You know, anonymous reviewers, who are not facing social pressure at the time that they're reviewing, for the most part are asked to specifically scrutinize the methods, right, and the rigor of the methods. And so they're likely to do a much better job at that than, say, the media.
00:14:38
Speaker
So basically off the top, I'll say I believe you, two threads I want to pull. One is related to kind of what this implies and what this doesn't imply in terms of those kind of specific charges that people lay against academia and get your thoughts on that. And then the second is kind of steel manning the study itself as sort of, you know, what would somebody who either is surprised by or doesn't want to believe your findings, like what might they say?
00:15:06
Speaker
Okay, so let's start with, I think, the more interesting one: how narrowly or broadly should we interpret this? So the first thing is, correct me if I'm wrong, you did not look at journal acceptances, right? And you did not look at how the papers were covered in the media,
00:15:28
Speaker
or did you? So we didn't look at journal acceptances. We sort of looked at how they were covered in the media. So there's something called Altmetric, which is basically a cool tool to track media citations across different platforms. And we did look at the association between the political slant of the research and that measure of
00:15:48
Speaker
sort of publicity in the media. And we again did not find any associations there. What you find is that things like, you know, Danny Kahneman's Nobel Prize-winning work on sort of heuristic thinking and availability biases and stuff like that, that gets tons of attention. Or work on, like, cooperation, stuff that's not clearly politically leaning one way or the other. That's the stuff that gets tons of attention.
00:16:15
Speaker
So yeah, those seemed to be the things that were getting that media attention. So we looked at media, but not at journals. So let me try... I'm not a social psychologist. I've actually been on a couple papers recently collaborating with people who are social psychologists, in almost quasi-social-psychology-type work. But, you know, I'm an economist by training, and climate is one of my main areas currently. And so, having not studied this systematically like you have, let me just kind of
00:16:42
Speaker
try out what I think are my priors based on my anecdotal experience in climate.
00:16:49
Speaker
and tell me to what extent you think they do or do not map onto what you see in your data and what you see in social psychology. Broadly, my guess at the answer, if you did a study of these different dimensions in climate, is that, similarly to what you said, you wouldn't see nearly as much bias as some conservatives might think in the methods, particularly in physical climate science. One of the things that actually frustrates me about the right-of-center dialogue about climate is that there are
00:17:16
Speaker
lots of things to pick at in terms of framing and media and, like, doomism in the social domains of climate, but the physical science of climate is really solid. Okay, so that's kind of one thing. But where I think you would see biases, I do suspect, and an acquaintance of mine named Patrick Brown
00:17:41
Speaker
wrote kind of a viral article about this a few months ago making this case, also anecdotally, but sort of more rigorously than I will: that the threshold of "you can never publish this anywhere" versus "you can publish it somewhere" maybe isn't that different for right-coded or left-coded papers, but the threshold of publishing in a really big journal
00:18:07
Speaker
for methodological rigor might be a little bit lower for ideologically pleasing papers. And so Patrick Brown gave the example of a paper that came out in Nature that, if I'm remembering right, compared the economic impacts of one and a half and two degree targets, but didn't look at the different mitigation costs. And so sort of said one and a half is better, but kind of only looking at half the ledger.
00:18:30
Speaker
And then Patrick Brown, the person writing this article, tried to submit a paper that looked at both halves of the ledger and came to the opposite conclusion, that two is better than one and a half, and Nature rejected it. And I think he ended up publishing it in PLOS ONE. So he just uses that as an anecdotal example. I find that plausible. I think the media bias
00:18:52
Speaker
also exists, but unlike academia, which is very left leaning, there are large media echo chambers on both sides. So if you want a kind of hilarious example of this, there's a single author at the Heartland Institute, the same person.
00:19:07
Speaker
Two years ago, when I published a paper that argued that the hot climate scenarios were unrealistically hot, she wrote this article about my paper that basically, in slightly more words, said I was this brave truth teller correcting the record.
00:19:24
Speaker
And then just recently, I had another paper come out that estimated that the public distaste for climate denial might have been large enough to cost Trump the 2020 election.
Media Narratives and Scientific Perception
00:19:35
Speaker
And then the same person wrote this article, again, in both cases naming me personally, this time saying I was a Democratic shill, I was trying to manipulate voters.
00:19:44
Speaker
And I had a lot of fun pointing out that the same person wrote these two articles. But to zoom back out from that, I would say I've published climate papers, some that are more left-coded and some that are more right-coded, and I find that
00:20:01
Speaker
I don't find a difference at all in how much coverage I get, but I find a huge difference in who covers it and how they cover it, right? And while I'm very conscious in emphasizing and policing the rigor of my own work, I find that the way media interacts with my research, on both left and right, they tend to be pretty credulous.
00:20:31
Speaker
They tend to be interested to the extent that it fits their narrative, and, with a few exceptions, in most cases they tend to be pretty credulous about the methods. And again, I think I use pretty good methods, but most of the time I'm not being pushed that hard on either side. And they're not trained scientists, so it makes sense that they wouldn't dive in as much.
00:20:51
Speaker
And so I guess what I'm saying is if you think about how that translates to altmetric scores, if I think about my most cited in the media right coded paper and my most cited in the media left coded paper, I think they both have altmetric scores in the ballpark of 600.
00:21:10
Speaker
And so, I would say, it's two biases that cancel each other out, in a way that wouldn't show up in your data. Or rather, the fact that they cancel out would show up in your data.
Journal Acceptance and Scrutiny Bias
00:21:20
Speaker
Because it's not that there's no media bias, it's just that media bias exists on both sides. And there's, you know, within a particular media organization, there might be more liberals and more conservatives, but
00:21:29
Speaker
Overall, there are roughly equal numbers of liberal and conservative media outlets. Same thing on social media. What's your thought on the journals? Imagine somebody sent you Patrick Brown's article about climate and said, I bet this happens in social psychology too. What would be your answer? Yeah, so the idea is that if you send a paper to a journal that's
00:21:54
Speaker
more contrary to, like, a liberal narrative, it would be held to a higher bar. Yeah. And vice versa, right? If something was really pleasing... You know, if you want an extreme example, this isn't quite social psychology, but there was a paper that came out, I believe in Science, a few years ago that looked at people all over the world to see where you could find evidence of hybridization with Neanderthals.
00:22:22
Speaker
And what they happened to find was that the only population that did not have any evidence of hybridization with Neanderthals was Africans. And I remember at the time some people saying, would Science have published that if the finding had happened to be the opposite? Yeah. I think your examples are well taken. I think the journal piece of it is tricky because I think
00:22:47
Speaker
there are a lot of things going on in that peer review. You're going to get different reviewers. Sometimes the reviewers really don't even agree with each other; the reliability of peer reviewers is super low. Everyone complains about reviewer two, because reviewer one is like, I like this, and reviewer two is like, this is the worst paper I've ever seen. And then ultimately it's the editor who's going to look at those reviews and sort of
00:23:07
Speaker
take a big-picture stance of, you know, should we accept it? Can you revise this paper and send it back to us, or are we cutting it off at the knees and sending you on to the next journal, because we don't want to publish you here?
00:23:18
Speaker
So yeah, I mean, I think I agree that there are going to be forms of bias that emerge on both sides, to certain degrees in certain cases. I don't know if it's so systematic; I think it's sort of on the edges. Like, you'd have to have a pretty controversial paper
00:23:39
Speaker
for that bias to kind of come up. And then it makes me think, well, why is it so controversial? Is it controversial because it's just, like, a taboo topic to talk about? Or is it controversial because it's not just taboo, it also flies in the face of what we currently know? Right? So if you published something
00:23:57
Speaker
that was like, you know, women aren't as smart as men or something, and then you're like, we know women have outpaced men in earning college degrees for the past 40 years or whatever. Then I would be like, well, I should take a closer look at this paper that you're writing, because it contradicts my priors, which are informed by the scientific evidence. And so it's sort of, to what degree are your priors based in
00:24:21
Speaker
the literature that you've read and your kind of more empirical understanding of the world? Versus, as a reviewer, are you just sort of like, oh, they're saying that, I don't know, women aren't able to do math as well as men, or some other sort of claim where you're maybe wondering
00:24:42
Speaker
whether that's completely invalid or not. But I can't think of one off the top of my head. So, just to break this down a little bit... and actually, first, a quick sidebar, correct me if I'm wrong. Before anyone in our audience misunderstands what you were saying a second ago, my understanding of the literature is that the mean IQ of women and men is basically identical. Um, yeah. And women do tend to do better in school on average, and I think that largely traces to their being
00:25:12
Speaker
on average more conscientious, particularly when they're younger, because they mature faster. Some people like Richard Reeves argue that school systems are tilted toward them, or whatever. We don't need to get into that debate. But just to be clear, I think your point, which is well taken, is that men are not smarter than women. But I just want to make sure
00:25:29
Speaker
to give you the opportunity to verify you're also not saying that women are smarter than men, right? Yes, exactly. Yeah. Okay, so bringing that in just in case. Yeah, you know, you never know who's going to be listening, right? And by the way,
00:25:46
Speaker
All ideas are welcome. If you did want to make that case, please do. Don't feel like I'm shutting you down. Let's pull two threads there. I think at the start, what you described is very rational Bayesian reasoning. My favorite example of this, because it has
00:26:05
Speaker
absolutely no social controversy whatsoever: there was a famous paper that I think came out in Science, maybe Nature, that claimed to find an arsenic-based life form, right?
Sensationalism in Scientific Publishing
00:26:16
Speaker
And it turns out that the paper was wrong. I think it was retracted, and I can't remember...
00:26:23
Speaker
I want to say it was an honest mistake, but I can't remember why it was retracted. But basically, it came out in Science, and this is an example where sensationalism bias can actually work against confirmation bias, because the "we did not find an arsenic-based life form" paper probably doesn't get published in Science. But also, people were kind of immediately incredulous about it because
00:26:47
Speaker
there were decades of evidence from molecular biology that, you know, I'm not an expert on, but I gather from reading about it, suggested to experts that this was a very, very unlikely thing to find. And so similarly, if I saw a paper that said that climate change isn't caused by CO2 emissions, it's caused by the sun, I, somebody who prides myself on being politically balanced and not ideologically biased, would still probably read that paper more carefully.
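The "scrutinize surprising claims more" intuition both speakers are applying can be made concrete with a few lines of arithmetic. Here is a toy sketch with made-up numbers, not anything from the conversation: the probability that a significant published finding is true depends heavily on the claim's prior plausibility.

```python
# Toy Bayesian sketch (made-up numbers): P(claim true | significant result).
def posterior_true(prior: float, power: float = 0.8, alpha: float = 0.05) -> float:
    # Bayes' rule over "claim true" vs. "claim false", where a true claim
    # yields a significant result with probability `power` and a false one
    # with probability `alpha` (the false-positive rate).
    return (power * prior) / (power * prior + alpha * (1 - prior))

for prior in (0.50, 0.10, 0.01):  # mundane, surprising, arsenic-life-level
    print(f"prior={prior:.2f} -> posterior={posterior_true(prior):.2f}")
# prior=0.50 -> 0.94; prior=0.10 -> 0.64; prior=0.01 -> 0.14.
# The same evidence leaves a sufficiently surprising claim probably false,
# which is why closer reading of such papers can be rational rather than bias.
```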
00:27:17
Speaker
And related, the second thing you were describing is people not applying Bayesian reasoning, the people saying, like, this just offends me, right? And occasionally you do see that. So my favorite example of that, because it's like an accidental controlled experiment, is a researcher, I think her name is Bedoor AlShebli.
00:27:43
Speaker
Anyway, so she had two papers that came out a couple of years apart in Nature Communications, and they both used essentially the same methodology. And the headline finding of one of them was that ethnically diverse scientific teams do better. I can't remember what the measure was, probably citations or something, innovation. But the gist was that ethnically diverse scientific teams do better. And that was published to large fanfare; everybody loved it. Then a couple of years later, she had a paper,
00:28:14
Speaker
also in Nature Communications, so same author in the same journal, whose headline finding was something along the lines of: female trainees who have female mentors, like graduate advisors, don't do quite as well in their careers on average as female trainees who have male mentors.
00:28:33
Speaker
And, you know, just for what it's worth, a lot of people were offended by that finding, and the paper, I believe, did get retracted, and she put some tortured apology out, ostensibly because of the methodology. But of course, nobody asked to retract the other paper that used the exact same methodology, right? So clearly an example, I think, in that case, of bias. And what's interesting there, too, is
00:28:55
Speaker
my interpretation of that. I think there was a more plausible and less inflammatory interpretation of that same result, which is that, I believe, and I think I've seen a study in ecology that suggests this, female trainees are more likely to select a female mentor because she's female.
00:29:17
Speaker
And so if that's true, right, if female trainees are selecting female mentors, on average, more for non-excellence-related reasons, then irrespective of what the distribution of talent is among the mentors, you would expect, I think, to find the result that she found, right?
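The selection story being sketched here can be demonstrated with a small simulation. This is a hypothetical construction, not an analysis of the actual papers: mentor talent is identically distributed across genders, but some trainees restrict their choice to female mentors, which lowers the average talent of the mentors that group ends up with.

```python
# Hypothetical simulation of the selection effect described above.
import numpy as np

rng = np.random.default_rng(1)
n_trainees, pool = 20_000, 10  # each trainee considers 10 candidate mentors

out_female, out_male = [], []  # trainee outcomes, by chosen mentor's gender
for _ in range(n_trainees):
    quality = rng.normal(0, 1, pool)    # same talent distribution for everyone
    is_female = rng.random(pool) < 0.5  # gender independent of talent
    if rng.random() < 0.5 and is_female.any():
        # "Match-driven" trainee: picks the best mentor among the women only.
        idx = np.flatnonzero(is_female)[np.argmax(quality[is_female])]
    else:
        # "Quality-driven" trainee: picks the best mentor overall.
        idx = int(np.argmax(quality))
    outcome = quality[idx] + rng.normal(0, 1)  # outcome tracks mentor talent
    (out_female if is_female[idx] else out_male).append(outcome)

print(f"mean outcome, female-mentored: {np.mean(out_female):.2f}")
print(f"mean outcome, male-mentored:   {np.mean(out_male):.2f}")
# The female-mentored group averages lower even though mentor talent was
# drawn from the identical distribution for both genders.
```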
00:29:40
Speaker
Which then also, I think, again, I think is more plausible and also less offensive, right? The thing that people found offensive was that there was this implication that the female mentors weren't as good or something. So what do you think about a case like that? Yeah, I was just going to jump in and say, I think
00:29:59
Speaker
it's complicated. I think that, for example, you were contrasting the "ethnically diverse teams do best" work with the female mentor work and saying that was a clear case of bias. And I just want to say, to me it seems less clear, because I think it depends on your causal understanding of the world, and it depends on the specific measures. And so I haven't read these papers
00:30:23
Speaker
very closely, so I won't hang my hat too heavily on it. But it's like, you know, I think there's been a lot of work that's shown how women are discriminated against in STEM, and that's
Curiosity in Social Psychology
00:30:36
Speaker
a very challenging kind of environment sometimes to be in. And so to me, it feels like, you know, that could be true about what it means to be like a female academic.
00:30:48
Speaker
And on the ethnically diverse piece of it, I was just having a conversation with a colleague yesterday, actually, about collective performance in teams, and when it's better to be diverse versus not. And for simple coordination tasks, it's actually better to be homogeneous, but if it's more creative solution stuff... So I think I'd have to look at what the exact measure is for that. I guess I should look more closely at it to evaluate. But I wouldn't at face value say, look, obviously bias.
00:31:15
Speaker
Well, let me clarify my claim of bias. So my claim of bias, I think basically there were two things that were going on. So the people on Twitter, when it was still Twitter, when this paper came out, were saying, this paper offends me. That's basically a moral claim. I'm happy with people saying that the moral implications of the second paper were more inflammatory than the moral implications of the first paper.
00:31:44
Speaker
My claim of bias is separate from that. What happened after the Twitter storm was the journal basically, I think, asked the authors to correct and eventually retracted the paper, not because it was offensive, but because people said, this methodology is too simple and not appropriate for the claims you're making.
00:32:06
Speaker
And that's where I'm claiming bias, is that if that was true, that would also, I think, be grounds to retract the other paper. But nobody suggested they do that and they weren't asked to do that. And so the bias interpretation of that story is that a different standard of methodological rigor was applied to the two papers because, on average, people had a different moral reaction to the implications of the two findings.
00:32:30
Speaker
Right. And I think you make a good point about the subjectivity of those standards. Because you could view it and say this author's second paper, and I don't know her or the work super well, but you could say that second paper that came out wasn't very strong science, and so, no, it didn't really deserve to be published. But then, you're saying, by inference, you should also say the first paper wasn't really strong work either. Or, if you're saying the first work was sufficiently strong,
00:32:58
Speaker
then I guess the second work should reasonably be also strong enough. That's the case I'm making, yeah. Yeah, I'd have to look more closely at the methods of those papers to make my own more informed judgment about that. But yeah, yeah, yeah.
00:33:11
Speaker
Okay, and this actually leads naturally into one of the other questions I was going to ask you in general about the bias topic. So my impression of social psychology as an outsider is that, although I think demographically you are super politically homogeneous, I think that's a fairly well-established fact, I actually think social psychologists, probably because of the nature of your field, are a lot more curious
00:33:34
Speaker
about politically diverse questions than many other equally homogeneous fields are. I don't think it's a coincidence. Do you know Smriti Mehta? She's a graduate student at Berkeley. Anyway, I had her on the podcast a while ago, and we were talking about this. She's a member of Heterodox Academy too. And I was saying to her, I don't think it's a coincidence that
00:33:58
Speaker
most of the founders of Heterodox Academy and many of the prominent voices in Heterodox Academy come from your field, right? Because the phenomenon of political polarization and liberal bias, whether or not it exists, is just inherently fascinating to the social psychologist. And so whatever the answer is, you kind of want to study it. So my question is: given that you found that there actually was quite a large
00:34:26
Speaker
diversity of studies, and you said you had a politically balanced reviewer, or rater, set. The raters, yeah. Is the problem of, like, question selection in social psychology
00:34:41
Speaker
more limited to the fringes than to the mainstream, right? To give two examples to illustrate my point, if I was a grad student starting out in social psychology, might I be okay if I wanted to study the effects of stable families on mental health or something, but maybe not okay if I wanted to study something like race and IQ? Is it really just the really, really, really toxic stuff
00:35:09
Speaker
on the fringe that's getting shut out, or is there kind of a broader limitation? Both in terms of what the charge is that you're hearing from heterodox sociologists, and what your guess at the answer would be.
00:35:24
Speaker
Yeah, I think you're right that it's less... Like, I know a number of folks who study family structure, or the importance of that as a unit, which you could say is more conservative-coded, more traditional in its values. So there are a number of people who study romantic relationships and families, and
00:35:46
Speaker
they might look at its negative associations or they might just look at what the outcomes
Framing Research Questions
00:35:51
Speaker
are. I think what's important, the key, is what are you measuring, right? Because oftentimes you can find... I don't want to say you can always find what you're looking for, because you can't always find that, but you can often find some element of something. So if I were to look, for example, for a negative association of
00:36:09
Speaker
you know, broken homes, so to speak, families torn apart, and how that affects youth and their educational attainment or their mental well-being,
00:36:20
Speaker
you could pick a number of different outcomes. And probably on one outcome you might see that there's some sort of negative effect. But then you might pick another outcome and recast it and say, we're going to look at resilience, we're going to look at people's ability to overcome it. And all of a sudden, when you frame it this way and think about it from another perspective, you realize that this other outcome is positively affected by it. So I guess I'll just say that there can be positive and negative outcomes, and a lot of it is
00:36:49
Speaker
about spelling out very clearly: what is your theory, what is your hypothesis, what are you measuring, what can you speak to, what can you not speak to? Acknowledging that there are various complex factors that go into this, and it's not always a straightforward story of "this thing is good and this thing is bad." It's often a little bit more complicated.
00:37:07
Speaker
In a sense, your work on political bias in academia kind of illustrates this. It depends on what specific thing you narrow down on. To take a simple example, if you just look at the demographics of faculty, the skew is obviously there. If you look at the specific measure of impact that you looked at, it doesn't seem to be there much. So is your point
00:37:29
Speaker
that what we pick and how we frame it shapes where we look for bias, and that maybe you haven't found it where you looked? Or was it more specifically to push back on the conservative-coded notion, maybe implied in my example, that stable families on average lead to good socioeconomic outcomes?
00:37:51
Speaker
Yeah, no, it was mostly to say... I think you were asking about the extremity of viewpoints, and sort of, are there basically other social psychologists who study some of these topics, or is the topic range really small because it's such a liberally homogeneous group? And I was saying, I think there is a range. For example, there are probably more social psychologists who study things around, like, stereotyping and prejudice,
00:38:18
Speaker
which is often thought about in terms of the way our society works, which is stereotyping and prejudice towards, you know, marginalized folks. And you could also study stereotyping and prejudice towards, say, powerful CEOs, or towards conservatives, as you mentioned. Right. So, yeah, you can pick different people to study there. But I would say that you probably have more folks studying that than you do folks who study
00:38:44
Speaker
more conservative-coded topics. But I don't think the range is so small. And where you'd probably start to see people shut off is, like you were saying, this kind of race-and-IQ-type work, work that seems kind of beyond the bounds of what, from our scientific data, we're
00:39:10
Speaker
interested in studying based on what we know. So I think that's kind of where I would draw it. So I think that's kind of the point. Yeah, I want to ask you a quick follow-up question about that, because I'm thinking of my episode with Cory Clark. You know, I think of myself as an advocate against political bias, for open inquiry, and she maybe is even more extreme than I am in that direction, kind of the "study everything" view. And so one of the questions I tried to push her on is this:
00:39:39
Speaker
While it does seem intuitively scientific to say, given that we're going to ask a question, we should never say a particular answer is out of bounds, because that's going to pollute the fact-finding process. So in that example, that scientific paper about Neanderthal hybridization, if we had imposed
00:40:00
Speaker
a filter on papers like that, saying you can only publish it with the "correct" answer, right? For example, the Bedoor AlShebli, excuse me, example of female mentors: if we say we're only not going to retract that paper if it has the "correct" finding.
Controversial Topics in Research
00:40:18
Speaker
Personally, I see the case that that's inherently scientifically corrupting.
00:40:22
Speaker
But, maybe more so than purer academic freedom advocates, I do see a case for saying,
00:40:32
Speaker
narrowly, but still, that there are some questions... There's nothing, I don't think, inherently polluting, at least to the same degree, about saying that some questions maybe should be out of bounds. So the extreme example that I think is super uncontroversial is, we don't want people doing open-source bioweapons research in universities on the public dime.
00:40:54
Speaker
Even closed-source gain-of-function research has become super controversial, I think for good reason, if you think about the recent stuff about COVID origins. So I think if you accept that there's some limit on the questions we're allowed to ask, then I think it is reasonable to take some of these really inflammatory questions, like race and IQ, and ask, you know:
00:41:18
Speaker
Is that a question that basically nothing good can come from trying to answer, in terms of society? And if we did, would it not be reasonable then for scientific enterprises to say, we're not going to fund research on that question, and we're not going to hire you if you study that question? I don't particularly want to talk any further about that question, because I just don't like it.
00:41:44
Speaker
I believe in academic freedom. I think if people have already been funded to study that question and they publish a paper, you shouldn't fire them. That's a different question, right? But I personally have no interest in that question. But what do you think about the general argument that even a pretty pro-free speech stance can still admit that there's a line
00:42:08
Speaker
that puts some questions out of bounds. Do you agree with that or not? And if you do, how should we decide where to draw that line?
00:42:17
Speaker
Yeah, man, what a great question. The question of where to draw that line is a very, very hard question, because it sort of depends on who you ask, and what you think the outcomes will be, and how negative those outcomes will be, will depend on who you ask. So it's a super thorny, complex question. To your first question: I definitely think you can have people who are
00:42:40
Speaker
open-minded, who are curious people, who try and take the... I mean, this is a Free Mind podcast; people who listen to this might also be people who have open minds of inquiry and curiosity. And I think it's totally true that you can have that and still find an outer edge where you say, this seems like a question that
00:43:03
Speaker
could be problematic for a lot of reasons to dive into, and starts to maybe get into trolling spheres, where you're asking these questions seemingly to rile things up, more so than because you think that there's some
00:43:19
Speaker
amazing scientific understanding that we're going to draw. And again, this becomes very complex in social sciences in a way that's sometimes, I think, different from physical scientists. If we were trying to understand the universe, which we are,
00:43:35
Speaker
I don't know how I would cap it for astronomers or physicists and say, no, don't ask that question. I would say, I don't know, run the experiments, work out the laws, and see what you can figure out, because that's useful. But when you move into the social sciences, it becomes much trickier. But I think that your first point is well taken, that people can have open minds of inquiry and curiosity about things, but still draw lines. And where to draw those lines, like I said, that is a very
00:44:04
Speaker
complicated question. Because, you know, your example of bioweapons, or weapons of mass destruction: some things you could see very quickly having an immediate mass catastrophic outcome. Some things you might say, well, this would be sort of problematic, but it wouldn't lead to 8 billion deaths.
00:44:25
Speaker
And so again, it's like, well, how do you want to quantify this? And are we in a position to set the moral lines of what's okay and what's not ourselves? And yeah, I don't feel like there's a clear answer of exactly
00:44:43
Speaker
where to set that line, because I think, in part, what we decide is okay and not okay is a very social thing. I also study morality, and I know that that is a very social process of us working out together what's okay and what's not okay. I don't know, where would you draw the line?
00:45:01
Speaker
Well, you're totally right. It's a social process, and that anticipates my follow-up question. But first let me try to answer your question, because I think it's a great question. So I guess I would try, as much as I could,
00:45:14
Speaker
to anticipate the distribution of benefit and harm to society of studying it with some game theory considerations too.
Game Theory and Societal Impact
00:45:28
Speaker
So for example, I don't like nuclear weapons, but I'm glad that we have them because China is not going to get rid of them if we do. And I think that there's some
00:45:38
Speaker
you know, areas in more of a scientific domain that get tricky. Like, if you think about AI, or even things like designer babies, you know, genetic enhancement: I'm pretty sure China is going to do those things whether we do them or not. And so I do factor that a little bit into my thinking about whether or not we should do them, even though, if America were an island, the cost-benefit might be different. I think,
00:46:04
Speaker
you know, not to return to the topic that I hate, but I think where I'm sympathetic to the people who don't want study of that race and IQ topic is that, as far as I've seen articulated, the biggest harm that could come from studying it is that we have these catastrophic
00:46:24
Speaker
outcomes in our education system in some of our most disadvantaged communities, and people might stop caring about that. And I find it pretty plausible that if we widely accepted, say, Nathan Cofnas's view on that topic, people might... I mean, Charles Murray, I think, literally made that connection directly in terms of his policy recommendations in The Bell Curve, right?
00:46:46
Speaker
And then on the other side, what I hear the people who say we should study it saying is that their biggest concern is that, if we didn't study it and kind of find what they think the answer is going to be, then we're going to assume that equal outcomes is the only possible option
00:47:04
Speaker
without discrimination, and so we might discriminate unfairly to produce outcomes that aren't the correct null hypothesis. And that I actually find pretty implausible, because there are many, many other paths to get to "we shouldn't discriminate, and you might not expect perfectly balanced demographics in every field" that don't require making assumptions based on research on IQ.
00:47:25
Speaker
So that's sort of an example of... Again, I'm more uncomfortable with somebody saying, okay... In the case of Nathan Cofnas, who I think was fired recently at Cambridge, right? If you're going to bring somebody in who says, I'm going to study this topic, and you bring them in,
00:47:44
Speaker
and then you don't like what they find, and then you fire them, I'm against that, right? And so I was glad to see people across the spectrum, you know, luminaries at Cambridge, basically saying he shouldn't have been fired, on academic freedom grounds, even though his work is icky, right? That's kind of what they said, right? Right.
00:48:02
Speaker
But I'm totally okay with Cambridge saying, you know, we're not going to have a position on that topic, because that's just not for us. And by the way, I may be making a value judgment there, even though I tried to frame it in terms of benefit to society in expectation. But we make that kind of judgment all the time in deciding what's interesting, right?
00:48:22
Speaker
You know, like if I send in a proposal that says I want to study how fruit bats see color, and somebody's like, well, I'm not interested in that,
00:48:33
Speaker
I'm not going to go, like, "bias!", right? But let's come back a little bit to the social thing, because I think an argument, or maybe a counterargument, or maybe a rejoinder, that conservatives would make to what we've just been discussing, and that I am somewhat sympathetic to, is that we do hold,
00:48:53
Speaker
in some fields, I actually think less in social psychology than in most other social sciences, and except maybe econ, it seems like we do hold right-leaning or right-coded questions to a much higher level of scrutiny for danger than left-coded ones. So let me just give you two anecdotal examples to get your
Workshop Controversy and Political Undertones
00:49:13
Speaker
thoughts on. So the first is on the right,
00:49:16
Speaker
I was involved recently in planning what was going to be a workshop on the interdisciplinary consequences of human population decline. Like if, you know, birth rates stay low, population might peak in mid-century, that upends models and thinking in a lot of different disciplines, right?
00:49:35
Speaker
And not only were we not going to have a hardcore Malthusian or a "women should go back to the kitchen" kind of person, we literally had one of our speakers invited to criticize that. And we were just interested in, like, look: for example, as an economist, I invited economists to come and talk about how
00:50:02
Speaker
population decline breaks endogenous growth models, right? Chad Jones, he wasn't our speaker, but he's somebody at Stanford who's written really influential papers on that. Okay. That workshop was canceled after the speakers had been invited, basically because... there was a call put out for internal abstracts, right? To kind of fill out the program beyond the keynotes.
00:50:30
Speaker
And one of my co-organizers had put a banner photo on this call for abstracts that was a picture of a Danish advertisement from the government trying to raise their birth rates. So imagine a bunch of Danish babies with some Danish writing.
00:50:45
Speaker
And what I gather happened was somebody complained to the person pulling the purse strings that white babies on a thing about population decline means that this must be a far-right fascist project. We have to repopulate the earth with just white babies. That was the inference that people were drawing.
00:51:07
Speaker
I guess, I don't know, this is all things I heard secondhand, but the end result was that the person holding the purse strings, after we had a date and confirmed high-profile speakers, pulled the plug on the whole thing and uninvited everybody, without consulting most of the organizers. That's one example.
00:51:26
Speaker
And also without asking for clarification, like, are you planning to have a bunch of far-right natalists? Right. Yeah. So now contrast that with... I believe it's the case that one of the most assigned and widely cited books in education, which is maybe the field where you'd want to worry about danger the most, because you're dealing with impressionable children, is Paulo Freire's Pedagogy of the Oppressed.
00:51:54
Speaker
Now I've read that book. It literally praises Mao, who's maybe the deadliest dictator in human history. Paulo Freire has a long record of praising Mao and other communist dictators. He also went to implement his ideas in Guinea-Bissau and basically wrecked their education system. And so I think a lot of conservatives look at that and basically say,
00:52:14
Speaker
Like, is there not bias there? Is there not? And at what point is a potential harm used in kind of an imaginary way as a cudgel, right? You know what I mean? What's your thought?
00:52:30
Speaker
Yeah, I mean, well, so first of all, sorry to hear about the workshop getting pulled. That sounds like a total miscommunication of things going on. It's interesting to think about the human population decline and what, like you said, it speaks to all different sorts of fields. So to me, that seems like sort of, I would be interested to hear what folks were thinking about.
00:52:51
Speaker
the fact that the purse strings were pulled on it and the thing was canceled. I assume, I don't know the details, but was it just that the funder was worried politically about how this would look? It also seems like the organizers... Yeah, so again, I like calling phenomena out, but I don't like calling people out, and that's why I'm being a little bit vague.
00:53:16
Speaker
There's a unit that had a call for workshop proposals that this was submitted to and reviewed and funded. And somebody who was neither the head of this unit nor part of the organizing team saw this poster and panicked and basically complained, again, didn't come as far as I know to the organizing team but went to the boss and said,
00:53:44
Speaker
No, as far as I know, they didn't say, oh, these organizers are bad people trying to bring fascism into the university or something. But they basically said, this has undertones of the far right that could pose a huge reputational threat to our organization, and so we shouldn't do it. And my hunch is that the leader, who was new to the job at the time, might make a different decision today. But at the time,
00:54:13
Speaker
they were new to their position. It's always scary to feel like you're sticking your neck out right when you're brand new. Again, not because they thought we were bad people. The leader actually apologized profusely to us, but basically, the gist of what they said was:
00:54:33
Speaker
this concern has been raised about reputational risk to our unit if we go through with this, and I agree with it. And then there was a little bit of a pretense of, well, we'd also need the right experts on demography, which, to be honest, I think was a bit of a fig leaf. So that's what we got. But it wasn't like, well, we think you're fascists or something. Right, right. Yeah. I guess what I was going to say is, well, a few things. One is that
00:55:00
Speaker
the claim of bias, I think, needs to be evaluated systematically. If you have one anecdote and another anecdote and you're like, isn't that bias? It's like, well, it seems like it. But when we talk about these system-level, societal-level things, I think the empirical data needs to be as big as it can be to speak to those kinds of big issues. Yeah, and I would have also handled that differently: talked to the organizers. This sounds like an unfortunate fallout and fear
00:55:30
Speaker
from a certain situational feature, like this person being new, and all sorts of things.
Bias in Educational Materials
00:55:34
Speaker
And then on the education piece and the book: I haven't read that. But yeah, I could see that. If it is the most assigned book, is it because
00:55:47
Speaker
people just like that book a lot? Is it because we are indoctrinating our youth with this book and they're being forced to read it against their will? I don't know. I also just think about this:
00:56:00
Speaker
if that book, and again, forgive me that I'm not familiar with it, but if that book is pushing communist-type ideologies or dictators, I have so many thoughts. We live in a capitalist society. Every aspect of our indoctrination into the country is through a capitalist lens.
00:56:23
Speaker
And when communism was more of a thing, or there was more of a push for it, people were blacklisted. So there have been many instances of a rejection of that ideology in our society. And I'm not for banning Freire, for the record. Yeah. So I guess I'm thinking, when you ask, is that not bias? I'm like,
00:56:46
Speaker
well, what do you mean by that? Because I feel like there's so many... It would be weird for me to ask, is it not biased that people were blacklisted? That'd be a weird question to ask, right? But it would be the inverse of what you're suggesting, I think. So I guess I feel like...
00:57:06
Speaker
Yeah, I guess what I agree with is that there are definitely camps of people with political views that will clash. Granted, that's the most trivially obvious thing to say. But in terms of education and teaching, we also know that students are impacted more by their peers than they are by the professors. So it's not like professors are
00:57:26
Speaker
having them drink the liberal Kool-Aid, so to speak. Again, students are self-selecting into majors that they care about. If you go into finance or business or computer science, you're in a totally different environment than if you go into anthropology or history. So people are self-selecting into the stuff that interests them based on
00:57:47
Speaker
their experiences, their family background, all this kind of stuff.
Political Representation in Education
00:57:51
Speaker
And their peers are having a really big influence on them. And it's not like university students are always heavily skewed one way or the other on the political spectrum. Certainly some universities are. At flagship schools, certainly not. Yeah, exactly. At elite private schools in the Northeast, they are a little bit.
00:58:08
Speaker
And NYU might be more liberal and BYU might be more conservative... You could always find examples. Well, let me qualify that a little bit. So young people as a whole
00:58:19
Speaker
skew left a little bit. It's actually changing a little bit among men recently. Yeah, men and women, yeah. And college-educated people skew left also, right? There was a study I read a while ago that tried to find the most politically representative student bodies in the country, and it actually tends to be flagship public schools in deep red states.
00:58:45
Speaker
The University of Arkansas, I think, was the single most representative student body in the country. So it is a little bit skewed, but on the whole, not because the colleges are selecting it that way. There's an argument out there.
00:58:59
Speaker
I've never been on an admissions committee at an Ivy League school, so I can't speak to the truth of it or not, but there is a claim that's been made recently, in response to some of the Israel-Gaza protests, that Ivy League schools, as part of their DEI projects in the last five to ten years, have been selecting students partly for their track records of activism,
00:59:26
Speaker
and looking for activism in a left-coded way, right? You're not looking for somebody who was their high school Fed Soc chair or something. And so, to the extent they're doing that, which, again, I don't know, I haven't seen the data, but that's the claim. If that claim were true, I would find it plausible that you would then disproportionately select people not just with left views, but probably with very left views, at those schools.
00:59:49
Speaker
Yeah, I could see that, yeah. And one other point, just following up with something you said earlier, my read of the evidence is absolutely consistent with what you said, that although some professors probably are trying to indoctrinate, we're not very good at it.
01:00:09
Speaker
I think the jury's still out, because people are just starting to look into that. I find it plausible that you might find evidence of indoctrination in K-12. Actually, the last episode I did was with Heidi Ganahl, who recently ran for governor as a Republican in Colorado, and she's an education advocate.
01:00:30
Speaker
And that's where she thinks the action is. And again, as far as I know, there aren't data, but my prior would be the same: that's where you'd look for liberal influence or bias. Okay. So,
01:00:46
Speaker
I think it's important to look at what we mean by indoctrination. When I think of K through 12, you're learning math and English. What are the subjects? What subset of the subjects are we talking about? Because things like American history, right?
01:01:02
Speaker
Does indoctrination mean we are teaching about slavery? I don't think people would say, oh, that's indoctrination, if you teach about slavery. Absolutely not. I mean, maybe occasional wackos, but certainly not the person I was talking to. Yeah, exactly. So then I'm thinking, what does that mean? That we think critically about market systems, like how capitalism works in this way and socialism works in this other way,
01:01:29
Speaker
various complexities of geopolitical events around the world? So I'm sort of asking, what do we mean by indoctrination? What are we really indoctrinating anyone with? I'd be curious if there are clear examples of that. Yeah, again, as far as I know, there has not been a systematic analysis, at least one that's been made public, because I think I would have seen it, and I would have read it with fascination.
Climate Education and Bias
01:01:54
Speaker
The areas where I find it plausible, again,
01:01:58
Speaker
are some of the bigger stuff, right?
01:02:04
Speaker
And I also want to clarify: I suspect it varies strongly by region, and there's probably indoctrination of many different varieties regionally, right? But, and this is something I've thought about because my kids are racially mixed, if I sent my kids to public school in a major liberal city post-2020,
01:02:31
Speaker
I would worry a little bit about them getting a message that the color of your skin will hold you back, even though, objectively, my kids are some of the most privileged people in America. I do worry a little bit that they'll learn a lot about the evils of capitalism, more so than they'll learn about the fact that the only rich societies in the history of the world until 10 minutes ago
01:02:56
Speaker
had markets at the core of their economy, and also about the fact that the deadliest dictatorships in history were all communist. I worry that my kids, in the context of climate change, will hear all the doom and gloom but won't hear that death rates and damage rates from natural disasters have been declining for decades. So those are examples. And on the climate one, I will say, having taught those facts to undergrads who come out of high school, disproportionately from liberal parts of the country,
01:03:24
Speaker
that does scan. I regularly get students who take my third-year climate macroeconomics class, or environmental macroeconomics class, who are shocked by those data points and will say, thank you for helping me feel hopeful again. That's not everybody, but it's a noticeable fraction. And certainly, I've never had a student come up to me and say, you're the first person who told me that global warming is real. So that's where I might apply more concern.
01:03:53
Speaker
But do you think that comes from education in high school? Like, I didn't take climate change in high school. Do you think that comes from the media, or do you think that comes from educators in high school? That's a great question. Right. Because I could totally see students... That's what I'm thinking: with a lot of these topics, they get this stuff from the media, from their parents, and less so from their bio teacher, who's trying to teach them what mitochondria are, not about the doom and gloom.
01:04:23
Speaker
Like, if you took a climate change class in high school
01:04:29
Speaker
in a Northeastern urban area, maybe that class would be more focused on, hey, this is a really scary thing, we need to act. So maybe. But how many of the students who come through your class have taken those classes, and is that where they've learned it, versus the media or parents? I mean, it's a great question. I'm sure they've seen very skewed media, because that's something I pay attention to.
01:04:58
Speaker
There definitely is climate in high schools, and I'm thinking of most of Colorado now, certainly in the bluer parts. On top of that, the ones who are taking my third-year class in environmental studies and haven't seen that also haven't seen it in their first- and second-year classes, right? Which is kind of on our profession. So I guess the short answer is I don't know, but I take your point that it's probably a much more complicated thing. Okay. Before we wrap up,
01:05:24
Speaker
I want to ask you one short line of questioning. One of the things that I think is fascinating about the topic of political bias, and political discrimination to the extent that it exists against conservatives,
Diversity and Historical Injustices
01:05:40
Speaker
is that there are actually quite a lot of parallels to talking about other types of diversity and other types of discrimination, except that you tend to have people switching sides, right? Now, one thing I'll note off the bat, which I'm sure you'll call me on if I don't, is that an important difference between those two contexts is the historical injustice piece, right? We didn't have
01:06:04
Speaker
Jim Crow with conservatives and liberals. We didn't have slavery with conservatives and liberals. So when you're talking about the justice justifications for things like affirmative action or watching out for discrimination, etc., clearly those cases are different. However, partly I think because of the jurisprudence on this, at least publicly, what we're more often talking about in both cases is the strength of the scientific enterprise.
01:06:31
Speaker
And so conservatives are saying, if conservatives are so severely underrepresented, then we're missing really important questions, we're missing really important perspectives, and we're not able to relate to conservative students, who are underrepresented in higher ed. And people say the same things, right, about women in male-dominated professions, or about minorities in white-dominated fields, which, unfortunately, is still most fields.
01:07:00
Speaker
So I'm not going to ask you to comment on that in general, but let's dig into your results, right? It seems like your results imply that we may not need to be as worried as we think about political skew in the demographics of professors, which, for the record, I think there's pretty good evidence is
01:07:26
Speaker
driven partly by discrimination and partly by choices, with Big Five trait openness correlating with both liberalism and wanting to be an academic. Suppose that somebody did a study exactly like yours and replaced liberal and conservative with white and non-white, or male and female, and found the exact same answer that you found.
01:07:50
Speaker
What, if anything, would be the implications of that for how we should think about diversity, and to what extent lack of ethnic or gender diversity is an issue? And maybe pick ethnic, because I believe women are actually overrepresented in your field. So,
01:08:09
Speaker
as somebody who's concerned about, say, ethnic diversity in social psychology, how would that finding affect your perception of that problem? And then how would you translate your answer to how conservatives should think about what your findings mean for the problem they care about?
01:08:30
Speaker
Yeah, I think the historical piece, like you said, is a very important one. And I think the question of what we want from a society, what our values are, comes in large part from that historical piece of who has been excluded from these spaces. Even though my field, for example, is heavily dominated by women, if you look at senior faculty, or at deans or presidents of universities, or at CEOs, whatever
01:08:59
Speaker
industry or discipline you pick, the higher up you go, the more homogeneous it gets. It's usually just men, usually just white men. I think the historical piece that you preempted is super important, because I think the values people have on these affirmative action-type policies, or
01:09:19
Speaker
on related ones in academia, for example, stem from that. And then, in terms of what that means, what would be best for the scientific enterprise in terms of creative ideas and inclusion of people? I think it's important to include lots of different ideas and have viewpoint diversity. I think sometimes people get caught up, right?
01:09:49
Speaker
If I were to say viewpoint diversity in one crowd, I could imagine some people being like, oh, that's conservative-coded for he wants an alt-right
Diverse Viewpoints in Academia
01:09:56
Speaker
person. And it's like, I don't actually think that's what
01:09:59
Speaker
we're talking about. So I think it is important to have people who have different theories and different ideas. And this goes back, I think, to our original question of where you draw the lines on which questions fall within the range of scientifically interesting questions. We should have people who think about those in different ways, and we should all try to do rigorous science so that we actually know what's going on and get good data.
01:10:29
Speaker
So I think it's a complicated question, because, again,
01:10:45
Speaker
if you were trying to call out this, not hypocrisy, but this sort of asymmetry, you would say, hey, look, we have these DEI-type positions that people are hiring for, sometimes more coded, sometimes less coded, and conservatives are underrepresented, so why don't we have specific conservative positions? Why don't we have to devote 30% or 50% of our department to being conservative, right?
01:11:13
Speaker
And I think this goes to the idea, which you actually acknowledged, that no, I don't think anyone expects everything to be totally equal, right? Like 50-50. Bias, I think, is not when things are anecdotally off at the margins here and there; it's when they're systematically skewed, right? So the idea is that if you looked at the C-suite or university presidents or tenured faculty in academia,
01:11:41
Speaker
it would be very, very lopsided. And there are actually some interesting mathematical complexities there. So, just a really quick sidebar.
01:11:49
Speaker
I think the best bellwether for understanding what the null hypothesis should be is sports, where it's really easy to measure merit, or other domains like chess. What you do tend to find, which, if you dig into the statistics, is totally what you would expect to find, is that the more rarefied the sample is, the more your null expectation should actually be a large skew in the demographics in one direction or another.
01:12:18
Speaker
So, for example, I think it's the case that 498 of the 500 all-time fastest 100-meter sprint times are from men with West African ancestry. I have no idea why that is, but I'm sure it's not that the clocks are biased against white people.
01:12:39
Speaker
I think Jewish people are something like half of the world chess champions through history, and 0.2% of the global population. So you do get skews in rarefied samples, and that's actually one reason why I personally think we pay more attention than we should to the C-suite and the presidents, and not enough attention to the entry level, right? Partly for that reason. But anyway, sorry, you were saying.
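To make that sidebar concrete, here is a minimal simulation sketch, not code from the episode; the group labels, the one-million-per-group sample size, and the 0.3-standard-deviation gap are all illustrative assumptions. It shows that under a null of no biased selection whatsoever, a modest average difference between two equally sized groups makes the far tail heavily lopsided.

```python
# Hypothetical illustration of the "rarefied sample" point: a small
# average difference between two equally sized groups produces a large
# skew in who occupies the extreme tail, with no biased selection at all.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000      # individuals per group (assumed)
SHIFT = 0.3        # group B's mean is 0.3 SD higher (assumed, modest gap)

group_a = rng.normal(0.0, 1.0, N)
group_b = rng.normal(SHIFT, 1.0, N)

# Pool everyone and look at the composition of the top 500 scores.
scores = np.concatenate([group_a, group_b])
labels = np.array(["A"] * N + ["B"] * N)
top500 = labels[np.argsort(scores)[-500:]]

share_b = (top500 == "B").mean()
print(f"Group B: 50% of the population, {share_b:.0%} of the top 500")
# A 0.3 SD shift typically yields roughly 70-80% of the top 500 from
# group B; larger shifts or unequal variances push this toward 100%.
```

The same tail mechanics are one way to read the point about the entry level: the broader and less selective the sample, the closer its composition sits to the underlying population, so entry-level demographics say more about bias than a tiny elite sample like the C-suite does.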
01:13:06
Speaker
Well, yeah, so I was saying that the historical piece, for me, is a very important consideration, because, like you said, conservatives have not faced the same Jim Crow, slavery-type exclusionary practices from society, and in this case academia, that other folks have. So
01:13:30
Speaker
I guess I try to hold both: I recognize that this is an apples-to-oranges comparison in a way, I recognize the veiled hypocrisy in the principles of it, and I also think it's important to have good, diverse viewpoints at universities. So
01:13:54
Speaker
I think people should try to be mindful. I think we need to develop systems to keep our biases in check, to think rigorously about the ideas people are studying, to be, like you said, supportive of other people who want those ideas to come through, and to ensure that we are pursuing a productive scientific enterprise from multiple different angles.
01:14:21
Speaker
So I try to hold all of those things at the same time, which is not easy, because they sometimes come into conflict with each other. Yeah. And so maybe two tiny follow-ups, and then we should wrap up. Sure. The first is, if I'm hearing you right, it sounds like you're saying the diversity case
01:14:41
Speaker
for drawing from a broad pool is strong in both cases. There's a justice case in one, the case of minorities, particularly African Americans, and not in the case of conservatives. And so, for that reason, the case for affirmative action is stronger for minorities than for conservatives. Am I characterizing that correctly? Yeah, that's right. Okay, so then my second question is:
01:15:09
Speaker
What, if anything, should we do about discrimination against conservatives?
Addressing Conservative Discrimination
01:15:17
Speaker
Which, again, I believe there's strong evidence exists, including from that social psychology paper you mentioned, by Joe Duarte.
01:15:27
Speaker
So, for example, at CU Boulder, to our credit, I think, we have a policy that includes political affiliation and political philosophy among the protected categories in our discrimination policy.
01:15:40
Speaker
There isn't, in practice, affirmative action. I mean, there's this one sabbatical position, which in a sense is conservative affirmative action. But if you put in a proposal for a conservative as your DEI hire, you'd be dead on arrival. So what do you think the policy should be, and how should it be policed? Because some people, I think, would say...
01:16:07
Speaker
Let me give you two examples to illustrate what I mean by that. Take two otherwise identical academics, where one just has on his CV that he worked for John McCain, and the other has on his CV that he worked for Obama. I think most people would agree that you shouldn't discriminate against the person who worked for McCain. If they're otherwise identical, and if there were some way to measure that, then discriminating would be wrong, and it would certainly be punishable, at least in theory.
01:16:38
Speaker
I think where the pushback might come from some liberals is: when does a we-don't-discriminate-against-conservatives policy become a we-have-to-let-climate-deniers-onto-our-geoscience-faculty policy? Right. Personally, I think that's a little bit of a motte-and-bailey, but what would be your answer to that? And where would you draw the line?
01:16:59
Speaker
Yeah, I think part of this is that there are so many domains in science and academia where these, I don't want to call them gatekeeping, but these junctures occur, whether you're at a conference sending an abstract to give a talk or a poster, sending a paper to get published in a journal, or applying for a postdoc or a faculty position. There are so many of these junctures where there's a review process happening
01:17:24
Speaker
and a selection process. And the question is, is that selection process biased? How do we keep it unbiased? And I think there is clearly no silver bullet. If I had the answer, it would have already been solved, because someone else smart would have thought of it. But
01:17:41
Speaker
I think it's important to have that listed in the policy documents. I think it's important to ensure that you have diverse reviewers, thinking about the camps that people fall into. I think about this even along theoretical lines. Totally. When you're picking reviewers,
01:18:00
Speaker
should you be picking reviewers who all fall in the same theoretical camp, or should you try to get a diverse set? It's a different dimension, but the same concept: let's get different theoretical takes on this, never mind politics. But
01:18:14
Speaker
so I think that's important whether you're reviewing conference abstracts or journal papers or hiring decisions. And in terms of policing that, I think a lot of it is probably just norm shifting. I don't think it's easy to police. When I think about departments, I don't know how it works in econ or in ecology, but you have this search committee of four faculty members, one person leads the search, and then they're trying to figure it out with the rest of the department. But there are internal politics in the department.
01:18:42
Speaker
The chair and the dean, sure, yeah, right. And so how do you police that? What's the process for that review? I think we have to fall back on the things we know, like trying to blind ourselves to some of the information that might bias us. But at the same time, you have to know that, as we do that, that information isn't always
01:19:06
Speaker
something that is invisible, right? If we're talking about theoretical ideas, you're going to read someone's work and know roughly which camp they're from. Yeah. So what I was going to say around the norm-setting piece was just that people fear the extremes, right? And even though you were sort of
01:19:25
Speaker
joking that, of course, this is not really what we mean, I think that's actually important, because a lot of people fear exactly that: they fear that what you mean is that you're going to let in people who think the earth is flat, and that we should include conspiracy theorists because it's a diversity of ideas. And it's like, well, no, that's a terrible idea. Yeah. So
01:19:47
Speaker
then I think it comes down to having these kinds of conversations and being transparent about what we're actually talking about. If you're talking about studying different market structures, or whether our climate estimates have been overstated in terms of how quickly we'll reach a certain global
01:20:08
Speaker
rise in temperature, and then those effects, then yeah, let's talk about how good or not good those estimates are in your field, and why. So I think
01:20:22
Speaker
having these conversations, norm-setting within your own field about the range of things we're talking about, and then drawing those lines. And then you're asking me where the line gets drawn. It depends on the field. But I can even think of a colleague of mine who I know is conservative and whose research is fairly conservative, not coded, but
01:20:45
Speaker
would be rated that way, I suppose. And that's fine. To me, it's within the reasonable realm of something they're interested in studying. As they say, research is research. And so, fine.
01:20:59
Speaker
Right. I don't want a flat-earther in the environmental studies department, or anywhere in academia. And I think we'd maybe have to go through and list out the topics that are pretty much excluded from scientific discussion, either because
01:21:22
Speaker
it's implausible or because it's not something that we value societally, and think about where to draw that line. It's tricky to think about all the various disciplines and where to draw that line. Would it be fair to say, as maybe a final thought, that as long as we have norms of curiosity and rigor,
01:21:45
Speaker
and therefore our first thought, when we see something that contrasts with our prior, is, oh, that's interesting, but then we chase everything down rigorously, my guess would be,
01:22:02
Speaker
and I think this is maybe what you're getting at too, that that actually solves, not the whole problem, but a larger fraction of the problem than people think. And I think your study is one data point in support of that. And also, as you alluded to, my experience, which I have to say has been overwhelmingly positive, of
01:22:22
Speaker
publishing cold-water-type research about climate scenarios is also, I think, a data point for that. There are a couple of people who didn't like it, but the vast majority of people have responded with curiosity and rigor and have welcomed it. Would you agree with that? Yeah, I would definitely share that sentiment. I think curiosity and rigor are very important tenets to hold going forward.
01:22:48
Speaker
Great. Well, that's a good, positive note to end on. Diego Reynaro, thanks so much for this fascinating and far-reaching conversation, and we'll see you soon. Thanks so much for having me. It's great to be here. The Free Mind podcast is produced by the Benson Center for the Study of Western Civilization at the University of Colorado Boulder. You can email us feedback at freemind@colorado.edu or visit us online at colorado.edu/center/benson.
01:23:18
Speaker
You can also find us on social media. Our Twitter, LinkedIn, and YouTube accounts are all at Benson Center, our Instagram is at TheBensonCenter, and our Facebook is at Bruce D. Benson Center.