Episode 433 - Neil Levy's "Bad Beliefs" (Part 1)


S1 E433 · The Podcaster's Guide to the Conspiracy
515 Plays · 7 months ago

Josh and M comment on the first two chapters (of six) of Neil Levy's book "Bad Beliefs." Does Neil have bad beliefs about bad beliefs, and do Josh and M have bad beliefs about Neil's supposed bad beliefs which are to be found in the book "Bad Beliefs"? Listen and find out!

Transcript

Introducing 'Bad Beliefs' by Neil Levy

00:00:00
Speaker
This week will be our first episode talking about Neil Levy's bad beliefs. And just to avoid any Who's on First-style chicanery, I need to make it clear that Bad Beliefs is the title of a book written by Neil Levy. Precisely, although... What are we talking about? Quite, but... As opposed to any bad beliefs that Neil Levy himself may hold, things that could reasonably be called Neil Levy's bad beliefs. These are not, again not, what we're going to be talking about.
00:00:28
Speaker
Thank you for clearing that up. Unfortunately, since we don't actually agree with some of the things that Neil says in his book, it turns out we will be discussing some of Neil Levy's bad beliefs while discussing Neil Levy's Bad Beliefs. So, Neil Levy's Bad Beliefs, while not being specifically about Neil Levy's bad beliefs, does contain Neil Levy's bad beliefs. This is exactly the kind of thing I was trying to avoid! How's anyone expected to understand this?
00:00:53
Speaker
Look, it's perfectly simple. When we're talking about the book, Bad Beliefs, Bad Beliefs is capitalized. And when we're talking about the beliefs, bad beliefs is all in lowercase. Ah, I see you've forgotten that we're a podcast, again. Ah, that would explain the microphones. Okay, how about this? When we mean the book, we'll precede it with this. And when we mean the beliefs, we'll precede it with this.
00:01:30
Speaker
So... Bad beliefs versus... Bad beliefs. Yes, to be sure, we should probably use it to indicate the capitalisation of every word we say. Every... Word?
00:01:54
Speaker
Yes. Joshua.

Host Introductions: Josh Addison and Dr. M Dentith

00:02:03
Speaker
No. Roll. The...
00:02:21
Speaker
The Podcaster's Guide to the Conspiracy, featuring Josh Addison and M Dentith. Hello and welcome to the Podcaster's Guide to the Conspiracy. In Auckland, New Zealand, I am Josh Addison, and in Zhuhai, China, what if you discovered the most dangerous man in the universe was you, Dr. M R. X. Dentith? Then I would have to go back in time and kill myself.
00:02:49
Speaker
purely because my son is watching Jet Li in The One as we speak, as all good children should.

Pop Culture Dive: 'The One' Movie Discussion

00:02:56
Speaker
Oh, The One. I rewatched that. Very, very 2001 film. Very 2001. I remembered the Drowning Pool, but I'd forgotten about the Disturbed in the soundtrack. My son actually said, why do they keep going?
00:03:12
Speaker
Isn't there only 27 alternate realities in that film? There's this very limited sense of the multiverse. You won't believe how your choices can affect reality. There are 124 different realities out there. Wow, that actually indicates that choice doesn't really mean much at all.
00:03:35
Speaker
No, no, which is an interesting philosophical position, which is relevant to what we'll be talking about shortly, but first, because I have to remember this every time, what have you been up to academically speaking?

Challenges in Education: Grading in the AI Era

00:03:46
Speaker
Actually, not much for the last fortnight, because I've been marking student assignments. And it turns out that marking student assignments is both a lengthy process,
00:03:58
Speaker
and a soul destroying process. Not necessarily because the assignments are bad, but because in the age of the LLM, you read every single assignment with the suspicion that it's been written by a computer. So you are just second-guessing yourself whenever you give a grade as to, I mean, is it vague and badly written because it was written by an AI that doesn't write good philosophical content, or is it
00:04:27
Speaker
vague and badly written because it's written by a first year student whose first language is not English. Very difficult. So that is what I've spent the last week and a bit doing. And it's been soul destroying, absolutely soul destroying. And the fact that I've still got more assessment stuff to do later on this semester fills me with dread, which is the only emotional state I feel these days.
00:04:53
Speaker
Well, good. Good. Then you're in the right frame of mind. It's true. Because we're getting into another book. Another book we haven't... Ah, yes. Well, no, Josh, I've been meaning to bring this up, but you actually died at the beginning of this year. And this is Purgatory. I'm actually not Dr. M R. X. Dentith. I'm actually the devil. You are the devil. I'm punishing you
00:05:14
Speaker
by making you read books as we process what we're going to do with your soul eventually. Because, Josh, to be frank, you've lived a very neutral life, not particularly good but not particularly bad, and so we really don't know where to send you. So basically we're forcing you to read books to try to work out what kind of punishment you should go through, because unless you've lived an exceptional life, you don't get to go to heaven.
00:05:40
Speaker
But this might be it. This might be the rest of eternity. Endless book reviews. Oh, I guess I'd better get used to it then. Shall we play a tune? Don't start enjoying it though. If you start enjoying it, it's going to really muck up our system. Yeah, I want to make a Good Place reference, but frankly, you are no Ted Danson. But who is? I'm more like a Ted Bundy. Anyway, play a tune. Welcome to Conspiracy Theory.

Delving into 'Bad Beliefs': Themes and Author Background

00:06:16
Speaker
So, the book that we are talking about this week, next week, and probably the week after, well, sorry, this episode, next episode, and probably the episode after that, since we're on a two-weekly schedule, is Bad Beliefs: Why They Happen to Good People by Neil Levy. This is a 2022 book published by Oxford University Press.
00:06:38
Speaker
Now, when we were looking at the Michael Shermer book, my first question was, who is Michael Shermer and why do we care what he thinks? I'd say the same thing about Neil Levy, but of course, Neil Levy is no, well, Neil Levy possibly is a stranger to this podcast, but this podcast is no stranger to Neil Levy.
00:06:54
Speaker
We have looked at his papers. We've looked at papers that were reacting to his papers, as I recall. In particular, in 2020 we looked at his paper from 2007 called Radically Socialised Knowledge and Conspiracy Theories. So he's someone in this field, but what more do we need to know about him? So he's actually quite a famous philosopher. He's based at Macquarie in Sydney, although he also works for the Uehiro Centre in Oxford as well. He's Australian.
00:07:25
Speaker
both by place of work and also, I believe, by place of birth. That being said, now I'm going, is he Australian or does he just sound like an Australian? I'm going to be agnostic. Neil Levy is an Australian philosopher who may or may not be Australian. I actually don't know. He's the Russell Crowe of philosophy.
00:07:44
Speaker
Oh, actually, that would be interesting if he turned out to be a New Zealander who simply lives in Australia. Anyway, Australians are weird, Australian philosophers even more so. He's an advocate of a position that I happen to think ends up being incoherent, but we're going to see a lot about this position in this particular book.
00:08:01
Speaker
which is that he thinks we're right to be dismissive of conspiracy theories, because conspiracy theories go against the opinions of properly constituted experts, even though people like myself, Charles Pigden, Steve Clarke, Brian L. Keeley, Lee Basham, and the like have all kind of criticized Neil's position. Neil's position is in fact quite popular, so Keith Harris, for example, has resurrected a version of Neil Levy's argument in recent papers,

Questioning Conspiracy Theories: Levy's Views

00:08:31
Speaker
And friend of the show, and friend of mine, Joe Uscinski loves that distinction, to say, look, the reason why we should be suspicious of conspiracy theories, even though it doesn't tell us that they're necessarily false, is that experts tend to disagree with the things we call conspiracy theories.
00:08:47
Speaker
I just don't think you can cash out the idea of properly constituted experts in a way which ends up being coherent. But this is a point he's going to try to argue for in this book. So when we get to that particular argument, we can deal with whether this version works or not.
00:09:05
Speaker
What is interesting is that this book was written during the pandemic and in the acknowledgement of the book he admits that it could have been a very different book indeed. He says 18 months ago I started writing a very different book. I'm assuming he was going to write some kind of
00:09:22
Speaker
pornographic masterpiece but ends up deciding to write philosophy instead. That book, which drew heavily on previously published work of mine, explained bad belief formation as the product of the ways in which we fall short of rationality. During the months of lockdown, first in the UK and then in Australia, I came to a very different view. According to the account I now defend, bad beliefs are produced by rational agents responding appropriately to evidence. So he started out
00:09:52
Speaker
writing a book on bad beliefs being due to people being irrational. During the process of working on the book, he came to the opinion that actually many bad beliefs, not all, but many bad beliefs are actually obtained rationally and people are responding to evidence. It's just that people live in an environment where there is a paucity of good evidence that they have access to.
00:10:20
Speaker
So, the book is in six chapters. We're going to look at the first two today, although there's also a preface and also some concluding thoughts afterwards. Yeah, and the preface is quite long, but I've drawn some sections out of it that basically act as an abstract for the book. Now, Josh, we always have a competition as to who's going to read the abstract. Do you want to read the abstract, or shall I? No, please, be my guest.
00:10:45
Speaker
OK, so this is drawn from the preface, which is about 12 pages in length. So I've just drawn some snippets, which I think give an idea of how the book is going to go. So this is a book about beliefs, good and bad, about how they're generated and how they might best be improved. My concern isn't with the analysis of knowledge. Rather, it is with how knowledge is acquired and what factors lead to good and bad beliefs. Correlatively,
00:11:14
Speaker
My exemplars of belief will not be the uncontroversial cases that feature in a great deal of contemporary discussion. Cases in which, say, one agent believes that another owns a car of a particular make
00:11:25
Speaker
or that there's a barn thereabouts. Instead, my exemplars will be cases that are controversial but shouldn't be: beliefs about anthropogenic climate change, evolution, and the safety and efficacy of vaccines. These examples are chosen because there's an expert consensus on these issues, but many people reject the expert view. Are they rational in doing so? What explains their dissent? Should we attempt to change their minds?

Understanding Bad Beliefs: Definitions and Critiques

00:11:51
Speaker
And if so, how should we do it?
00:11:54
Speaker
I'll defend some controversial answers to these questions about controversial beliefs. A bad belief in my sense is a belief that, A, conflicts with beliefs held by the relevant epistemic authorities and, B, is held despite the widespread public availability either of the evidence that supports more accurate beliefs
00:12:14
Speaker
or of the knowledge that the relevant authorities believe as they do. The relevant epistemic authorities are those people and institutions that are widely recognized as being in the best position to answer questions in the domain. Scientists are the relevant epistemic authorities when it comes to evolution, historians the relevant epistemic authorities on the Holocaust, and so on.
00:12:36
Speaker
I'll suggest that bad beliefs tend to arise, very significantly at any rate, through the rational reasoning processes of those who end up believing badly. Just as I might find myself lost because someone tampered with the inputs into my navigation, so people end up believing badly because their epistemic environment has been manipulated. I'll argue that these bad beliefs are not at odds with the higher order evidence. Higher order evidence is evidence that concerns not the issues,
00:13:06
Speaker
about which we're trying to make up our minds, but the reliability of the first order evidence and how other people are responding to that evidence.
00:13:16
Speaker
Higher order evidence is genuine evidence, and we rely on it all the time. But philosophers and psychologists overlook its pervasiveness and its significance. Once we come to see the ubiquity of higher order evidence and the extent to which cognition is reliant on it, we'll be forced to rethink the extent of irrationality in human reasoning.
00:13:42
Speaker
Now I suddenly think maybe I should have actually read that, because we get into chapter one, which is the one that you looked at, so you're probably going to be doing most of the talking now.
00:13:52
Speaker
It's true. So, you know, you've made bad decisions. Poor choices. And those bad decisions are the results of the beliefs you have. As we're going to find out in this chapter, chapter one, what should we believe about belief? And you just know you're dealing with a philosophical text. Let's talk about believing about beliefs. Let's talk about the beliefiness of belief.
00:14:14
Speaker
Very philosophical. So in chapter one, Neil gives us an account of belief and belief in us. Now, I should point out that because Neil is at an Australian university, and the kind of system we have both in Australia and Aotearoa is that we tend to refer to people by their first name, I'll be talking about Neil's work as opposed to Levy's work. I realise that in, say, the States,
00:14:38
Speaker
that would seem rude, to talk about people by their first name, but in this kind of situation we tend to refer to each other on a first-name basis. So this is a chapter in which Neil is going to give us his account of belief and his account of belief in us.
00:14:57
Speaker
but it's an argument by a thousand cuts. So he's going to establish his position by critiquing positions found elsewhere, and by showing that those positions don't account for the kind of examples he's interested in, and thus by carving away at rival accounts,
00:15:17
Speaker
he's going to use that as the establishment of his own particular view. So you get to the end of this chapter not really having a firm grip on exactly what Neil wants to argue for, but we do have reasons to think that competing accounts don't account for the kind of thing that Neil is interested in.
00:15:38
Speaker
Now a lot of this chapter borrows from cognitive science, and I think this ends up being both a blessing and a curse for Neil's account.
00:15:49
Speaker
Neil's going to point out there are lots of promising theories in cognitive science about how beliefs work and what beliefs entail and what beliefs about beliefs entail. And he's also going to caution that we need to treat a lot of this evidence with skepticism given the systemic issues around how the data that supports these hypotheses is generated.
00:16:13
Speaker
Now this is going to be a problem for his account, because effectively he dismisses evidence for views when it doesn't suit his own theory, but at the same time doesn't seem to apply the same kind of caution he advises for those other theories when there's evidence that seemingly supports his theory. I think this comes to the fore particularly in the chapter you'll be looking at, Josh, where he gives his
00:16:39
Speaker
cod evolutionary story about humans being social creatures with cultural knowledge. But we'll get to your chapter in due time. And the other problem, and I think this is a general problem for the book,
00:16:54
Speaker
is the kind of problem that, say, Quassim Cassam has when his account of conspiracy theories is that they're contrarian. The story Neil tells about bad beliefs seems to be about beliefs that he, and one assumes his audience, think are bad. So Neil says, look, lots of these people have bad beliefs, they're not connected to reality,
00:17:18
Speaker
we're not given much of an argument to say that they are bad beliefs, other than the fact they go against the scientific consensus of the day. Which is to say that according to Neil they're bad beliefs, but that doesn't necessarily tell us that they are bad beliefs in general; it just tells us that they are Neil's bad beliefs. But not Neil's Bad Beliefs, just the bad beliefs according to Neil. Right, it's perfectly clear.
00:17:44
Speaker
No it's not, and it's probably not going to get clearer as we go through this. But he does give us an account of what he thinks beliefs are doing. So to quote, the characteristic functional role of beliefs provides us with a heuristic for belief attribution. Beliefs are those states that make best sense of agents' behavior given plausible assumptions about their desires.
00:18:05
Speaker
This fact also brings into relief why it is so important to understand the factors that explain belief acquisition and update. Because beliefs play a pivotal role in behaviour.
00:18:18
Speaker
Accurate beliefs are essential for appropriate behaviour. Now he goes on to say beliefs and especially bad beliefs matter, not only for our banal behaviours, they're also at the heart of our major political challenges. Those who seek to manipulate our behaviour have been quick to grasp this fact,
00:18:37
Speaker
They aim to manipulate us by targeting our beliefs, rather than getting us to act directly. And this is where he brings in Oreskes and Conway's book, Merchants of Doubt, that looked at the way the fossil fuel industry sought to muddy the waters by making it look as if the science wasn't settled with respect to global heating, aka anthropogenic climate change,
00:19:02
Speaker
even though by the 1960s the science was very definitely settled: humans were changing the environment due to the industrial scale of pollution that we are producing on a day by day basis. And so as he points out, the merchants of doubt don't aim to convince the public that climate change isn't genuine or isn't a serious challenge.
00:19:27
Speaker
Rather, they aim to convince us that the science isn't settled, that there is an ongoing debate about these claims, and that there are reasonable views on both sides.
00:19:39
Speaker
much like the tobacco industry, which I think set the template for that sort of thing. Well, I mean, Merchants of Doubt basically is a story about how the tobacco industry and the petroleum industry have engaged in exactly the same tactics.
00:20:00
Speaker
And it turns out some of the people involved in denying the link between smoking and lung cancer or just smoking and general respiratory disease are some of the same names we find doing work on denying the connection between industrial pollution and the climate changing.
00:20:18
Speaker
That doesn't really surprise me at all. No, no, you might actually think it's a conspiracy. And so when he's talking about the merchants of doubt, this is where we get our first real claim about conspiracy theories, which explains why we're looking at this book with respect to a podcast on the philosophy of conspiracy theories. So
00:20:37
Speaker
To quote Neil once again, in their book-length study of the right-wing media, Benkler, Faris and Roberts argue that the promotion of bizarre conspiracy theories by Alex Jones and the like should be understood in the same kind of way, aka, merchants of doubt trying to create the idea there's reasonable debate on both sides.
00:20:55
Speaker
It's not because they seriously entertain the thought that the Sandy Hook shooting was a false flag operation that the right-wing media ecosystem has devoted attention to the claim. Rather, the aim (or the explanation: people need not always explicitly aim at an end to effectively pursue it) is disorientation.
00:21:15
Speaker
They aim simply to create a profound disorientation and disconnect from any sense that there is anyone who actually knows the truth. Left with nothing but this anomic disorientation, audiences can no longer tell truth from fiction even if they want to. I mean, there's something to that, but it's very sweeping.
00:21:37
Speaker
Yeah, so I think this is, and this is a terrible thing to say, I think this is uncharitable to Alex Jones. Because the kind of story he's telling here is that look, you can't really tell me Alex Jones believes the kind of things he says. There must be another motivation for the kind of weird or bizarre beliefs that Alex Jones proposes.
00:22:00
Speaker
And I think that's more an indication that people like Neil go, there's no way these people can be sincerely telling what they really believe. They have to have some other motivation. So on one level, I think this is an unfair characterization because it's assuming that if someone is promoting bad beliefs, then they are in some sense knowingly promoting beliefs that they know are bad. Knowingly promoting bad beliefs that they know are bad is a tautology; I could have just said
00:22:29
Speaker
they are knowingly promoting bad beliefs. But the other thing is that I think this characterisation actually goes against the kind of argument that Neil's going to present, because it seems to be at odds with the thesis of the book. Maybe people like Alex Jones and David Icke have higher order evidence
00:22:51
Speaker
that makes their bad beliefs rational to them. Why are we assuming that just because the kind of beliefs they put forward sow doubt, they aren't in some sense sincerely held? And I know on this podcast we have talked about Alex Jones, and we've talked about the fact that Alex Jones is motivated to promote particular views because there's a financial grift from selling vitamin supplements, penis-enhancing pills and the like.
00:23:19
Speaker
And so sure, in some cases you might go, look, Alex Jones is playing to the gallery here to sell his product, but I don't think we can just make the assumption that these people are part of a conspiracy to sow discord. They may well sincerely believe these things and also be sowing discord and also selling things on the side. Yeah, yeah, I think there's more nuance to it than the way he makes things appear.
00:23:48
Speaker
And I think this is the other problem I have with this book in general. And this is a problem I have with a particular style of philosophy. This is a grand theory of belief. Neil's trying to come up with why do people have bad beliefs? The grand theory to explain why there are bad beliefs in the world and why it turns out actually they might be rational. And I'm always just a little skeptical of grand theories because they do have to make
00:24:16
Speaker
quite a lot of generalizations to get there. And maybe I'm just suspicious about generalizations and the way that they end up mischaracterizing certain positions, but it is a suspicion I have that if you have to generalize like this to get to your end product, maybe your end product isn't as great as you think it is. But that's my own suspicion about grand theories.
00:24:46
Speaker
Ralph, continue. OK, so we then get a new section: Should we believe in belief? And this is where Neil starts looking at reports of startling ignorance or of bizarre beliefs that are staple fare for the media. And so he points out that when you start polling people,
00:25:06
Speaker
you find that there is, according to him, a large proportion of our fellow citizens who are fundamentally disconnected from reality. And to illustrate this, he gives a version of the Pizzagate story, which he claims illustrates that not everyone who promotes bizarre conspiracy theories genuinely believes them.
00:25:28
Speaker
And I found this accounting of Pizzagate to be a little weird. He gives a version of Pizzagate to kind of illustrate this idea that our fellow citizens are fundamentally disconnected from reality, and the version of Pizzagate he gives, he then claims, illustrates that not everyone who promotes bizarre conspiracy theories genuinely believes them. And I think the setup here is wrong. So Neil points out that Pizzagate started as a joke. So the Podesta emails get leaked, people
00:25:57
Speaker
go, it's really odd, John Podesta sends a lot of emails about pizza, and the other people in the committee respond with their love of pizza as well.

Origins of Pizzagate: A Humorous Start

00:26:08
Speaker
Wouldn't it be funny if it turned out the pizza thing is a code of some kind? And then some people started to treat that claim seriously, and the Pizzagate conspiracy theory emerged from it.
00:26:20
Speaker
Now this setup, this genealogy of Pizzagate, doesn't tell us that not everyone who promotes bizarre conspiracy theories genuinely believes them, because the people who were making the claim, wouldn't it be funny if there was a code in the John Podesta emails, are not actually generating a conspiracy theory in the sense that Neil is talking about it,
00:26:44
Speaker
they are making a joke, they're engaging in satire or comedy. It's the fact that some people interpreted that satire or comedy as a conspiracy theory which would be the issue here. So his example doesn't tell us that not everyone who promotes bizarre conspiracy theories genuinely believes them because his example isn't about people promoting a conspiracy theory in the first instance, it's about people making a joke about the Podesta emails
00:27:13
Speaker
and that joke being treated seriously by the people who then generated the conspiracy theory that they then acted upon. Yeah, and then, I'm pretty sure this is going to come up in just a minute, there was the other argument that Shermer also made, which is that not many people must have truly believed in it because hardly anybody acted on it, which is another thing altogether, so yeah.
00:27:44
Speaker
That's not to say that he's wrong, but I don't think he's proved what he's saying.
00:27:50
Speaker
Yeah, that's the thing. The example isn't a good illustration of the point he wants to make. Because, I mean, I agree, not everyone who promotes bizarre conspiracy theories genuinely believes them. As I said before, we have questions about some of the conspiracy theories that Alex Jones has promoted as to whether he seriously does believe them. But this particular example doesn't get us to that conclusion.
00:28:15
Speaker
because the genealogy he tells isn't about the promotion of a bizarre conspiracy theory, it's a joke being misinterpreted. The other thing is, he doesn't give us a base rate for bad beliefs generally, and this seems a little troubling. Because he's going to be all about evidence throughout this chapter, he's going to look at evidence for how beliefs work, how beliefs don't work, how beliefs lead to particular action.
00:28:43
Speaker
But what he doesn't point out is that in particular, in the case of belief in conspiracy theories, with conspiracy theories defined as bad beliefs, when you look at the polling in psychology and political science, those bad beliefs are trending down.
00:29:01
Speaker
So he engages in a bit of a panic about conspiracy theories, indicating it's a widespread problem, when actually, if he was going to rely on evidence here about the number of bad beliefs people have, he should be going, well look, actually the good thing is,
00:29:17
Speaker
it turns out belief in conspiracy theories, as defined as bad beliefs, has been trending down since the 1960s, and we know he's read his Joe Uscinski because Uscinski and Parent are quoted in this chapter.
00:29:34
Speaker
So he's quoting American conspiracy theories and that book makes the claim that Joe makes now with respect to the Pew Research stuff, belief is going down, it's not going up. And indeed the other thing, this really comes to the fore when he starts talking about fake news in this section.
00:29:50
Speaker
He says, fake news stories congenial to the right were shared about 30 million times during the three months before the 2016 election, and those congenial to the left were shared about eight million times. And you end up going, yeah, but what's the base rate here? What's the sharing of news stories in general here? Because 30 million sounds like a really big number.
00:30:11
Speaker
But is it big compared to non-fake news stories being shared? What's the base rate, Neil? And the other one is, one study of online behaviour during the 2016 election found that Republicans over 65 were seven times more likely to share fake news on Facebook than people of any political leaning aged 18 to 29. Guess what that seven times is
00:30:39
Speaker
compared to. I mean, if it turns out not many people are sharing fake news at all, then seven times not many is still not going to be particularly big. There's a lack of base rate here, so he makes things look and seem really bad, but you actually don't know what the base rate is. What is the actual level here? Is it an existing problem and this shows it's getting worse? Or is it actually not much of a problem at all?
00:31:09
Speaker
Yeah, I guess if it's all about the evidence, then yeah.
00:31:18
Speaker
Missing crucial bits out, I think, is... yeah. Which is why I think there's a bit of a panic going on here. In this case, so, look: bad beliefs are an issue that needs to be solved, we need to show the scale of the problem, and sometimes when you do that you end up taking a few shortcuts to make things look bad without actually explaining how bad they might be. This is an example I use in my courses: if you actually look at the
00:31:44
Speaker
base rate for getting lung cancer if you don't smoke. I mean it's fairly small. Smoking increases that risk significantly, statistically, but it still means you can be a lifelong chain smoker and it's still very unlikely you're ever going to get lung cancer.
00:32:03
Speaker
And so we have this idea in the medical literature that we don't tell people the base rate of smoking to lung cancer, because if you tell people the base rate, people may not be likely to give up smoking as a habit, when they go, so you're telling me there's a one in one hundred thousand chance, or one in ten thousand? So there's a one in one hundred thousand chance, if I'm a non-smoker, of getting lung cancer,
00:32:27
Speaker
and a one in ten thousand chance as a smoker that I'll get lung cancer? I mean, those are pretty low odds. I might still have a few more cigarettes and I can think about giving up next week. Right, so where to from here?
00:32:42
Speaker
Okay, so the next part of the chapter looks at accounts, other accounts, as to why people profess bizarre beliefs, especially if those states don't govern much of their behaviour. So this is the point you made before. You're going to have this discussion, both with respect to Levy, and as we saw with Shermer, there is this puzzle. If people have
00:33:05
Speaker
a bad belief, say an anti-Semitic belief, why is it that most people never act upon it? So the example he has in chapter one is that you get these periods of very, very high anti-Semitic belief. So he uses an example from the US where there was a rumor going around that most people in the town believed
00:33:27
Speaker
that the local Jewish bakery was engaging in child slave trading, and yet the only action people seemed to engage in was to walk past the bakery slowly and glare at it. And as he points out, if you actually think that there's child slave trading going on, surely you would assume that someone would try to break into the bakery to stop it. But even though people professed to believe this,
00:33:56
Speaker
all they did was give a bit of a glare to the bakery when they walked past. So there is this particular puzzle. So he looks at some accounts that try to explain why people may not act upon their beliefs, and he's going to slowly cut away at these accounts in the hope that that will help him to establish his own particular account.
00:34:20
Speaker
So there's an assumption he has here that if you believe X, you should act according to X. So if you believe X and don't act according to X, you don't act upon your belief, then that needs to be explained. And so he covers the work of Hugo Mercier. And Mercier argues that professions of belief may sometimes function as a kind of commitment device. So you signal that you have a particular belief.
00:34:49
Speaker
or that people may be willing to accept certain claims because these beliefs have very little effect on their behaviour. So you claim to believe something because you know you don't have to actually change your behaviour based upon that belief. Or that people accept and repeat rumours to justify how they wanted to act in any case.

Beliefs as Justifications: A Deeper Look

00:35:10
Speaker
So these are post facto justifications: I was going to do this thing anyway, and now what I've done is come up with a reason as to why I acted in that way, to give me a rationalization for something I was going to do. And so he uses the example of child slave trading at this particular point. But Neil thinks that this particular type of argument
00:35:32
Speaker
doesn't necessarily explain the kind of cases he's interested in, because as he says, "but the justifications don't seem to have been inert. Without them, the acts may not have taken place, or may have been less widespread or less serious." For some people, and this is the however, they functioned as an excuse, but for those on the fringes, they may have functioned as a reason.
00:35:56
Speaker
So he goes, Mercier has an account that tells us why people may not act on their beliefs, and it covers a lot of cases, but there are some cases that it doesn't cover, and that's what Neil's going to try to resolve in the rest of the book. Right.
00:36:18
Speaker
We move on to... I actually don't know how to pronounce this name, so I'm going to... Nor do I! Yeah, I think it's Van Luyen? Van Luyen? Yeah. Sounds Dutch, I'm never quite sure.
00:36:29
Speaker
So Van Leeuwen has what is called a two-map cognitive structure. This is actually an account I know slightly better than the Mercier stuff. And this is the idea that sometimes people view the world through basically two frameworks. They have one map, which they take to be factual, and they have another framework or map of belief, which they take to be ideological.
00:36:56
Speaker
And so to quote Levy, Van Leeuwen suggested that those who reject the science of climate change might hold an essentially religious attitude towards certain factual propositions. He's right that there are good reasons to think that climate change skeptics often don't have very determinate beliefs. They seem to oscillate between believing that climate change isn't happening, that it's happening but we're not causing it, and it's happening and we're causing it, but it'd be too expensive to fix, depending on which is handiest.
00:37:25
Speaker
The fact that they move between incompatible propositions
00:37:29
Speaker
suggests that they don't have a very determinate or stable belief. And as he notes, they're not unusual in that, citing himself. Nonetheless, their behaviour is best explained by something beliefy. Climate change skeptics aren't merely ignorant. They don't just fail to know the climate science is true. Most qualify as skeptics, at least in part, in virtue of holding a distinctive belief. And it's a belief with fairly precise content,
00:37:55
Speaker
even if it shades into imprecision when they attempt to flesh it out. The settled content is the content that's common to all the inconsistent propositions they oscillate between. They believe something along the lines of "climate change isn't a problem we need to address." And unless we attribute to the climate denialists a belief with content along those lines, we can't begin to explain their behaviour. Part of me thinks the issue here is, once again,
00:38:24
Speaker
Neil finds some beliefs to be bad and needs to explain why he thinks those beliefs are bad. Also, I'm quite happy to say, look, there are lots of people who have very, very inconsistent beliefs. Indeed, actually, the cog-sci stuff is very good at explaining that beliefs
00:38:43
Speaker
can be compartmentalised such that people have very incoherent belief structures they don't think about much at all. So I'm not entirely sure this is as big of a problem as Neil makes it out to be.
00:38:58
Speaker
Okay, what's next then? Well, actually, that now gets us to the other thing he wants to look at, which is expressive responding. So Neil is rightly concerned that when we start polling people on what they appear to believe,
00:39:17
Speaker
We can't necessarily accept the surveys or polls at face value because when people are being surveyed, they are responding to the survey instrument rather than necessarily reflecting what they actually believe. And there's quite a large literature on this particular issue, which some psychologists are very attentive to.
00:39:44
Speaker
And others think, well, there are enough surveys that we can actually aggregate or wash the issue out. But the issue is, sometimes when people respond to a survey or a poll, they are responding in a way where they're not indicating what they believe,
00:40:00
Speaker
but they are sometimes indicating what society wants them to believe, what they think the investigator wants them to put forward, or in many cases going, well I'm actually, I don't actually want to admit I don't know, so I'm just going to pretend I've got a solid belief in a particular position, because the shame of admitting that I'm actually not sure is so great that even on an anonymous poll,
00:40:24
Speaker
I can't click "don't know" or "neither agree nor disagree"; I've got to show some firm commitment to a belief. So he points out that, taken together and despite some failures to narrow the partisan gap via the provision of incentives, the evidence suggests that a substantial number of survey respondents knowingly and deliberately misrepresent their true beliefs for expressive purposes.
00:40:52
Speaker
Is this similar to, maybe it's not the same thing, but analogous to, the whole cultural Christianity thing in Britain of a few years ago, where people were saying, I'm going to say my religion is Christian even though I'm not a religious person at all, because I think England is a Christian country, so I need to say Christian to sort of say I'm an Englishman? Yeah, and actually, that was something... It's signifying something rather than... Even Richard...
00:41:21
Speaker
Dawkins has said, you know, I don't identify as Christian, but I do think there's something about the Anglican values which is important for being a British person. That's my impression of Richard. Flawless. I know. Like he was in the room. I know. I mean, imagine we could get Richard Dawkins and David Suchet on the same podcast. We could do it. No, we could. It would conceivably break the Internet.
00:41:49
Speaker
So yeah, he talks a little bit about the problem of polling, and I find this interesting, because he quite rightly critiques the polls and how we can't just take the surveys at face value. But at the same time he is going to take some of that stuff at face value, and actually this is going to be a problem in chapter two, in that he points out in chapter one
00:42:14
Speaker
the whole replication crisis in psychology. But in chapter two, he talks about how peer review and post-publication peer review basically gives us a very good reason to think that scientific reports should be trustworthy and treated seriously. So in this chapter, he goes, oh, we need to be very cautious about simply reading the scientific data. But in the next chapter, no, no, no. Once it's published and it's been through the review process, it's pretty much golden.
00:42:44
Speaker
You're just stepping on my lines there. Sorry, talking over your lines. All right, so there are basically two more things I want to talk about, because he now, in this chapter, presumably covers the material that he was going to write about originally, until the pandemic made him rethink this. And so he looks at deficit accounts; he looks at two particular accounts

Informed Bad Beliefs: A Paradox

00:43:09
Speaker
in the information deficit account and the rationality deficit account. These are accounts that say, look, the reason why people have bad beliefs is that they lack enough evidence. They just don't have the right kind of data to hand. So they're forming bad beliefs because either they don't have enough evidence or they're not thinking through things properly. And he points out that the
00:43:37
Speaker
Information deficit stuff is kind of interesting because it turns out lots of people with bad beliefs actually seem to have quite a lot of knowledge about the areas that they doubt. And this is something that comes up in the climate change and creationist debates.
00:43:58
Speaker
It turns out that if you try to debate a climate change denier or someone who believes in the creation myth of Christianity, they often know quite a lot about how the climate works or about evolutionary biology.
00:44:18
Speaker
It doesn't seem that they have those bad beliefs because they lack evidence. They seem to have those bad beliefs in the face of a lot of evidence that would otherwise persuade other people.
00:44:33
Speaker
So it seems that we can't just say, look, they don't have enough evidence, therefore they have bad beliefs. Many of these bad beliefs are held by people who have more evidence to hand than people who don't have those bad beliefs. I remember once I was asked to debate a 9-11 truther, and this was a 9-11 inside job truther, and I went, look,
00:44:58
Speaker
He will know a lot more facts about the events of September 11, 2001, than I do. I'm not going to have this debate because even though I'm fairly sure the official theory, which is a conspiracy theory, is correct,
00:45:16
Speaker
I don't. He's going to Gish gallop me on facts the entire time, because he's going to say, well, yeah, but what about this passport? What about this particular brick on this particular bit of ground? What about this photo? Oh, you can't account for that photo? Well, obviously your position can't be correct. People with bad beliefs sometimes have a lot of evidence. So he points out, you know, information deficit accounts aren't going to be good for the kind of examples he's interested in.
00:45:43
Speaker
In the same respect, rationality deficit accounts aren't going to be as good as people make them out to be. And this is where we get a little bit of the old evolutionary psychology. And we're going to get quite a bit more evolutionary stuff in Chapter 2.
00:46:03
Speaker
But he talks about type one and type two cognition. So type one cognition, or what is sometimes called fast cognition, is often taken to be the opposite of type two cognition, which is slow and needs to be engaged. So type one cognition we often take to be heuristics, the quick rules of thumb that we can apply in any situation,
00:46:31
Speaker
while type two is when you actually have to start thinking through the situation. And so some people think the problem with bad beliefs is that people are relying on type one cognition and not relying on type two cognition, i.e. people are thinking fast rather than thinking slow. And so here he looks at the work of Kahan,
00:46:57
Speaker
and Kahan suggests the greater polarisation, to quote Levy, seen among more capable and informed participants is due to their greater capacity. This capacity gives them an ability less capable participants don't possess, to clearly recognise how threatening the correct response to their world view or identity is. They are therefore motivated to selectively inhibit type two cognition. So this is the idea that, look,
00:47:25
Speaker
For some of these bad beliefs, people are aware that if they thought through their beliefs carefully, they'd have to reject those bad beliefs. So they're therefore motivated, in this case of motivated cognition or motivated reasoning, to only rely on heuristics and not think through the situation properly.
00:47:50
Speaker
And this is a point about which I ended up being quite confused as to exactly what's going on, because Levy then says there are grounds for skepticism about a central plank of accounts that turn on motivated cognition. On these views, we are motivated to reject some hypothesis because it is threatening to our group identity or our self-esteem. So if you are someone who believes that climate change isn't real,
00:48:19
Speaker
then you want to reject the evidence because it threatens your identity as a climate change denier. And he says, well, look, the dual commitment of the right (he uses an example here) to dynamism and to stability ensures that being a Republican has no determinate policy implications; from a contradiction, anything follows. He says, look, Republicans don't have enough of a self-identity because their
00:48:49
Speaker
internal ideology is incoherent and thus it can't be the case that it's their self-identity which is causing them to go for fast cognition over slow cognition because that would require them to have a coherent sense of self and a coherent sense of identity. But because being a conservative is inherently
00:49:12
Speaker
contradictory, that's not going to explain why they prefer fast cognition to slow cognition. It's not going to explain why they prefer heuristics as opposed to slow and engaged thinking processes. So because people are sometimes incoherent,
00:49:30
Speaker
That means that group identity isn't going to be a good explanation as to why they reject some beliefs and come to bad beliefs. And I was going, has he not heard of socially conservative and fiscally liberal conservatives? Because that is a coherent, I mean, it's not a political ideology I agree with.
00:49:54
Speaker
but I don't think it makes them have a contradictory belief structure. It just means they have a sophisticated one, not a simple one. Yeah, that does make me a bit suspicious straight away, when people... I'm assuming Neil Levy is not himself a Republican. No. Or right-leaning. It's maybe suspicious when people say, people whose political views are the opposite of mine
00:50:20
Speaker
are just intrinsically nonsensical. Yeah, that's, that's... Yeah, and once again, as with the climate change and creationism cases, Neil has identified beliefs that he thinks are bad, which aren't necessarily bad beliefs. Yes, yes, indeed.
00:50:40
Speaker
So yeah, so that's basically that chapter. He hacks away at existing accounts, but he doesn't really establish his own account just yet. For that, he's going to give us an evolutionary story. And Josh, I think you are best placed to guide us through.
00:50:55
Speaker
the story of cultural evolution and a lot of talk about apes. Yes. Yes. So this takes us to chapter two, culturing belief, where he's basically going to put forward the case that we have this sort of cultural knowledge and cultural knowledge accumulation
00:51:16
Speaker
system framework, whatever, that is the source of a lot of our beliefs. So he starts by saying that we often think the difference between humans and animals is that humans are rational, and it's a bit hard to quantify exactly because you can observe reasoning taking place in other non-human species.
00:51:40
Speaker
So it muddies the water a little bit. As he puts it, there's no property that distinguishes all and only human beings from other animals beyond facts about descent. But there's something right about the claim that we're rational animals. Our intellectual capacities and achievements are distinctive and impressive. At the same time, there's also something misleading about the common picture of ourselves as rational animals.
00:52:00
Speaker
We think of our rational capacities as realized by our big brains, and there are quite a few grains of truth to that thought. But our rationality also depends on our sociality, and thinking of ourselves as cultural animals is no less accurate than thinking of ourselves as rational animals. And he brings up this idea of cultural knowledge by looking at a few examples of 19th-century explorers who would go out into an environment foreign to them
00:52:26
Speaker
and do quite badly, largely because they would ignore the knowledge of Indigenous people in those areas, because they thought they knew better than these people whose culture was grounded in those areas, who had knowledge of how best to live
00:52:47
Speaker
and flourish in those areas; they did not do well. Amundsen, for example, apparently did do well in his expeditions, and was known for paying close attention to what the people who actually lived in the area he was exploring said you should do.
00:53:08
Speaker
So the basic point is: cultural belief is a thing, and it's important. He says, cultural knowledge solves problems that are intrinsically difficult. When feedback is quick, individual cognition is often up to the task of solving problems. We rapidly learn to avoid suspension-destroying potholes in the road or nettles in the bushes.
00:53:26
Speaker
But when the relationship between an action and its effects is slow to manifest and probabilistic, individuals do very badly on their own.

Cultural Beliefs and Problem Solving

00:53:33
Speaker
Think of how long it took to demonstrate the effect of tobacco on health. For decades, people denied the link between smoking and cancer because they were more impressed by salient cases of individuals who had lived to ripe old ages while smoking heavily, and of course because merchants of doubt deliberately muddied the waters.
00:53:49
Speaker
Science has developed mathematical tools for detecting signal in the noisy relationship between variables, like the relationship between smoking and cancer. Without such tools, individual cognition is highly unreliable, but cultural cognition often succeeds in identifying the signal amid the noise without the need for statistical tools. And so he's going to go on to talk about how these cultural beliefs help us avoid some of these potholes, in ways that maybe we don't even know why. We'll get to this in a bit.
00:54:18
Speaker
But there are instances of cultural belief where people do things, and it turns out there's a very good reason for them to be doing those things, and a very obvious advantage to it, but they may not themselves know why they're doing these things other than that that's the way we just do things. Although one issue I have with the way he gives that account is that he's going, well, look,
00:54:43
Speaker
People do things, they don't know why they're doing them, because when you ask them, they go, oh, it's just our custom. But of course, if you actually talk to anthropologists, they'll say, well, look, it may have been the case that they once knew exactly why they did things, but that knowledge was lost over time, but the practice was still encoded in cultural norms. So it might have been the case that, because he uses the example of when corn comes to Europe,
00:55:13
Speaker
you get niacin deficiency, which is due to the fact that corn doesn't have all the essential nutrients you actually want from a foodstuff in an accessible form. So it turns out you have to add an alkali to basically unlock the niacin that's actually in
00:55:29
Speaker
the corn itself. It turns out in Mesoamerica they were regularly cooking these dishes with ground-up seashells and things, which actually had those alkalis in there to unlock the necessary chemicals, etc., etc.
00:55:45
Speaker
And he goes, look, they didn't know why they did it; it was simply something discovered over time. But actually, they may have known why they did it. There may have been someone in the past who went, look, if you eat this stuff without that stuff in there, it makes you sick. I don't know exactly what the mechanism is, but that is the case. But it's too hard to explain to someone that there's some kind of weird property in this substance
00:56:08
Speaker
which, when added to that substance, makes this food safe to eat. So we're just going to say, look, it's just the norm: when we cook this thing, we always add this thing in. And so the knowledge gets encoded and the justification is lost, but it was actually known at some point.
00:56:26
Speaker
Have you read Nation by Terry Pratchett? No, I haven't. One of his few non-discworld books. It's sort of an alternate history. It involves people washing up on an island in the 19th century sometime, I think, and sort of
00:56:43
Speaker
in the wake of a tsunami that's devastated the whole area, and forming their own society. But it has examples of this. They make a sort of fermented drink, and the culture that this person washes up in has a particular song that they sing whenever they
00:57:02
Speaker
brew the drink, which is just sort of part of the ritual, just the thing they do when they make it. But he realizes that the point is it takes some time for the process to work, and if you drink it too quickly before this has taken place, it's poisonous; you need to wait for a certain amount of time. And the amount of time is the amount of time it takes to sing the song. And so obviously, the people who originally came up with it thought, OK, we need to time
00:57:29
Speaker
the brewing of this so that it's not poisonous. Okay, let's make up a song that lasts the right amount of time. And yeah, over time, people may have forgotten exactly why they sing the song, but they know they need to sing the song. Anyway, we'll get into this more in a minute. This leads into the next section of the chapter called Cultural Evolution.
00:57:53
Speaker
So humans are cultural animals, and again, rationality isn't unique to humans, and culture is not unique to humans. Other species do have culture, at least as Levy defines it: he defines culture as information that is acquired from others by vertical or horizontal transmission, i.e. from elders or peers, and which affects behaviour.
00:58:15
Speaker
Now he wants to say the difference is that human culture, in his words, appears to be cumulative culture. In our species, perhaps alone, certainly to an extent that is dramatically greater than any other, cultural innovations are not merely transmitted, they become a platform on which others can build.
00:58:32
Speaker
And he says there's good reason to believe that the mechanisms underlying cumulative culture are evolutionary. Evolution is substrate-neutral and needn't be limited to biological reproduction. Evolution occurs whenever, roughly, there is selection between individuals which vary in their characteristics and this variation is heritable. When some traits are differentially rewarded, these traits are heritable, and the environment is sufficiently stable over time, we should expect evolution.
00:59:00
Speaker
He does point out that the current cultural evolution theory isn't memetics, and gives a bit of a talk about Dawkins and his discussion of memes; this isn't quite the same sort of thing. I was actually a little bit fascinated that he mentions Dawkins on memes but doesn't actually mention the person who
00:59:17
Speaker
is actually primarily responsible for the discussion of memes, because Dawkins just mentions memes and then moves on. It's Susan Blackmore, who was the one who actually did a lot of the theoretical work on memes, and she isn't even in a footnote. Yeah, I don't know. Maybe he just knows that people associate the name Dawkins with it and wanted to give it a nod rather than be accused of skipping it, I don't know.
00:59:47
Speaker
He does point out that cultural evolution can occur independently of gene transfer, of biological evolution, but the two can go together. He gives the example of adults over time becoming lactose-tolerant, which is a biological evolution, but which came about at least in part due to the increase in dairy farming, which is a cultural evolution.
01:00:16
Speaker
And he spends a bunch of time in this chapter going over evidence for us being culturally evolving animals, and talks about various characteristics of human beings that make us prone to or capable of cultural evolution. He talks about humans'
01:00:35
Speaker
particularly long childhood and adolescence. Some animals are independent from birth. Others, especially mammals, require raising by their parents, but get over that fairly quickly. But humans stay children for a long time. There's the whole skulls-versus-pelvis aspect to that as well. But
01:01:01
Speaker
So, Josh, which side are you on? Are you team skull or team pelvis? Oh, pelvis all the way. Really? Wow. Yeah. I would not have picked you as a pelvis man. Yes. What a sentence that is. As people point out, every day there is a new novel sentence being uttered in English, and I think "I would never have picked you as a pelvis man" may be today's novel sentence,
01:01:31
Speaker
but you may be the first person to say that. We think it's the Russell Crowe philosophy also. So, we're born particularly physically immature, because our skulls are too big to fit out of pelvises; not our own pelvises, obviously, our skulls fit out of other people's pelvises.
01:01:47
Speaker
As well as that, our long adolescence is what he calls an apprenticeship in our culture. We need this amount of time being raised by people who will educate us in the cultural knowledge that works for the environment that we live in. Obviously, culture is relative to a context. Things that are good cultural practices in one area are going to be completely different from those in another.
01:02:11
Speaker
He gets very sort of anthropological when he talks about it; it's all very much about that idea of, you know, learning the right foods to eat so you don't get poisoned, all sort of about survival in nature. Whereas I assume he would say the same practices exist in this day and age, where it would be a lot more about social conventions and things like that being things you need to learn. But he doesn't; he sticks to the very
01:02:42
Speaker
yeah, anthropological kind of look to it. He also points out that humans are very imitative, more so than other animals, even to our detriment, apparently. He mentions experiments showing that if you demonstrate a behaviour to human children, they'll imitate it exactly, whereas if you show it to other apes, perhaps they will realize that the
01:03:09
Speaker
behaviour you've taught them is good but suboptimal, and will start doing it a better way. But he says that this, what they call over-imitation, is an adaptation that lets us pick up cultural practices very, very keenly and in detail. Again, it's like learning all the words to the song because that song is exactly the right length for the task that you're performing; even though that's a fictional example, that's the sort of thing.
01:03:37
Speaker
And here, as we talked about, there's the idea that we may know that we perform certain cultural practices and how to perform them, but it's possible we don't actually know why. He raised an idea that I hadn't heard before, about divination rituals, all the various rituals from cultures all over the world and over time of reading
01:03:59
Speaker
entrails or the flight patterns of birds, or reading bones, or all that sort of stuff. He mentions a theory, which I had not heard before, which is that the advantage of divination rituals of this kind is that what you're actually doing is introducing random chance. And the advantage of that, because humans like to see patterns in things, is that it prevents us from seeing patterns that aren't actually there by
01:04:26
Speaker
unknowingly making sure we just do things at random.
01:04:30
Speaker
But anyway, that's just an interesting point. He says that we tend to defer to the customs of our culture, though not entirely, obviously; if we were completely deferential to the way things are, we would never innovate. We'd never change things from what they are. And one of the things he's saying is that's one of the things that's particular about human beings: we innovate and build upon what's already there.
01:04:59
Speaker
And it may seem irrational in some cases to defer to custom, to do things because that's just the way we do it, but later he's going to say that it's not actually that irrational a behaviour. He points out that, of course, cultural conventions can be arbitrary. It may not be important what the detail of any particular practice is; what's important is that everybody does the same thing. And a good example of that is which side of the road to drive on.
01:05:29
Speaker
As is obvious, it doesn't actually matter whether you drive on the left or the right because there are countries where people drive on the left and there are countries where people drive on the right and it works just fine. The only important thing is that everybody in the country drives on the same side of the road. That's what avoids car accidents.
01:05:45
Speaker
And because of this arbitrariness, they can vary from place to place, and they can also be difficult to infer or guess for someone who's coming into a new culture. You can't just step into a new culture and immediately understand the way they do things. The idea of culture shock is a very real thing, as anyone who's experienced it knows; this must be something you've experienced a fair bit, moving from New Zealand to China.
01:06:09
Speaker
And also from New Zealand to Romania, and also sometimes coming back. Mum actually tells a great story. So she lived in Bangkok for about five years when she worked for SEATO, the Southeast Asia Treaty Organisation, the companion to NATO, which doesn't exist anymore.
01:06:28
Speaker
And when she was living in Bangkok, when you drove on the road, you had your hand on the horn the entire time so people would be aware of where you were. And so she returns back to Auckland, she's driving across the Harbour Bridge, and she's going,
01:06:43
Speaker
there's some car making a really large amount of noise. And then she realised it's her; she's got her hand on the horn as she's driving across the bridge, because she's so inured to driving in Bangkok that she's forgotten what the cultural practice is in Aotearoa New Zealand when it comes to
01:07:03
Speaker
driving. But she's also aware there's a weird noise she shouldn't be hearing, and it takes her a second to realize she is making the noise; she's the noise inside the car. The calls are coming from inside the house, yes.
01:07:17
Speaker
He then refers to what he calls social referencing, giving two examples of this. One is conformist bias, where we tend to exhibit a bias towards conforming to the local culture, when in Rome do as the Romans do. He also says we tend to have a prestige bias, which means we're biased towards imitating particularly successful individuals. And again, this is often can be in the case of
01:07:46
Speaker
we don't know why what they do works, but we know that they're good at what they do, whether that be a sportsperson, or someone in business, or a hunter in a hunter-gatherer society. We know that a person is good at what they do, so we copy all the things that they do, because we know they must be doing something right. So there's a lot of talk of
01:08:10
Speaker
of the way humans are in his world. And again, this is a bit of the old evolutionary psychology stuff here, which I gather you're not exceptionally fond of. And one does wonder how much of it's a post facto justification of a thing that we observe. You know, I think you call them just-so stories, don't you?
01:08:36
Speaker
But at any rate, he says, just how discriminating the mechanisms underlying cultural evolution are is controversial. I've presented a picture that leans very heavily on one side of these debates. The scientists I've cited, Peter Richerson, Joe Henrich, Robert Boyd, are sometimes seen to constitute the Californian school. The so-called Paris school takes a different view on many issues, in particular how discriminating the mechanisms of transmission are. This isn't the place to address this dispute in any detail, but I will say a few words in justification of my reliance on the Californians.
01:09:06
Speaker
And he does, which didn't seem particularly germane to the stuff we're interested in here. He just talks about the difference between the two and why he thinks this one's more compelling. But he does say that the two schools aren't as incompatible as some people may think.
01:09:23
Speaker
So he concludes the section by saying, up to this point, we focused on cultural knowledge, knowledge of the behaviors that we need to function as members of a particular society and to flourish in sometimes harsh environments.
01:09:37
Speaker
The mechanisms we've examined have the function of enabling us to distinguish signal from noise in causally opaque systems or to identify regularities at temporal and geographical scales that exceed the grasp of an individual. I noted that prior to the development of statistical tools, these mechanisms were the only means we possessed for the detection of such signals.
01:09:56
Speaker
Now, of course, we possess the tools of science, which allow us to achieve the same sorts of ends much more quickly, efficiently and accurately. These tools can be deployed by individuals. Does that entail the end of the millennia-long age of deeply social knowledge? The next section will assess the extent to which knowledge is social today, in environments far removed from the environment of evolutionary adaptiveness.
01:10:19
Speaker
which leads us into the next section of this chapter, called Science on Mars. This is a reference to the film The Martian, based on the book The Martian. In the film, and I think it's a quote from the film that isn't actually in the book, there's the bit where Mark Watney says, I'm going to have to science the shit out of this. And that's what he's sort of talking about: doing science.
01:10:44
Speaker
So he says, until the last decade of the 20th century, epistemology was a largely individualistic enterprise. It was primarily recognition of our pervasive reliance on testimony that altered the landscape. We are dependent on testimony, implicit and explicit, for our knowledge about the temporally and geographically distant, and for much of our knowledge about the unobservable posits of science. As an epistemologist yourself, does that sound right to you?
01:11:08
Speaker
Yeah, I mean, there is this interesting thing about epistemology, that the emergence of social epistemology, the idea that actually we're not just individual thinkers, but rather people thinking in groups and reliant on knowledge from other people
01:11:25
Speaker
Strangely, at least in Western philosophy, this is not necessarily true of all philosophies around the world, but in Western philosophy it was largely individualistic until the middle of the last century. And then people started going, oh, it seems that we actually get a lot of knowledge from other people.
01:11:45
Speaker
We should probably explain that. So no, I have no quibbles with that. I mean, there might be quibbles as to whether it starts with Tony Coady's book on testimony, or whether there's other work which actually started that particular debate. But no, it is a late 20th century development that we went, oh, knowledge comes from other people, we should probably have a theory.
01:12:14
Speaker
Here's this idea that we rely on testimony from other people. In order to know stuff, in the social sense, we need to trust that this other stuff that we're drawing upon is just correct. Hang on, science isn't like that though, right? Like in science,
01:12:35
Speaker
You don't take anything on trust. You only believe what you can see evidence for. As he puts it, in science, our naked intelligence and our capacity to test the facts is all that really counts, right? That question mark, very pregnant.
01:12:52
Speaker
very pregnant question. He's going to say no, that is not the case. He points out group deliberation can often outperform individual deliberation. We like to talk about the madness of crowds and mob mentality. Again, I think this is Terry Pratchett, it might be someone else. The
01:13:14
Speaker
What is it? The IQ of a crowd is inversely proportional to the number of people in it. There's this idea that the more people you get in a group, the dumber they get. And yet there's also, I've heard of this phenomenon, I forget the name of it now, that's the wisdom of the crowd or something like that. There's a phenomenon where if you get
01:13:32
Speaker
a whole lot of people to estimate a particular thing, whether it's one of those fairground competitions, the weight of a pig, or how many jelly beans there are in a jar. If you get a whole lot of people to give an estimate of a particular amount, and then average all of the guesses, the average tends to be pretty close to the actual amount.
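The averaging effect described here can be sketched in a few lines of Python. This is a toy simulation, not anything from Levy's book: the function name, the uniform noise model, and the numbers are all my own illustrative assumptions.

```python
import random

def crowd_estimate(true_value, n_people, noise=0.3, seed=1):
    """Simulate the 'wisdom of the crowd' effect: each person makes a
    noisy guess at a quantity (say, jelly beans in a jar), and the
    average of the guesses tends to land near the true value."""
    rng = random.Random(seed)
    # Each guess is the true value scaled by a random error of up to
    # +/- `noise` (30% by default).
    guesses = [true_value * (1 + rng.uniform(-noise, noise))
               for _ in range(n_people)]
    return sum(guesses) / len(guesses)

# Individual guesses can be off by hundreds, but the average of many
# independent guesses typically lands very close to the true 1000.
print(round(crowd_estimate(1000, 10000)))
```

The reason this works is just the law of large numbers: as long as the individual errors are independent and not systematically biased in one direction, they tend to cancel out when averaged.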
01:13:53
Speaker
And so he's talking about this. He says that as individuals, we tend to fail certain logical tasks. He gives an example, a bit like the one from when we talked about Shermer, who had that example about the probability of conjunctions, which I was not convinced by, because it seemed to me like they were acting as if they were testing people's reasoning ability when it really seemed like people's language use was tripping things up. And this looked like a similar one to me. He has a footnote.
01:14:22
Speaker
It was basically giving people a conditional statement and then asking them how they would act upon that conditional. And people would often get it wrong because they would assume that the conditional was a biconditional. You tell them "if A then B" and people would assume, okay, so A means B and B means A, which is not true: "if A then B" only goes in one direction. Which to me
01:14:47
Speaker
to me is just because people make that assumption in the same way that when we hear the word "or" we tend to think exclusive or, even though in logical terms it's not. There you go. That is a huge debate, as to whether people think "or" is inclusive or exclusive. I agree with you, I think when people use "or"... Oh, it depends on context. But I've had so many debates with philosophers: you know, "or" is definitely inclusive. Okay, so yeah, maybe.
01:15:17
Speaker
It might also be interesting to see whether "or" is inclusive or exclusive in particular families of English. Are New Zealanders more likely to use "or" in an exclusive sense? And maybe Americans, where most of this logic and philosophy of language stuff comes out of, maybe they're more likely to use it in an inclusive sense.
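The two distinctions being discussed, the one-directional conditional versus the biconditional, and inclusive versus exclusive "or", can be made concrete with a few truth-table functions. A sketch for illustration only; the function names are my own:

```python
def implies(a, b):
    # Material conditional "if A then B": false only when A holds and B fails.
    return (not a) or b

def biconditional(a, b):
    # "A if and only if B": the two-directional reading people often
    # mistakenly give the plain conditional.
    return a == b

def inclusive_or(a, b):
    # The logician's "or": true if at least one disjunct is true.
    return a or b

def exclusive_or(a, b):
    # The everyday "soup or salad" reading: exactly one, not both.
    return a != b

# The conditional and biconditional disagree only when A is false and
# B is true; the two "or"s disagree only when both disjuncts are true.
for a in (True, False):
    for b in (True, False):
        print(a, b, implies(a, b), biconditional(a, b),
              inclusive_or(a, b), exclusive_or(a, b))
```

Printing the full truth table shows each pair of connectives agreeing on three of the four rows, which is arguably why ordinary language use blurs them so easily.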
01:15:41
Speaker
The point is, and I'm just nitpicking here, the point is that there are these logical tasks that people do tend to fail, for whatever reason, when we're looking at them individually, but when we talk them over in a group, people tend to be right more often. Like the Wason selection task, which he actually uses in the book, and I use in class. That's the one I'm talking about now, I think. Yeah.
01:16:06
Speaker
So, he does point out, though, that group deliberation can go wrong. It can go wrong in ways that can be alleviated by individual deliberation. So he talks about how information cascades can mislead the group, and powerful individuals can carry disproportionate weight.
01:16:22
Speaker
And people may self-silence in the face of prejudice or anxiety. All of these problems can be mitigated if people are disposed to give private information and their individual deliberation disproportionate weight.

Group vs Individual Deliberation

01:16:33
Speaker
So he's saying that yes, group deliberation can work, there can be times when it fails, and in these cases when group deliberation can fail,
01:16:42
Speaker
individual deliberation, that is, people standing up and giving weight, perhaps even undue weight, to their own opinions, can actually fix some of these problems. He looks at the idea of information cascades. I remember information cascades; we've talked about those a bunch of times, looking at other papers.
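The urn-drawing version of an information cascade, which Levy draws on, can be sketched as a toy simulation. To be clear, the probabilities, function name, and tie-breaking rule below are my own illustrative assumptions, not details from the book:

```python
import random

def run_cascade(true_urn="mostly_white", n_agents=20, p=2/3, seed=7):
    """Toy model of an information cascade with an urn of balls.

    The urn is either mostly white or mostly red. Each agent privately
    draws one ball (matching the urn's majority colour with probability
    p), sees every earlier public guess, and announces whichever colour
    has more support, counting each earlier guess and their own draw as
    one vote each. Ties go to the agent's own private draw.
    """
    rng = random.Random(seed)
    majority = "white" if true_urn == "mostly_white" else "red"
    minority = "red" if majority == "white" else "white"
    guesses = []
    for _ in range(n_agents):
        draw = majority if rng.random() < p else minority
        votes = guesses + [draw]
        if votes.count("white") > votes.count("red"):
            guesses.append("white")
        elif votes.count("red") > votes.count("white"):
            guesses.append("red")
        else:
            guesses.append(draw)  # tie: trust your own eyes
    return guesses

print(run_cascade())
```

In this model, once the first two agents happen to announce the same colour, every later agent's single private draw is outvoted by the public record, so they all repeat that colour regardless of what they drew. If bad luck hands the first two agents red balls from a mostly-white urn, the whole group locks onto "red", which is exactly the failure the hosts describe, and exactly why one stubborn agent who overweights their own draw can break the cascade.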
01:17:00
Speaker
Although I went back and looked at the notes of some of our papers to remind myself when we've talked about information cascades before, and I found that people did actually tend to use them to mean completely different things in a couple of instances. In some cases it was cascades of inference: people believe A, which therefore implies B, which implies C, which implies D. Other times, when people are talking about information cascades, they mean person A believes this, which causes person B to believe it, which causes person C to believe it, and so on.
01:17:31
Speaker
I think that's the sense that he's using it in this case. He says that initial information can color the assumptions of people going forward, which can then cause subsequent people to color their assumptions, which then colors the assumptions of people after that. He talks of the example of
01:17:49
Speaker
a jar that you can't see inside, that has white balls and red balls in a certain proportion. People will go and pick one out, and then, based on seeing what they pick out and what the people before them picked out, they guess whether it's a
01:18:07
Speaker
jar where there's a high number of red balls or a low number of red balls. And he talks about how, in situations like that, information cascading can cause things to go wrong. The jar may be mostly white balls with a few red balls, but if it turns out that, just due to chance, the first couple of people pick out a red ball, that's going to color other people's thinking: okay, this must be the one that's got lots of red balls in it, even though that's actually not right. And so the point is, in cases like this,
01:18:36
Speaker
being overconfident in your own opinion can actually be a good thing. It can short-circuit this sort of stuff. If you get a seemingly argumentative person, they can dig their heels in, saying, I'm right, and sometimes that can be a good thing, because it can stop this sort of cascade from happening. So he's sort of saying that we can deliberate in groups, and we can deliberate as individuals.
01:19:02
Speaker
He says there definitely seems to be a bias towards, or an assumption in favor of, and I think he's talking about in popular culture, the popular consciousness, epistemic individualism: the whole lone genius myth. It's a story that people like to talk about, along with the great man of history sort of thing. Many people like the idea of being able to point to these
01:19:29
Speaker
significant things happening because of a single individual. But as he says, the lone genius myth is just a myth. As he puts it,
01:19:39
Speaker
scientists too are dependent on testimony, explicit and implicit, even within the domain of their own expertise. Working scientists use tools they didn't develop and that they may not be able to fully understand, often applied to data they didn't gather and which they can't verify, to test hypotheses that are constrained by theories they may not grasp. These constraints enable them to do science. He gives the examples of
01:20:04
Speaker
I forget the specific example, but basically you may need to use a mass spectrometer to do your work. You might not be able to build one. You might have an idea of how it works, but you're relying on this thing that as far as you know, might as well be magic.
01:20:23
Speaker
He carries on: since climate change denial is a principal exemplar of bad belief in this book, let's take climate science as an example. He goes through this at this point. He talks about how climate science is heavily interdisciplinary, with lots of different tools, techniques, and fields of expertise. In the particular case of climate science, you've got all sorts of stuff: geology
01:20:49
Speaker
and atmospherics, and all sorts of different fields intersect to talk about the climate, and how the climate is changing, and what it is affecting, and what is causing the climate to change, and so on. And so any individual working in the field of climate science is going to be relying on a bunch of work from a bunch of other people, work that they themselves would not be able to do
01:21:11
Speaker
in many cases, and indeed will be using equipment that they themselves didn't build, and will be relying on even simply the basic maths and physics that's used by disciplines
01:21:32
Speaker
other than the ones that they themselves are experts in. He says climate scientists calibrate their findings and refine their models in the light of evidence from other fields, like every other science. Moreover, they rely on tools they didn't design and which they may lack the skills to fully understand. Basically, every part of science depends on a lot of the rest of science a lot of the time. And then he gets into the idea of, as you mentioned before, peer review and post-publication review. And he'll say that, again,
01:21:59
Speaker
This is an instance perhaps of the individual deliberation sorting out potential problems with group deliberation because you'll get
01:22:08
Speaker
in many areas, you'll get people with very strong opinions in certain directions, often at odds with one another. And by subjecting things to peer review, that sort of ensures that all of the different viewpoints are going to have a go at something, and no one theory will come to dominate unless it really is backed up by all of the evidence.
01:22:31
Speaker
And yeah, as you say, this does conflict a little bit with the stuff he said in chapter one, where he basically says that peer review makes everything all right. Because he's saying it here in the context of how science very much is not an individual activity. Every individual scientist depends on
01:22:52
Speaker
all of science to produce the things that they do, and part of that is the peer review process, involving lots and lots of individuals. And of course, he also points out that it's becoming more and more common for scientific papers to have greater and greater numbers of authors on an individual paper.
01:23:10
Speaker
Yeah, I mean, he points out that the paper that announced the Higgs boson has five thousand authors. Five thousand, yes. Which, if you're like me and you like to put all of your articles into a citation manager, you'd better hope the cut and paste is going to do a good job of putting that information in, because having to type five thousand names into your citation manager is not going to be fun.
01:23:37
Speaker
Yes, I mean, it almost sounds like you're just including all your references as individual authors, but I assume it's more complicated than that. So the whole point of this is to say that, although, as he said right at the start, it looks as though on the surface of it science has these tools which can be used by individuals
01:24:02
Speaker
to come to the sort of knowledge that, prior to the advent of science, cultures could have taken a long time to arrive at, and perhaps less reliably. So does that mean that cultural knowledge is on the way out? His whole point is: no, no, it's not, because there is no such thing as the lone scientist. He gives the example of Heisenberg, wasn't it, who
01:24:30
Speaker
famously went off to an island while he was coming up with... Deep thoughts, deep thoughts about quantum mechanics. Deep thoughts on his island. But he wasn't actually coming up with the stuff entirely on his own; he was engaging in conversation and in correspondence with other experts in the field, and things like that.
01:24:52
Speaker
And so at the end of all of this, he concludes this section and this chapter by saying that we're the type of rational animal we are because we are cultural animals.

Cultural Knowledge and Rationality

01:25:03
Speaker
We possess a suite of dispositions that orient us towards others epistemically. For us, knowledge production is, in an essential respect, a product of the distribution of cognitive labor. Distributed cognition is more than merely a very efficient way of exploring the logical and empirical space and mitigating, or even harnessing, our biases.
01:25:20
Speaker
It opens up new cognitive horizons that would otherwise be entirely inaccessible to us. Science does not free us, the animals we are, from epistemic dependence. If anything, it increases it.
01:25:32
Speaker
So that's the end of that chapter. This seemed very much like a scene-setting chapter. He certainly doesn't come to any conclusions in this chapter about bad beliefs; I assume this is something he's setting up so that he can then build on it in the chapters we have yet to look at:
01:25:56
Speaker
why exactly we come to what he's going to call bad beliefs. Yes, so he's going to say: well, look, there's this cultural transmission of knowledge thing, there's this epistemic dependence; when it works well, we get science, and when it doesn't work well, we're going to get bad beliefs, but they're generated using the same processes.
01:26:21
Speaker
So then the question is, what is it about the environment that some people are in that leads some people to science and some people away from science? But that will be for next time when we look at chapters three and four, just as a preview. Chapter three is called How Our Minds Are Made Up and chapter four is called Dare to Think? Question mark.
01:26:45
Speaker
Isn't that a book by Richard Prebble? I've been thinking again. I had another thought. I'm quite happy that I do not know the names of any of his books.
01:26:58
Speaker
Oh, so I mean, this is a deeply New Zealand thing, but when Richard Prebble was the leader of the Association of Consumers and Taxpayers Party, he had a kind of, I mean, it's not really a rort, because it's more just that he would send people a copy of the book and then demand that people pay for it. So if you wanted to keep the book,
01:27:22
Speaker
you then had to send $20 to the ACT Party. And as people pointed out, if you receive a book for free in the mail, you're not obliged to send it back. You can just keep it. There's no way they can enforce, oh, if you want to keep this book, you have to buy it. But that was what Richard Prebble did. He just sent copies of the book to random New Zealanders and expected them to then pay him $20 if they wanted to keep the book.
01:27:51
Speaker
Anyway. That was the man who deregulated our economy. He sure did. So that's the end of this episode. Next week will be more of the same and the week after that will be more of the same again. The next fortnight will be more of the same. Sorry. I keep saying next week when I mean next episode. I mean, although maybe that suggests that we're doing a secret podcast.
01:28:15
Speaker
Maybe it does, yes. On the side, on the off weeks, yeah. But what I can say for sure is that we're doing a bonus podcast for our patrons, the most beloved of all. So this week,
01:28:32
Speaker
it's been a funny fortnight. We should probably talk about the unfortunate individual who self-immolated outside the Donald Trump trial. There's a bit of stuff about Russia and Zelensky and so on. We might mention Project Alpha. Last episode, we talked about mysterious projects; possibly we've got another one, who can say?
01:28:59
Speaker
But until then, I think that's it for this main episode for the fortnight. So I will close it out in the traditional manner by saying goodbye. Goodbye. The Podcaster's Guide to the Conspiracy stars Josh Addison and myself, Associate Professor M.R.X. Dentith.
01:29:21
Speaker
Our show's cons... sorry. Producers are Tom and Philip, plus another mysterious anonymous donor. You can contact Josh and myself at podcastconspiracy@gmail.com, and please do consider joining our Patreon.
01:29:51
Speaker
And remember, they're coming to get you, Barbara.