
Neil Levy's "Bad Beliefs" 2: Believe Badder

The Podcaster's Guide to the Conspiracy
422 Plays · 7 months ago

Josh and M look at the middle chapters of Neil Levy's book - is this the Empire Strikes Back of this series or is it the Die Hard 2? Or the Temple of Doom? Or the Two Towers? Or the 1993 Drew Barrymore supernatural horror thriller Doppelganger? Yes.

Due to recording difficulties, the sound quality may dip from time to time.

Transcript

Crafting a Parody of 'Bad Romance'

00:00:00
Speaker
Oh, why so glum, you? I've just spent two weeks trying to work out a way to rewrite Lady Gaga's Bad Romance to be about the book Bad Beliefs. And I'm still stuck on how you can't quite get Bad Romance to scan as Bad Beliefs, because even though belief is two syllables like romance, you have to sing it like be-lief, and that just doesn't work. Well, not every intro can be rendered into a song.
00:00:27
Speaker
You don't get it. I've been working on this 24/7 for two weeks. I have done nothing else. I've consulted linguists. I've talked to musicians. I've even tried contacting Lady Gaga's representatives, and I've got nothing. I've not even read the chapter I'm meant to be covering this week. That doesn't sound like a good use of your time.
00:00:49
Speaker
I know! I spent all of yesterday in class ostensibly teaching students about what social scientists believe about conspiracy theories, but actually trying to get them to harmonise the chorus of bad belief in such a way that I can move on to verse one of the song. You just want an excuse to say, I want your leather studded kiss in the sand on a podcast, don't you? Yes, and now you've said it, so what's the point? Also, I want your psycho, your vertigo shtick, want you in my rear window, baby, you're sick. Stop it.
00:01:24
Speaker
The Podcaster's Guide to the Conspiracy, featuring Josh Addison and M Dentith.

Obscure Action Films and 90s UK Film Critique

00:01:37
Speaker
Hello and welcome to The Podcaster's Guide to the Conspiracy. In Auckland, New Zealand, I am Josh Addison, and in Zhuhai, China, life is a bitch, and her name is Maria. It's Associate Professor M R. X. Dentith.
00:01:50
Speaker
Do you remember that one? No, I do not. I feel like I don't. Do you remember Killing Time by Neil Marshall? I don't think he directed it. I think he wrote it. Yeah, I do remember Killing Time by Neil Marshall. Is that what we get in it?
00:02:12
Speaker
I don't think it does, but it is a much better film than that tagline would suggest. I remember there's a whole slew of obscure 90s action films. Yeah, obscure UK action films. There's a Paul McGann obscure action film from around about that time, which is noticeable for its very bad editing
00:02:36
Speaker
in that a character gets shot in the head in one scene, and then gets up and walks around in the next scene, and then a scene later is back dead on the ground. I don't remember that one. Probably not Killing Time, actually. No, no, that was actually good. Probably not Killing Time. Yeah. Anyway, what have you been up to in the last two weeks?

Academic Paper Submission and Peer Review

00:02:57
Speaker
I have submitted, or at least, as you know... so, I have personally submitted one paper, and I've been a co-author on another paper that has been submitted. So it's all been about the papers, Josh. All been about the papers. I probably shouldn't say anything about what's in these papers, for the sheer fact that it is possible that people who are listening to this podcast may well end up, may well end up,
00:03:25
Speaker
being peer reviewers on those papers. So all I can say is: if you're a peer reviewer out there, be aware that there are two papers, one solo-authored by me, one co-authored by me, that are currently out there in the wild. Be kind or be cruel. You be you. But do they do this sort of double-blind thing where they don't know that it's your paper that they're reviewing?
00:03:51
Speaker
Yes, yes, so they won't know. But if I... if I start describing what is in the papers... yes, actually, yes, actually, that's probably the paper that was written by one M Dentith, or co-authored by one M Dentith, or maybe even two M Dentiths pretending to be separate co-authors. Maybe I should co-author stuff with myself just to make it more confusing. Yeah.
00:04:14
Speaker
I think you just need to make sure that in every chapter, like, if you take the first letter of every paragraph it spells out 'this was written by M Dentith' or something, just to really give the game away. Makes things a bit more fun to write as well, a bit of a creative challenge for you.
00:04:29
Speaker
It's true, although I suspect it would just end up being endless

Creative Secret Messages in Writing

00:04:34
Speaker
paragraphs. What if every paragraph... what if one paragraph starts with P and another with E, and then one with N, and then one with I, and then one with S? And then it goes back to P and then E. Very strange. That would be your calling card, though. And there's another paper I read recently where the first paragraph is C and the next paragraph is O, and you just know where it's going.
00:04:56
Speaker
Yeah, yeah. Anyway, that's enough. Enough movie references and enough puerile humour, as if there were such a thing as enough of either of those. But anyway, we do have an episode. We have plans for an episode to record. We have serious business to engage in. Serious, serious business.
00:05:12
Speaker
I mean, I guess it kind of is, yeah.

Neil Levy's 'Bad Beliefs' and Movie Metaphors

00:05:15
Speaker
So we're into the middle. This is the Empire Strikes Back of Neil Levy's book Bad Beliefs. We're into the middle chapters. Are you suggesting that Die Hard 2, Die Harder, is the Empire Strikes Back of the Die Hard films?
00:05:32
Speaker
Well, no, because they weren't planned as a trilogy. Or at least, Star Wars wasn't planned as one, but by the time it got to The Empire Strikes Back, they knew they were going to do a third one after that. Whereas with Die Hard, it was just an action sequel, and then another action sequel after that. Anyway, you've done it again. I blame all of the movie talk on you entirely.
00:05:54
Speaker
No, it's because the title of this episode is a Die Hard reference, and yet you're making the claim this is the Empire Strikes Back of the book. Well, I don't know. We'll have to see if we get to the end. You are mixing your movie metaphors, and I don't know whether I'm here for it or not.
00:06:11
Speaker
Well, yeah, we'll just have to see whether this, whether we get to the end and it's more of a Han Solo being frozen in carbonite situation or more of a William Sadler doing Kung Fu in the nude situation.
00:06:25
Speaker
Well, see, I thought you could do something along the lines of it was going to be Solo: A Star Wars Story, or Rogue One: A Star Wars Story. Stick to one series of films. Don't mix your metaphors. Okay. Okay. Okay. Well, let's do away with metaphor entirely. Play a chime, and then start talking... actually, you're editing this episode, so you play a chime. I will.

Communal Thinking and Lindsey Graham's Shift

00:06:57
Speaker
So, chapter three. Chapters one and two were kind of just getting things going, although by chapter two he was already setting up this idea that he wants to talk about, emphasizing group cognition and that being the way forward. So, I don't know, we'll see. You, of course, will be talking about chapter three; I'll be talking about chapter four. Slight spoilers: chapter four, I think, is definitely going for the
00:07:24
Speaker
group cognition side of things, but what does chapter 3 have in store for us? You mean chapter 3, 'How Our Minds Are Made Up', or how Neil Levy's mind is made up about communal thinking? Which is not much of a burn, it's just an accurate description of what goes on in this chapter. So this is a weird chapter. It's a weird chapter because I agree with a lot of it,
00:07:49
Speaker
I just don't agree with the kind of generalizations he draws from this discussion. And that's because Neil starts off talking about Lindsey Graham.
00:08:03
Speaker
And I'm going to end up talking about Tucker Carlson because I think the kind of story Neil wants to tell in chapter three and also in chapter four and the kind of story he told in chapter two has a very particular view on when people seem to express what we take to be irrational beliefs, which is that they are sincere beliefs
00:08:28
Speaker
and they appear to be irrational, and Neil wants to explain why they're not actually irrational at all. I'm going to argue at the end of this chapter that there's another option which Neil admits might be a possibility, but he just quietly shoves to one side. So he starts off the chapter by engaging in what is going to be a kind of potted defense
00:08:52
Speaker
of the forever-Trumpers. So he starts with: in this chapter, I'll focus on our epistemic dependence in belief acquisition and update. We defer to others so ubiquitously and so routinely, we fail to notice when it occurs. We fail to notice our beliefs are dependent on what others believe and shift as theirs do. We fail to notice this even when the changes are abrupt. These shifts may appear arbitrary and irrational.
00:09:21
Speaker
I'll show they're neither. In deferring, we follow the evidence, though that fact may not be apparent to ourselves or to others. I'll begin in an apparently unlikely place: support for Donald Trump.
00:09:35
Speaker
What? I know. Well, actually, I don't see that as unlikely. I mean, this is a clever thing to do in an argument of this kind, which is to go, look, I'm going to give you an example that you think is going to be 'aha, Neil, you're wrong', but actually I'm going to explain to you that actually, aha, Neil, you're right. So finding the worst possible example for your case and then showing it actually works perfectly with your argument is a very clever thing to do,
00:10:05
Speaker
if your argument works. So he talks about how, in the lead-up to the Republican nomination, originally Lindsey Graham was a never-Trumper. He said that Trump was a terrible person to be president, he didn't have the right kind of mental characteristics to be president, he didn't have the kind of political nous to be president. He would be a bad choice to become the president of the United States of America.
00:10:31
Speaker
And yet as soon as Trump got the nomination, suddenly Lindsey Graham was talking about Trump as if he would be the perfect president for America, the best president America could possibly have, and became an all-out booster for Trump. So I'm going to call this a forever-Trumper rather than a never-Trumper.
00:10:50
Speaker
And as Neil points out, we need to explain, how can someone like Lindsey Graham go from saying Trump is terrible

Changing Beliefs: Evangelicals and Trump

00:10:58
Speaker
to saying Trump is great? And he also... Well, surely the answer is hypocrisy, and the chapter ends right there, then?
00:11:06
Speaker
Well, yeah, and we're going to get to hypocrisy, because I don't think he pays hypocrisy its due, but we'll get to that in just a minute. So the other example he uses is how evangelicals turned their noses up at Bill Clinton when the
00:11:25
Speaker
Monica Lewinsky stuff came out, and yet didn't apply the same standards to Trump when his sexual proclivities were revealed to the world. He says, well, look, how is it that these people seem to have changed their beliefs so radically, and can we explain that in a rational sense?
00:11:45
Speaker
Now, he admits at this stage there might be another explanation that is outside of epistemology, which is that there might be a political reason that people change their beliefs in this way. But he's going to use these examples in support of his theory about how we outsource beliefs and outsource belief production to go, look, these people are sincere in their beliefs. They have changed their beliefs.
00:12:15
Speaker
and it turns out they're not acting irrationally, because we are so reliant on the beliefs of others that it's actually rational to defer to others, and thus it's rational to change your mind as time goes by, and also it's rational to not realize that you've changed your position over time.
00:12:38
Speaker
So he's going back to stuff he talked about in chapter 2, where he was talking about how scientists routinely outsource their beliefs and their belief production to other scientists on the regular, and he restates a key premise of chapter 2 by stating:
00:12:55
Speaker
Scientists outsource belief production and they outsource beliefs themselves. It is obvious how they outsource belief production. They depend on other scientists, both those working in their discipline and those in other disciplines for the data and theories that are input into and constraints on their work.
00:13:15
Speaker
The outsourcing of beliefs comes in two varieties. First, scientists may rely on others to believe things on their behalf. Second, they may outsource beliefs they themselves hold by relying on others or even the social or physical environment to ensure that cues that trigger those beliefs are made available at appropriate times.

Science, Beliefs, and Epistemic Methods

00:13:37
Speaker
Scientists outsource beliefs in this kind of way when they work in interdisciplinary groups. Science institutionalizes the outsourcing of beliefs. Members of different research groups rely on one another epistemically, thereby ensuring that beliefs that are more salient to some members than to others are given their appropriate weight and role.
00:13:59
Speaker
And the example he uses here I found a little confusing. So he uses an example that people in Australasia will be very aware of: CFCs and the ozone hole over the southern pole of the planet Earth. So for North Americans... and I'm sorry, not actually just North Americans, I mean to say people in the northern hemisphere; I just jumped to America immediately, I apologize.
00:14:27
Speaker
Many people in the northern hemisphere don't even know about the CFC issue and the ozone hole that appeared over the Antarctic, then New Zealand, then the south of Australia during the 80s and 90s. But basically, the story goes like this.
00:14:48
Speaker
We used to use chlorofluorocarbons in our refrigeration units around the world. It turned out that when you, say, demolished a fridge, or you threw a fridge away, or you had a fridge replaced, those CFCs that were nicely contained in the coolant system of your refrigerator, well,
00:15:06
Speaker
they escaped into the atmosphere. And then when they escaped into the atmosphere, due to Coriolis forces, they all went to the South Pole. Where it turned out, they slowly ate away at the ozone layer over the South Pole, eradicating the ozone layer over the South Pole, and then over our country, and then over the south of Australia, making it very, very likely that Josh and I are one day going to get skin cancer,
00:15:35
Speaker
and die prematurely due to a mole on our face or back. Pretty much. And, yeah. Now, at the time people were reporting,
00:15:48
Speaker
unusual amounts of CFCs and an unusual lack of ozone over the Southern Hemisphere, there was a big scientific debate as to whether this was a physically possible process. And it turns out that the modelling indicated this couldn't happen.
00:16:08
Speaker
And then the data indicated it was happening, but then there was a second set of data that said no, the first set of data is wrong. And it turned out that actually the model was incorrect, the data was correct. And the so-called countervailing evidence was actually due to a methodological flaw in the way that things were being measured.
00:16:30
Speaker
in that when the NASA satellites were measuring ozone over the Southern Hemisphere, they were disregarding what they thought were anomalous readings, where it turned out the anomalous readings were, there's not much ozone there where there should be. And they were going, well, that obviously is an error. We'll throw that data out. And actually it turned out they should have been including that data in their data set in the first place. And Neil goes, look, this is a great example.
00:16:57
Speaker
of science actually figuring out a problem and then resolving it because the results of realizing that actually CFCs cause ozone depletion meant that we basically don't use CFCs in refrigeration anymore, which is why the ozone hole is getting smaller over time, although it has occasional
00:17:19
Speaker
periods of enlargement. I think the last two summers in New Zealand, the ozone layer has been thinner than it has been in decades, leading to an increased rate of... an increased case of Reese's Peanut Butter Cups, and also skin cancer.
00:17:36
Speaker
So, yeah, Neil goes, this is a great example of science going well. And yet part of the story here is actually that it's a little bit of epistemic luck that someone went, actually, we should just check the methodology of the satellite stuff, because
00:17:52
Speaker
one of the sets of data might be wrong, and they applied skepticism to what turns out to be the data set that is actually wrong. So he's going, this is a great example of science working well. And I'm actually not entirely sure this is a very good example of science working well. This is an example of science
00:18:11
Speaker
accidentally working out what went wrong and then resolving it.

Skepticism in Science Methodology

00:18:16
Speaker
But it doesn't seem like it was actually part of the scientific process that they were going to get to that particular conclusion. Which is to say, part of the problem in this chapter: when science works, Levy says it's great,
00:18:30
Speaker
when science doesn't work, he applies proper caution about where we need to be skeptical of the data. He does seem to be cherry-picking the scientific examples that work for him. Yeah, and I think we'll see that in chapter four as well. He does go back and forth a little bit on when it's good and when it's bad and when it's working, and there seems to be a decent amount of 'it depends' that comes in from time to time.
00:18:57
Speaker
Yeah, and there's no theory here that allows us to demarcate between when science is working well and when science isn't working well, other than: this example works for my argument, this example doesn't. So yeah, this is going to give us a kind of general tenor, I think, of both this chapter and the next one. But he then moves on, in the section called 'Belief Is Shallow', to the

Truth-Seeking Senses and Evolution

00:19:26
Speaker
naturalized epistemology argument. So let me give you the quote and then we'll talk about the problems of naturalized epistemology. So he wants to talk here about how belief is shallow, that actually most of our beliefs are very shallow copies of how the world actually is. So he says, the shallowness of belief is an instance of a more general phenomenon, the adaptive outsourcing of cognition to the world.
00:19:54
Speaker
Evolution is sensitive to small differences in costs, and it is cheap to outsource representations. Why go to all the trouble of building a model of the world, for instance, when the world is easily available to represent itself? As an added bonus, the world is a more accurate representation of itself than any model could be. It represents itself without any loss of detail, on a handy one-to-one scale, and updates in real time.
00:20:21
Speaker
As Rodney Brooks famously put it, the world is its own best model.
00:20:25
Speaker
So long as the costs of accessing the world are not significantly higher than the costs of accessing an internal model, and they may often be lower, not higher, we ought to expect the job of representing the world to be taken on by the world itself. That is, rather than consult an inner model, we should expect organisms to use sense perception to track the world and how it changes over time.
00:20:52
Speaker
And this basically is the argument from naturalized epistemology, which was an epistemic thesis particularly popular in the latter part of the 20th century, that says: look, the reason why we should be externalists, people who think that justification is a process which might be both internal to the mind but also depend on external factors out in the world, is that our sense organs have evolved over time
00:21:21
Speaker
to be truth-seeking sense organs. Our eyes are there to detect what's in the world. Our tongues are there to be able to taste and detect various flavors or toxins. Our hearing is there to be able to experience the audible world, etc, etc.
00:21:39
Speaker
So the evolutionary argument is that our senses evolved to be truth-seeking, so we don't need to develop sophisticated internal models of the world; we can simply rely on the fact that our sense apparatus gives us access to the most accurate model of the world, the world itself.
00:21:57
Speaker
And the problem is there are two arguments, which he doesn't really talk about, that are a problem for this kind of naturalized epistemology.

Perception vs. Reality Debate

00:22:08
Speaker
The first is what's called the epistemic-ontic gap, which is
00:22:13
Speaker
there's a gap between our ability to perceive the world and being able to make claims about the ontology of the world. So being able to perceive things in the world doesn't tell us that's the way the world actually is, it just tells us we perceive the world in that particular way. We can't move from
00:22:33
Speaker
our epistemic claims to our ontological claims; there is a gap between the epistemology of sensing how we think the world is and an actual ontological model of how the world actually is. And this is a problem for naturalized epistemology
00:22:50
Speaker
because there's a rival thesis that would be put forward by evolutionary biologists, which is: you fool, why would you think your senses evolved to be veridical or truth-seeking? Your senses evolved to avoid harm
00:23:06
Speaker
and to increase your survival and reproductive success. Your eyesight simply needs to be good enough to be able to navigate the world, eat food and have sex. There's no reason why those senses have evolved to then be able to detect fine-grained detail about the world. That might be a consequence of evolutionary pressures, but there's no reason to think that's what your senses evolved to do.
00:23:33
Speaker
And it would be naive to think our senses are a good way of looking at real models of the world under that particular framework. And Neil doesn't talk about this stuff at all, that actually the evolutionary argument for epistemic success is something which is highly contested by epistemologists.
00:23:55
Speaker
Yeah, there's a quote in one of Jason Pargin's books, I can't remember which, where someone's talking about how what we think of as the totality of everything is basically like a 270-degree cone of a very limited section of the electromagnetic spectrum, plus vibrations in a fairly limited frequency range of sound waves. Yeah, yeah, there's so much about the world
00:24:24
Speaker
we don't have sensory access to. But yeah, it seems hard to be able to say that that's all you need. Yeah. So I do find this entire section... and, I mean, I'm sympathetic towards naturalized epistemology. So even though I've just said, look, there are two major issues with it, I still, because I'm largely an externalist and I'm mostly a reliabilist,
00:24:52
Speaker
I tend to find that something along the lines of a very sophisticated naturalized epistemology is probably what I take to be a nice epistemological model. I just think the epistemic-ontic gap means we need to be very cautious about making claims about having knowledge of the world. But I'm also one of those very annoying epistemologists who goes, well, I'm all here for the justified belief stuff; the 'true' part of justified true belief I like to leave to one side, because I have no reason to think my beliefs are true,
00:25:21
Speaker
but I do have a reason to think my beliefs are justified. The other thing about this section that annoyed me is that he doesn't do a very good job of explaining the difference between internalism and externalism.

Internalism vs. Externalism

00:25:35
Speaker
So he's talking about how it's great our
00:25:37
Speaker
perception allows us to access an accurate model of the world, which is an externalist thesis. He doesn't really talk much about what internalists think about these particular claims, where the claim is that justification actually goes on within the mind, regardless of what's going on in the external world. And there's no real discussion of that. It's just assumed the audience knows what an externalist thesis is and why it's going to be a good one.
00:26:07
Speaker
That being said, he does use this to talk about how we are actually easily fooled by information in the world. And this chapter is filled with the word 'saccade'. Saccade occurs everywhere. He loves his saccades, and we're going to see that here, because he says: a central piece of evidence that our visual representations are extended consists in their shallowness, where a representation is shallow if it is easily uprooted.
00:26:35
Speaker
If beliefs are extended, then we should expect to see evidence that they are shallow in similar ways.

Perceptual Illusions and Saccades

00:26:46
Speaker
We take ourselves to have rich and detailed internal representations of the visual scene. But careful experimentation shows that we actually retrieve a great deal of visual information by rapid and unconscious saccades, as and when we need it.
00:27:02
Speaker
As a consequence, our visual representations are shallow. Under appropriate conditions, we may fail to notice the substitution of one person for another because our visual representation is, largely, though of course not entirely, stored outside our heads. Similarly, we take ourselves to have stable beliefs. But if our beliefs are shallow, then we should expect them to be easily uprooted through analogous processes.
00:27:29
Speaker
I see you're doing a bit of searching there in the background. Were you looking up saccades, Josh? I had. The dictionary says it's pronounced 'saccade'. Really? I don't know. As in Jean-Luc Saccade? Something like that. So it is not, but interestingly enough, a saccade is not only the series of small jerky movements of the eyes when changing focus from one point to another; it's also the act of checking a horse quickly with a single strong pull of the reins.
00:27:56
Speaker
Really? Think about that. I am. I'm checking... I was going to say checking a horse quickly with your eyes, because then maybe that's where the... so, yeah, I wonder how those two words ended up being used for two different things.
00:28:12
Speaker
I don't know. Although I guess if you're checking a horse with a pull of the reins, it is a sudden movement. So maybe it is. Yeah, maybe. So the idea with saccades is: we kind of think that our vision, when we move our eyes around to focus on scenes, we're slowly making our retina move from one point of the scene to another. But actually, it turns out our eyes are always twitching. If you're taking in a large scene, your eye is twitching
00:28:38
Speaker
upwards and downwards, side to side, to gather as much data as possible, in part because the colour part of our vision is incredibly tiny. So if we actually kept our vision straight, you'd have this black and white widescreen with a kind of 4x3 colour inset in the middle.
00:28:59
Speaker
And so because we aren't aware our eyes are twitching all the time, very rapid movements, it's actually possible to produce experiments where you can fool people into thinking that they're seeing something they're not. So the example that Levy uses in this chapter is that if you track someone's saccades properly, you can get them to read a piece of text,
00:29:26
Speaker
And as long as you keep the words of the sentence in the right location, you can change every single other word on the page. And people will think they're reading a piece of stable text, even though as soon as their eye moves away from the first word they've read, you can replace that word, as long as the eye just continues tracking the sentence.
00:29:47
Speaker
You can change every single word on the page and people will not report it changing, because you've got it in sync with the saccades. The example I use in my Critical Thinking class: you can take very finely detailed colour images which appear to be animated
00:30:08
Speaker
which actually turn out to be still images; they simply exploit the fact that, because your eyes are moving constantly to capture as much colour detail as possible, the image looks like it's moving when it's not. And so these are fairly common visual illusions that you can demonstrate in labs to go, look,
00:30:26
Speaker
you think you're taking in a wonderfully detailed scene very accurately but actually it turns out visually you are very very easily fooled because you don't know what your eyes are doing.
00:30:39
Speaker
And it's just a matter that so much of the image we get is, there's a lot of post-processing done by our brains, basically. There's the various, the optical illusions where you have sort of arrow-type shapes and then straight lines drawn across them, and the lines appear to be bent, and that's because essentially your brain is predicting the future. It sees the arrow and thinks, okay, it's moving in that direction, so assumes the line will be going in that direction, even though it's not. There's all sorts of ways.
00:31:08
Speaker
that what you see is not necessarily what is actually there. Everything we see is upside down, and yet somehow the brain goes, I don't want to look at the world upside down, I'm going to rotate everything. Which is why you can give people glasses that invert their vision,
00:31:24
Speaker
And after a few minutes, you are able to operate in the world normally again. And then you take the glasses off and then you get the same kind of discomfort of having to readjust. Yeah, there's a lot of post-processing that's going on in the brain. Did we get the term post-processing before or after computers? I don't know. Post-processing is a filmmaking term that probably predates computer effects.
00:31:51
Speaker
Yeah, no, you're right. So did we get that term before or after we started doing visual-medium stuff? So yeah, he does use this shallowness idea to go, look, there are problems with having shallow beliefs, which is why we need to rely on other people.

Financial Incentives and Argument Shifts

00:32:13
Speaker
If our beliefs are shallow, and we only use our own cognition
00:32:19
Speaker
to do things, rather than work within communities, that's why things are going to go so astray. These visual illusions only work because it's individuals looking at a scene. As soon as you get other people involved, people start spotting the problems and go, actually, that's a visual illusion there, and you then gain knowledge that you are being fooled.
00:32:41
Speaker
And he also points out there's some really interesting material on how you can manipulate belief via incentives. And he uses here an interesting example, which is getting students to write an essay whereby they are asked to argue for increased tuition costs.
00:33:02
Speaker
And as he points out, you can do this in one of two ways. You can ask students to write an essay arguing for increased tuition costs by paying them money to do so. Or you can tell students, look, we've already got quite a lot of essays that argue that tuition costs should go down,
00:33:24
Speaker
and so, to have a comparison class, it would be quite great if some of you could at least write an essay arguing tuition costs can go up.
00:33:33
Speaker
And it turns out the kind of people who end up arguing that tuition costs should go up, who aren't financially incentivized to do it, seem to provide better arguments for increased tuition costs than those who are paid to put forward arguments for increased tuition costs.
00:33:56
Speaker
So it seems that in the process of forcing yourself to argue for increased tuition costs without there being any financial incentive, you are more likely to come up with a good argument for that position than the person who's going, well, I'm going to get five bucks for this. So I can just write something about how capitalism is great. There we go. Capitalism is great. Put up tuition costs. Where's my five bucks now?
00:34:20
Speaker
It is the argument for capitalism itself, I guess. What's the point of this? Simply that our beliefs are malleable. Well, in this context, you see, it's the ease with which we are led to self-attribute beliefs, the idea that the people who end up arguing for increased tuition costs do so because they had persuaded themselves.
00:34:41
Speaker
The way we lead to self-attribute beliefs by these kinds of manipulations suggests that we lack detailed internal representations of our beliefs, just as our internal representations of the visual scene lack detail and are easily swamped by changes in the external world, so long as gross features are retained. Why is that dinging away? I'll start that again.
00:35:10
Speaker
The ease with which we are led to self-attribute beliefs by these kinds of manipulations suggests we lack detailed internal representations of our beliefs. Just as our internal representations of the visual scene lack detail, and are easily swamped by changes in the external world, so long as gross features are retained, a white man dressed as a construction worker, as in Simons and Levin, then so our beliefs can be swamped by quite weak evidence that we believed something else all along.
00:35:39
Speaker
So, Neil's argument here is that people are persuading themselves, through the process of writing these essays in favour of increased tuition costs, that they actually now believe tuition costs should go up. Right. Okay, so where does he take it from there? Well, this all leads to the following claim in the next section, outsourcing and belief shift in the real world.

Social Groups and Belief Shifts

00:36:04
Speaker
So, he sums up. The evidence just cited is evidence that our belief states are often much less rich and much less stable than we would have guessed. While of course we have internal belief states, they are remarkably shallow. When we ask ourselves what we believe, we look as much to the world, especially the social world, to answer the question as to our own mental states.
00:36:26
Speaker
In the face of evidence that I believe that P, or that people like me believe P, I conclude I believe P, and I may do so even if I previously had a different belief without noticing the shift. And this is where he comes back to Lindsey Graham and conservatives. He goes, well look,
00:36:47
Speaker
Lindsey Graham used to believe that P, that Donald Trump is a bad president; he was a never-Trumper. Then he came to believe not-P: he became a forever-Trumper who thought Trump would be the greatest president of all time. He goes back to the claims he made in previous chapters about how
00:37:06
Speaker
Conservatism appears to be a politically incoherent belief. So, as theorists, we need to be able to explain these shifts. Why did Lindsey Graham move from being a never-Trumper to a forever-Trumper? How do conservatives explain their contradictory or irrational beliefs? Now, we kind of talked about that in the last episode,
00:37:28
Speaker
and that Neil has a rather primitive view of what it is to be a conservative, and that you can unpack the idea of being fiscally conservative, socially liberal, or socially conservative, fiscally liberal in a variety of different ways that actually do make sense according to conservative ideology. Neil takes it that actually it's just an incoherent thesis.
00:37:55
Speaker
I think that even though I'm not a conservative and do not want to know any conservatives, I don't think it's an inherently incoherent thesis. I just think it's not a thesis which can be operationalized in a way that actually makes any sense politically. Anyway, politics aside, he's saying we need to be able to explain why these people change their beliefs.
00:38:23
Speaker
This is where I think Tucker Carlson comes in, because it is quite possible that Lindsey Graham never changed his beliefs at all. He simply changed what he was going to say in public about his beliefs.

Lindsey Graham: Sincerity or Signaling?

00:38:38
Speaker
And the reason why Tucker Carlson is important to bring out here is that for a very long time,
00:38:44
Speaker
Tucker Carlson was one of the biggest Trump boosters on Fox TV you could possibly have. There was a period of time in our lifetimes where Tucker Carlson was the most popular Western media commentator in the world and the best paid Western media commentator in the world, with his Tucker Carlson Tonight show.
00:39:07
Speaker
And on that show, he talked about how great Donald Trump was all the time. He was really, really into Trump. He was very pro-Trump.
00:39:17
Speaker
And then when Fox got sued for all of the, you know, that racism and stuff that was going on behind the scenes, the transcripts of Tucker Carlson's emails and correspondence to other Fox hosts came out. And it turns out at the same time that he was on TV saying Donald Trump is great, he was privately telling his bosses and his colleagues that Donald Trump was the worst thing to ever happen to the American presidency.
00:39:45
Speaker
So on one level, Tucker Carlson was attesting that Trump was great. And on another level, he was in private telling people that Trump was awful. Because just because you attest to a belief publicly doesn't mean that you sincerely believe the propositional content of that belief. Sometimes you are simply signaling to other people in your social group, your class, your caste, etc.,
00:40:15
Speaker
that it's better that our person wins or our people are in charge. And Neil basically doesn't delve into any of the social signaling stuff. We don't need to explain all of these so-called bad or irrational beliefs with respect to people sincerely believing them, and then needing to explain how they once sincerely believed P and now sincerely believe not-P.
00:40:44
Speaker
In some situations, and possibly many situations, people have always believed not-P, and just attested to P because it's a pragmatically or politically useful belief to adhere to, or vice versa.
00:40:58
Speaker
And so Neil wants to explain all of these bad beliefs with this kind of epistemic framework, when actually it's not clear that all of these beliefs are in need of that epistemic framework. Now, he could respond by simply saying, oh, I'm only interested in a very particular kind of bad belief, the sincerely held bad belief.
00:41:21
Speaker
whereby people believe P at one time, believe not-P at another time, and we need to explain how that shift occurs. But as he gives us no mechanism, and doesn't even talk to any real degree about how we can distinguish between sincere belief and pragmatic belief, it's not clear how we can apply his theory in the light of rival theories that already explain the same kind of belief shift.
00:41:51
Speaker
Yes, I mean, so much of this can just be written off as hypocrisy. Yeah, and the thing is, I agree with most of the stuff he argues for: how shallow our beliefs are, how we outsource our beliefs, how we are very reliant on the beliefs of other people. I've got no qualms with his survey of the literature.
00:42:14
Speaker
I just don't think he's treating seriously the possibility that actually hypocrisy explains an awful lot of this. Or at least enough of it that his theory can't be as general as he makes it out to be. Yes, now listening to this, did you ever listen to the podcast In The Dark?
00:42:35
Speaker
Oh, true crime? No, it's one of the true crime ones I didn't listen to. Yeah, season one, they look at the murder of a young boy, which led to a whole bunch of law changes in the States and stuff. And a large part of it was about just the
00:42:51
Speaker
general incompetence of sheriff's departments in the States. And it got complicated when there was a major development in the 20-year-old case, the 30-year-old case, that they were talking about, which happened just as they were releasing it. But anyway, it was really interesting. Season two, I got about four or five episodes into and just gave up on it, because it was all about this guy, a black guy, who had been accused of multiple murders, and despite the case having been appealed, and successfully appealed, he'd been retried like half a dozen times.

Racism in Trials and Hypocrisy in Crime Podcasts

00:43:22
Speaker
And you're listening through this thing and saying, why do they constantly go back to this guy, even though the case seems to be so weird and keeps getting thrown out? And they're just going, yeah, it's racism. It's racism. Yeah, yeah, we hear you, it's racism.
00:43:36
Speaker
And I don't know, I just had the same experience listening to a lot of that saying, you know, it's hypocrisy. Like I can see you're building a bit of a case there, but you know, it's just hypocrisy. Yeah. Yeah. And he doesn't take that threat to his argument seriously. And I think it's weird that it's not talked about. And that is the end of chapter three. Well, I mean, it's not because actually I kind of stopped
00:44:05
Speaker
two-thirds of the way through, because the rest of chapter three is just more case studies to try and show that his theory applies, and they're case studies where you go, yeah, but you could just use signaling or hypocrisy to explain that.
00:44:20
Speaker
At which point I was going, I'd just be beating a dead horse. So let's now move on to chapter four, shall we? And that is the end of what we have to say about chapter three. On to chapter four. Yeah, chapter four is Dare to Think, question mark. I don't know why it's called that. I don't remember seeing any callback to the actual title. Now, this chapter, I've written a bunch of notes about it. I will see how much of them I end up actually using.
00:44:48
Speaker
It basically just seems to be a treatment of virtue epistemology. I made a note early on that one of the sections felt like one of those Reviewer B things that we see in papers from time to time, where there's this little bit stuck on and you wonder, why is that there? And then you work out it's probably because one of the reviewers said, hey, what about this? And so they had to stick in a few paragraphs to shut up Reviewer B.
00:45:17
Speaker
Initially it felt like that, like someone had said, what about virtue epistemology? And he said, okay, I'll stick in a bit about that. But then it ends up being the whole chapter, so I guess he does have a greater motivation to talk about it. But he begins his chapter, Dare to Think, by saying,
00:45:36
Speaker
In the last two chapters, I argue that beliefs are pervasively outsourced to other people and to the environment, and that belief revision often occurs in response to changes in the cues that scaffold our beliefs. I've suggested that in light of these facts, we need to ensure the scaffolding of better beliefs. We need, that is, to manage the epistemic environment. Bad beliefs are produced by a faulty environment, and better beliefs are best promoted by environmental engineering.
00:46:00
Speaker
and then goes on to sum up what's going to be happening by saying, in this chapter and the next, I'll focus on the prospects for cognition in the absence of environmental engineering and other kinds of scaffolding, given that this sort of scaffolding that he's going to argue we need to set up doesn't exist yet. He says, I'll argue that individual cognition unaided is very much less powerful than we tend to think. Worse, if we refuse to aid the individual cognizer, others will be all too eager to fill the gap, often in ways that leave them and us worse off.
00:46:29
Speaker
He does point out, though, there is a view that, quote, rationality is a scarce resource and that, for the most part, we respond relatively unthinkingly. But he says he isn't going to be endorsing that particular view. He says one of the aims of this book is to argue that we are, in fact, rational animals. If epistemic engineering is justified, it's not because we're not rational enough to respond to epistemic challenges. It's because we're never sufficiently rational on our own.
00:46:58
Speaker
again bringing up this idea that group cognition is something we need to rely on more, or that we can't rely on individual cognition exclusively; we have to rely on group cognition as well.
00:47:13
Speaker
So the next little section is called Regulative Epistemology. He starts by saying that we could bring up countless examples of individual reasoning going wrong. We could bring up plenty of examples of individual reasoning going right as well; tallying them up like that won't tell us much.
00:47:30
Speaker
So he decides he wants to look at the best possible case for individual reasoning, because obviously if you can find fault with that, then that's more conclusive. He says, I'll focus here on what might reasonably be taken to be individual reasoning given its best shot. I'll focus on individual reasoning in the form recommended by leading contemporary epistemologists who explicitly aim to develop practical guides to good reasoning. I'll focus, that is, on reasoning as recommended by regulative epistemology.
00:47:57
Speaker
That's not a term I was familiar with, but fortunately he gives a definition of it pretty much straight away. He says regulative epistemology offers us news we can use: practical precepts and advice on how to think better in the pursuit of knowledge. As it's been developed to date, regulative epistemology is individualistic.
00:48:13
Speaker
It's addressed to individual thinkers, offering them advice on how to gather and collate evidence, how to weigh it, how to avoid error, and what warning signs to look out for on the path to knowledge. Regulative epistemology has been developed most fully by virtue and vice epistemologists.
00:48:28
Speaker
A little bit later he says that virtue epistemology in its regulative form may have a role to play in guiding us toward better belief, but I will suggest that its role is extremely limited, which indeed is pretty much the theme for this chapter. So virtue and vice epistemology, it's come up before. I forget exactly which papers we've looked at, but I know it is a topic we've discussed before, isn't it?
00:48:49
Speaker
It is, yes. So virtue and vice epistemology came out of the development of virtue ethics. Virtue ethics asks, how can we inculcate the right virtues to be ethical or moral people? And then people went, actually, there are probably also epistemic virtues. So what
00:49:07
Speaker
epistemic virtues can we inculcate to make people better thinkers, better able to deal with knowledge claims in the world? And as Levy points out, it's quite rightly thought of as a kind of regulative epistemology. So prior to the middle of the 20th century, and basically the work of Gettier with his critique of the justified-true-belief analysis of knowledge,
00:49:32
Speaker
most epistemologists were offering practical advice: how can you gain more knowledge about the world? After Gettier's paper, epistemology started looking inwards and got very focused on what do we mean by justification, what do we mean by truth, and offered less advice to people on how you actually gain knowledge and how you sustain it over time.
00:49:54
Speaker
But virtue epistemology is regulative by its nature, in that if you tell people these are the virtues you need to be a good thinker, then you're also telling people how they should think in the world.
00:50:09
Speaker
And as people were doing virtue epistemology, people went, you know, there are also vices. There are vices in ethics that lead towards bad moral paths, so presumably there are also epistemic vices which actually make it hard to gain knowledge. And so there's been a renewed interest, not just in what virtues you need to be a good thinker, but what vices might be inhibiting
00:50:36
Speaker
people or classes of people from gaining knowledge about the world.
00:50:41
Speaker
So basically, it's all about, as you say, inculcating epistemic virtues in us. That can be done directly, by educating people into these virtues and the fact that they should follow them, or simply by guiding our inquiry so as to push us towards virtuousness. So it's not saying, here's a virtue, do it; it's, here's how to conduct your inquiry in the way that a virtuous person would.
00:51:10
Speaker
But having introduced this view, defined it, and talked about it, Levy pretty much writes it off straight away. He says, I doubt that virtue epistemology in its regulative form will be especially helpful to us. He says, I'll suggest that the virtues are not especially valuable as a means of regulating our epistemic conduct in the service of the acquisition of knowledge. Perhaps the virtues conduce to knowledge, but they do so to a very limited extent, and then only in appropriately structured epistemic environments.
00:51:37
Speaker
inculcation of the virtues is helpful, if it is helpful at all, not as an alternative to scaffolding inquiry and restructuring the environment, but only in conjunction with these measures.
00:51:48
Speaker
Now, when I read that section of the chapter, I thought that was going to be the end of the discussion. I did, yes, that's exactly what I thought. And yet it then spends another 20 pages going through it. Now, admittedly, there are some really interesting examples in those 20 pages, but at the same time, you don't get any further evidence that virtue epistemology isn't going to do the job he wants done. He could have just ended it there and moved on to the next chapter.
00:52:16
Speaker
Yeah, I thought that was the end of this bit. This is the point where I said it sounded like a Reviewer B thing. He's brought up virtue epistemology and then said, no, I don't think it's going to be much good. In fact, I guess I misread it. He says, I'll suggest that the virtues are not especially valuable, and so on and so forth. I thought that was the statement he was making, but no, that's his statement of intent for the rest of the chapter.
00:52:43
Speaker
The next section is called Inculcating the Virtues. He says virtue epistemology has the same kinds of problems as virtue ethics: both fail to be action- or conduct-guiding. They're very big on the theory, but it's difficult to pin down exactly how one should act.
00:53:06
Speaker
It's the same idea as virtue ethics, which you might be more familiar with, and which I guess goes back to the ancient Greeks one way or another. It's all grounded in this idea that practical wisdom is what you need to acquire the virtues. But even back then, there was always a bit of a
00:53:22
Speaker
paradox there: you need practical wisdom to get the virtues, but you need the virtues to know what the practical wisdom is. It all gets a little bit strange and vague. Apparently it is a classic case of I know it when I see it. Yes. Yeah, yeah.
00:53:37
Speaker
So apparently, virtue epistemologists and also virtue ethicists will respond by saying, quote, the wish for anything more precise to guide action and intellectual conduct is a fantasy, insisting that doing and thinking the right thing really just is a matter of difficult judgment. So when people say,
00:53:54
Speaker
you tell me I need to do these things virtuously, but how do I know what they are? It's hard to tell what the virtuous thing to do is. They're like, yes. Yes, it is. That's the point. So Levy himself says that he is agnostic on the extent to which virtue epistemology and its analyses are genuinely helpful. If virtue epistemology can help, however, it's not by substituting for apt deference to others and socially distributed cognition. It's just by playing a small role in helping us to do these things better.
00:54:24
Speaker
Virtue epistemologists may be able to contribute something, but in its current guise, their explicit recommendations are far too individualistic. And so again, here we see the thing: he wants to argue for, as he put it, socially distributed cognition, but virtue epistemology and virtue ethics are all about inculcating these virtues in the individual.
00:54:44
Speaker
So that's what he's wanting to get away from. And in the next section, he says he's going to use open-mindedness as a case study. Open-mindedness is supposedly a virtue, the sort of thing that a virtue epistemologist will want to promote. So he wants to see how a virtue epistemologist would delineate the virtue of open-mindedness from a kind of intellectual flaccidity on the one hand, and from dogmatism on the other, and provide concrete guidance for intellectual inquiry.
00:55:14
Speaker
He says right up front he doesn't think this is going to work as a central focus of an explanation of bad belief and a remedy for it, but I do appreciate the fact that he used the word flaccidity and I got to say it in a philosophical context. So this leads on to the next section, which is simply Open-Mindedness as an Epistemic Virtue.
00:55:33
Speaker
So, as he says, everybody knows you shouldn't be too open-minded. There's all the jokes about, you know, you open your mind too much and your brain will fall out. So it is possible, and this is something, of course, that comes up in virtue ethics and virtue epistemology, the idea of these virtues being situated between two extremes. So open-mindedness is sort of in between the two extremes of like credulousness on one side and close-mindedness on the other.
00:55:59
Speaker
Now, he talks about two different views. Kripke, that's Saul Kripke. Have you heard of Saul Kripke? Died last year, or the end of the year before last, I think. Apparently not very nice towards women.
00:56:13
Speaker
Well, at any rate, he argued that you are justified in dogmatism in certain cases, justified in being dogmatic in the face of certain claims. Basically, if something is settled, if something is just a known fact and something else comes up, and he gives the example of astrology, you are justified in simply saying, nope, that's wrong. Look, I don't know enough to specifically refute your claims, but I'm just going to say they're wrong.
00:56:41
Speaker
And then we get your good friend Cassam, who argues, no, you're never justified in that. Dogmatism or closed-mindedness is an epistemic vice; you should never do it. And so then we get the back and forth between the two.
00:56:54
Speaker
Now he does say, of course, that Cassam accepts that the virtuous agent should be slow to abandon their justified convictions in the face of arguments they can't immediately see how to refute. We should never be dogmatic, he maintains, but we often ought to be appropriately firm in our opinions. Open-mindedness is the mean between intellectual flaccidity and dogmatism.
00:57:15
Speaker
Yeah, immediately it starts to get a little bit fuzzy. So we're saying you shouldn't be dogmatic, but you should be firm in your beliefs. How firm? But not firmly dogmatic. Not firmly dogmatic.
00:57:30
Speaker
You should be just a tad shy of dogmatic. You should have that kind of firmness that could almost be dogmatic, but doesn't quite go over the limit into dogmatism. Apply your practical wisdom, Josh. You'll be able to tell the difference between firmness and dogmatism.
00:57:47
Speaker
Yes, and so this comes into the question of engagement. Kripke would say, you just shouldn't engage with arguments that you know are false. Just shrug your shoulders and move on. Whereas Cassam is going to say, you should engage. He puts it in terms of confidence. It's like, well, okay, if you don't
00:58:05
Speaker
If you don't know enough to refute this view, then how confident can you be that you're right? You should engage with these views, even though you're quite certain that they're wrong, because refuting them shows that you're confident in your beliefs, and therefore justified in calling these beliefs knowledge.
00:58:26
Speaker
I mean, skipping to the next section, he's going to do a case study showing that it's actually very, very hard to be firm, not just in a complicated debate, but even in a debate where you know an awful lot about what's going on.
00:58:45
Speaker
He certainly disagrees with Cassam here. He says, I'll argue that insofar as Cassam urges us ordinary agents, who lack any special expertise in the domain of the argument, to tackle these arguments on our own, he's wrong. Engaging with them risks knowledge to a far greater extent than does dogmatism.
00:59:02
Speaker
And Levy reckons that the root of this is a disagreement over how easy it is to discover where a spurious argument goes wrong. Kripke seems to think it can be quite hard to point out exactly why an argument is wrong, whereas Cassam seems to think, no, if it's something that you know is wrong, it should be fairly easy to
00:59:22
Speaker
show why this particular spurious argument is wrong. And again, he disagrees with Cassam. He says, I argue that Cassam is wrong on the empirical question. In fact, we're at much greater risk of losing knowledge from, quote, doing our own research, unquote, than we are from dogmatism. It's true that we are often able to rebut spurious claims, but that's not by probing them for ourselves; it's by apt deference. The intellectual virtues play only the smallest of roles in any of this.
00:59:50
Speaker
And so he then goes into the next section, climate change skepticism, Holocaust denial, and other fantasies by, as the title suggests, talking about a few different case studies.
01:00:01
Speaker
So he starts by saying, one of Cassam's principal aims in developing vice epistemology is to enable us to understand the origins and the persistence of conspiracy theories and the like. He argues that epistemic vice is an important factor in explaining why people accept these theories. And I note that he has a little footnote pointing out that he is sympathetic to the idea that we should actually ditch the pejorative use of conspiracy theory, but he's going to stick with the more
01:00:26
Speaker
colloquial usage, I guess, where a conspiracy theory is by definition irrational.
01:00:33
Speaker
Although it is interesting, he cites both David Coady and Charles Pigden there, as if they're making the same claim. David's claim is that we should just stop talking about conspiracy theories entirely. He thinks the term is equivalent to witch hunt, and we shouldn't call something a witch hunt because it's obviously a bad term. Therefore, we should call nothing a conspiracy theory. We should simply refer to them as, you know,
01:00:59
Speaker
explanations; when they're misinformation, misinformation; when they're disinformation, disinformation, et cetera, et cetera. Whereas of course Charles argues we should ditch the pejorative meaning of conspiracy theory. We should still talk about conspiracy theories; we just shouldn't talk about them in a pejorative sense. So I do kind of feel he's misrepresenting one of the two authors he's referencing there.
01:01:24
Speaker
At any rate, that was but a footnote, and he doesn't really talk about conspiracy theories so much later on anyway. The section restates a lot of what came before it: simply the idea that it can be hard to know exactly what's wrong with a spurious claim,
01:01:41
Speaker
especially when it relies on scientific claims, or at least claims which purport to be scientific, if you see what I mean, that a layperson might not know much about. And he says it's worth adding that the multidisciplinary nature of climate science, like many other areas of contemporary science, entails that many actual climate scientists may lack the skills to rebut the sophisticated skeptic. And this was something that came up in the previous chapter as well. Climate science is such a wide-ranging thing that it draws from so many other
01:02:09
Speaker
disciplines that it's highly unlikely that any one scientist, or any one scientific discipline, knows all that needs to be known. They have to draw from other areas.
01:02:19
Speaker
So he brings up the case of objections to the idea of anthropogenic climate change being overwhelming. He says, it may be important that someone rebuts the claim. That will depend on its novelty. The great majority of climate scientists will outsource the job and defer to whoever does it. For the most part, this deference will itself be dogmatic. They won't search for rebuttals. Rather, they'll move on, confident that if the claims are worth engaging with, someone well placed to do so will take on the task.
01:02:47
Speaker
They'll deal with challenges they take to be worth taking seriously in their own areas instead. I like the mention of novelty there, because that is something I've talked about, and we've talked about, before: the idea that there are claims that maybe need to be looked at or rebutted, but in lots of areas you see the same bad claims coming up time and time again.
01:03:11
Speaker
So, you know, certainly there are a whole lot of claims that have no novelty whatsoever. They've been brought up and debunked many times before, a long time in the past. So in those cases, yeah, there's perhaps much less of a need to rebut them than a brand new challenge to something.
01:03:28
Speaker
But, so, climate change is complicated; climate change is very difficult science that draws in lots of disciplines and so on. So he says, well, can we look at a simpler example? How about Holocaust denial? That apparently is Cassam's preferred example for talking about these things.
01:03:46
Speaker
And he says, well, Holocaust denial, that's a question of history, essentially, and the history of the Holocaust, or of World War II-era European history or what have you, may not be as specialized and as multidisciplinary a field as climate change, but you still may need a fair bit of specific expertise to rebut specific claims made by Holocaust deniers.
01:04:14
Speaker
And he goes into the idea of how easy it is to be tripped up by a lack of very specific understanding. He first talks about the thing that we've talked about before, Naomi Wolf, and her embarrassment when she was challenged on the idea that
01:04:32
Speaker
I forget which of her books it was, but she talked about the idea that the persecution of homosexuality became much greater in one period. And she uses evidence of the increasing incidence of the phrase death recorded next to the names of people who had been convicted of sodomy. And she took this to mean that a whole lot more people were getting the death penalty for homosexuality.
01:05:00
Speaker
And until a person said to her, no, actually... Not just a person, Josh. Matthew Sweet, cultural historian, and crucially, Doctor Who historian. Yes, indeed. Matthew Sweet, and he was the one who said to her, well, no, actually, that death recorded means we wrote it down that they received the death sentence, but they didn't actually get the death sentence. That specifically means they weren't put to death for this.
01:05:25
Speaker
And let me point out, the thing is that her book, like, she didn't just write it at home and chuck it out there, it had been reviewed, it had been fact-checked by people who, he said, you would think would know enough, but it hadn't been fact-checked by people who had the specific expertise needed to get that particular claim right. And given that it's one that, it seems...
01:05:49
Speaker
He also points out that another legal scholar also makes the same mistake. So he's going, look, it's not just Naomi Wolf who made this mistake. Other legal scholars have made this mistake. Because it turns out you can be a legal scholar, but if you don't know the cultural history of legal practice at a period of time, you might not realize that this is actually an ambiguous phrase.
01:06:17
Speaker
You could argue that, given that that seems to be a claim that she was leaning on quite hard, you might want to do a bit more research. Whereas, you know, she said, I think it means this, she'd talked to some other legal expert who said, yes, I think that's what it means, but neither of them actually had the specific knowledge
01:06:40
Speaker
of that sort of thing happening. And so he says, expertise is brittle. He actually comes up with a couple of other instances of people getting things wrong, making not unreasonable assumptions, but what turned out to be incorrect assumptions because they lacked a very specific kind of expertise. And he says, expertise is brittle, referring to a forthcoming work by Kelov. I don't know who that is.
01:07:05
Speaker
But they quote this work as saying an expert in a particular domain is often unable to transfer their skills to another, intuitively similar domain. Take surgical skills, for instance. A person who's very skilled in a particular type of procedure, those skills don't necessarily translate to a different procedure, even if it seems like the same sort of thing, and various other things apply.
01:07:30
Speaker
He says expertise in a very specific domain may provide someone with the confidence they'll perform well in an adjacent area, but they may nevertheless lack the competence. Now, Cassam apparently acknowledges that, no, we can't all be experts in everything all the time. He says we do need to consult the experts, but then that brings it back to a sort of next-order question. Well, how do you know who's an expert? How do you know who you should be listening to without
01:07:55
Speaker
then evaluating them, and how do you know that? And so, Levy again says we should be dogmatic in these cases. He says, dogmatism here involves the proper scaffolding of inquiry, relatively unquestioning deference to authoritative sources because they're authoritative and not because we've assessed their degree of expertise ourselves. It's because they have the right credentials, primarily because they represent the expert consensus view or are endorsed by duly constituted epistemic authorities,
01:08:21
Speaker
that we should defer, not because we've virtuously probed their track records or their citation indexes, let alone because we've evaluated their arguments. Now, let me point out, he's just said dogmatism, but then defined it as relatively unquestioning deference. I would say Cassam would say that's firmness. That's not dogmatism. That's firmness, yeah. That's firmness there. Now,
01:08:44
Speaker
I still think this is an awful idea of dogmatism or firmness with regard to authoritative sources just because they're authoritative, but I don't think there's as big a difference between Levy and Cassam here as Levy wants to make out.
01:09:02
Speaker
possibly not, no. So after these examples, he goes back to talking about scientists and points out that scientists do tend to be fairly dogmatic. This is often taken against them perhaps, but it seems fairly
01:09:17
Speaker
It's not uncommon that scientists, when faced with evidence that contradicts their paradigm, they tend to disregard the evidence. Rather than saying, oh, no, this contradicts our paradigm, everything I believe is a lie, they most often say that evidence is probably wrong.
01:09:33
Speaker
I can't say why right now, but I'm going to assume it's probably wrong. And there are plenty of instances of that being the wrong attitude. I was going to point out, so on page 95, he makes several versions of this claim throughout this chapter, but this just gives you an idea of where he's placing science. If that's how scientists, our
01:09:54
Speaker
paradigms of epistemic excellence, should behave, that's even more the case for the ordinary person. So he keeps on leaning on science, because it's the paradigm of epistemic excellence, even though earlier on in the book he goes, well, you know, they're not that great.
01:10:13
Speaker
Yeah, again, there's a bit of back and forth as well. He says, well, obviously, it's sometimes a mistake. He brings up the good old example that we've talked about before of the cause of stomach ulcers. For a long time people thought it was stress, and then this fella came along and said, actually, no, it's H. pylori. Two fellas, two Australian fellas. Two fellas, was it?
01:10:33
Speaker
One of them infected himself with it, didn't he? He did, yeah. And because this disagreed with the current consensus, they were disregarded, but eventually they won people round. But he says, even though we can point to these cases where that sort of attitude is wrong, because it's the best approach in general, he says, anomalies are cheap and plentiful. The scientist can't abandon her research whenever she hears of one. That would mean abandoning her research forever.
01:10:58
Speaker
If she's to hang on to her knowledge, she'd better be able to respond by shrugging her shoulders and setting aside the many contrarian views she hears expressed every day. And so basically, at the end of it all, the moral of the story is that virtue epistemology is let down by its emphasis on individual cognition. That's what he said at the start; that's where he ends up. So we need that epistemological scaffolding that he's been making the case for.
01:11:22
Speaker
And then he talks about how he's been baffled by COVID-19 responses, in a section which I probably would not have included in a book that I'd written.
01:11:33
Speaker
No, it's a funny one. Again, it almost feels like, I mean, I said the whole chapter feels like a response to reviewers, and this one particularly. So he basically says, having said all this, let's have a look at something that seems like it's a counterexample to what I've just been talking about, but maybe it's not. And that's in his final section, Dissent in a Time of COVID.
01:11:55
Speaker
When COVID hit, all of a sudden everyone and their dog's an expert. Everybody suddenly now has opinions on epidemiology and the economic and psychological effects of governmental responses and so on and so forth. I became an epidemiologist during the COVID-19 pandemic, because it was reported on Stuff that University of Waikato epidemiologist M R. X. Dentith had the following to say. So I am now a
01:12:23
Speaker
credentialed expert on epidemiology, according to the kind of account that Levy wants to run. Was that just a poor assumption on their part, or did they mishear epistemologist? I think they wrote epistemologist down, and they got a little squiggly with it. That's not a real word, so autocorrect kicked in there. Oh, epidemiologist, no, that's a real word. That's a real profession.
01:12:50
Speaker
So he sort of says we have the situation where the thing that seemed to be the right approach was to listen to the scientists. Obviously not everybody did this, or people did or didn't do it to a greater or lesser degree. But the science seemed to say the best thing is we need lockdowns, we need social distancing. That's the way we're going to stamp out this disease. But then others bring up, well, OK, but there are definitely going to be economic costs to doing this.
01:13:19
Speaker
There are possibly going to be psychological costs to this, the people who deal poorly with the sort of sense of isolation and what have you. As he points out, these are all things to be thought about. It's not necessarily that the economic ones cancel out the scientific claims, but they're all things we have to listen to. And who do we listen to?
01:13:38
Speaker
It especially doesn't help that this was all speculative at the time. You don't know how well things are going to work, or what the possible adverse effects of doing something no one's ever done before like this will be, until after you've done them.
01:13:55
Speaker
He suggested a bit of the goalkeeper's fallacy was at play here. Apparently it's known that when you're taking a penalty, or just kicking a goal in soccer in general, shooting straight into the middle of the goal tends to score more often than going for the edges. I think this must be talking about a penalty, because otherwise, if you're running up on it, the goalkeeper has probably got a better idea of where you're going to go. But the point is that goalkeepers would rather be seen to dive left or right and miss
01:14:24
Speaker
than to anticipate the ball's going to come into the middle, just stand there in the middle, and then miss. It looks worse to seemingly do nothing than to be seen to do something, even if you fail. And so he sort of says a lot of governmental responses were possibly driven by this. It looks better to try something that may seem a little bit extreme than to do nothing, even if you think maybe that's a better response.
01:14:53
Speaker
So he sort of says, is this a counterexample to what he's just been talking about? The sort of deference to experts that he was advocating just a few paragraphs earlier seems to be less justified because there's a lot more
01:15:10
Speaker
uncertainty, a much greater spread of competing opinions from competing disciplines. There's the whole scientific consensus, or rather there's aspects of the scientific, epidemiological sort of consensus, but then there's the psychological factors, the economic factors, all completely different disciplines entirely.
01:15:35
Speaker
And he basically says, is this a counterexample? No, because it's not the same. It's a bit different. He says that we're now having to weigh up factors from completely different disciplines, and that's much harder to do, and there basically aren't.
01:15:55
Speaker
the authorities to help us make that decision, to adjudicate between them. And also in particular, the COVID-19 pandemic, this book was from, what, two years ago, was it? Was it 2022? I actually have it right in front of me, don't I? I can look it up. So this is much closer in time to COVID than we are right now. And you see, at the time he was writing, the pandemic was a very new thing, whereas comparing it to climate science, which he just said
01:16:22
Speaker
is this big, complicated, interdisciplinary thing, but we should nevertheless defer to what the scientists say. Climate science has been researched for decades, whereas the pandemic was very new. So it basically seemed to be that there were too many unknowns, or rather there are enough
01:16:43
Speaker
unknowns and uncertainty to justify a lack of, or less, deference towards people who were putting themselves forward as experts in this case. He finishes up by saying, all that said, I'm skeptical that the pandemic is a case in which any of us does better epistemically by making up our own minds. Individual cognition is limited and biased for reasons that are by now familiar.
01:17:04
Speaker
At this point in the development of knowledge we may appropriately contribute to the establishment of a consensus through stress testing, but for each of us it's very unlikely that our considered view is better than that of the epidemiologists advising governments, say. Even in this case, and in the absence of a justified consensus, almost all of us probably do better by deferring than by dissenting, though here the state of knowledge as a whole may benefit if we dissent.
01:17:28
Speaker
And that's the end of the chapter. And yes, like we said at the start, he basically stated the view that he went forward with right at the start of it, and this whole chapter has just sort of been going over it again, coming up with examples and restating things. But the whole thing basically seemed to be: virtue epistemology? No. And not much more than that.
01:17:53
Speaker
Yeah, it is a very weird chapter, and it's a very weird ending, because it's essentially saying we should defer to experts, but we don't know who the experts are. We should defer to the seeming experts. We don't know the actual experts, but we might as well defer, but dissent might be good. But we should defer to experts.
01:18:16
Speaker
Unless we shouldn't. Yeah. Which seems to be the entire tenor of the book. We should do X. Yes. Except when the examples don't work in my favour, and then we shouldn't do X at all.
01:18:26
Speaker
Yes, it seems increasingly fuzzy, but we have two more chapters to go, which we'll be talking about next episode. Is this like the other one, where there's a concluding thoughts section after chapter six as well? But that's only short, so I guess it is. Right, so that's where we leave things with Neil Levy's Bad Beliefs.
01:18:50
Speaker
and tune in next time for the thrilling conclusion of the trilogy. Yes, in which we find out... Return of the Jedi, if you will, or the Die Hard with a Vengeance. Well, we find out that the butler really did have those bad beliefs the entire time. Well, the psycho... Do you remember when there was that time when it was always the psychologist who was the bad guy? Around sort of the 90s, any time there was any sort of a thriller, if there was a psychologist involved, they always turned out to be the bad guy. I assume there was some sort of a... it was just a Scientology...
01:19:18
Speaker
influence, I don't know. I mean, maybe, maybe. I mean, there was that kind of cliche of psychologists being cold and calculating, which might well have been Scientology trying to get its hooks into Hollywood. I mean, I'm quite, I mean, having walked along
01:19:37
Speaker
Hollywood Boulevard and seeing the big Scientology TV and movie production studio, which as far as I know largely just makes infomercials for use within the church, I wouldn't be surprised if they're trying to have their influence on Hollywood films.
01:19:57
Speaker
So if it hadn't been for the trope of the evil psychologist, we would not have got the ending to the 1993 Drew Barrymore film Doppelganger, which I think you'll agree is the most berserk ending of any film that's ever been made. I mean, it has to be up there. I'm just going with up there.
01:20:15
Speaker
There might be more bizarre... There are certainly more strange and bizarre endings, but just the shift, just the going from the rest of the film to that ending.
01:20:29
Speaker
It's true. It is tonal whiplash, tonal whiplash. I would... if you have not seen the 1993 film Doppelganger with Drew Barrymore... This is not the first time we've discussed this on the podcast. No, it's not. You should watch it. It needs to be experienced. That's all. I'm not saying you'll like it. I'm saying you need to see it. Yeah. It's a little bit like... it's worth watching Orphan just to see the ending of Orphan and going, Oh,
01:20:59
Speaker
That's what she is. And then watching Orphan 2, a prequel, and going, you've actually done a really good job of making this work, given there's a good 10 years between them. In retrospect, it was hard to imagine her playing that part the first time, but now she's 10 years older and playing the same character in a prequel. Very difficult. Very strange. And apparently they want to make a third one.
01:21:27
Speaker
Of course they do. If the previous one made money, they can't stop themselves. So, that'll do. That'll do for now, I think. Although, of course, it won't, because we have to go and record a bonus episode. We don't have to. We're going to because we love our patrons. We choose to because they deserve it. Yes. Yeah, they do. We have to, in a sense; I think we have a moral obligation.
01:21:48
Speaker
And we do precious little else for our patrons. Well, exactly, yes. But what we do do, we do. It's something, or something. Yeah. Definitely something. It is definitely something.
01:22:02
Speaker
If you want to know what that something is, there'll be a bit of Trump, there'll be a bit of other stuff, who can say. You'll need to be one of our patrons, which you can just do: you can go to patreon.com and look for the Podcaster's Guide to the Conspiracy and sign yourself up. And it has been proven in a laboratory that becoming a patron of the Podcaster's Guide to the Conspiracy
01:22:23
Speaker
increases your happiness by 2.3%, which in today's economy is a massive change. That's a bargain. Yeah, 2.3%.
01:22:35
Speaker
No money-back guarantee. No, none whatsoever. So, to our patrons, stick around, we'll have a bonus episode for you, unless you're some sort of weird freak who listens to the bonus episode first. Although, to be fair, the bonus episodes are shorter, and so the person who's editing the episode may well get it up sooner. When I'm in charge of editing the patron bonus episode, I always wait for the main episode to go up, because otherwise it makes no sense.
01:22:59
Speaker
I sometimes do. But anyway, well, that's us for this week. That's just because you're a man with children, unless you have no patience. That's entirely true. So, that is all. That is the end of this episode. This is ending right now. That is what's happening. Goodbye. Goodbye! The Podcaster's Guide to the Conspiracy stars Josh Addison and myself,
01:23:27
Speaker
Associate Professor M R. X. Dentith. Our show's cons... sorry, producers are Tom and Philip, plus another mysterious anonymous donor. You can contact Josh and myself at podcastconspiracy@gmail.com, and please do consider joining our Patreon.
01:24:01
Speaker
And remember, keep watching the skies.