Introduction and Rebranding to Warhammer 40k Theme
00:00:00
Speaker
This week we are going to get religious. Yes, this week we are all about the Omnissiah. Sorry, are we a Warhammer 40k podcast now? The Omnissiah is a concept from Warhammer 40k. He, and it isn't a he, is what the tech-priests of Mars believe the Emperor of all humanity is: the embodiment of all knowledge, the Machine God.
Silicon Valley and the Singularity: Fear and Conspiracy
00:00:25
Speaker
Today we're talking about how Silicon Valley fears that an event called the Singularity will occur, leading to the advent of a superhuman, non-human machine intelligence that could prove to either save or destroy humanity. Hang on, this does sound like a Warhammer 40k plot point. Ah, but our twist is that a New Zealander is involved in all of this.
00:00:42
Speaker
Yes, and the origin of this New Zealander is itself a bit of a conspiracy, which means all of this is perfect for our rebranded 40k podcast. For the Emperor. For the Omnissiah.
Hosts' Background and Warhammer 40k Evolution
00:00:59
Speaker
The Podcaster's Guide to the Conspiracy, brought to you today by Josh Addison and M Dentith.
00:01:19
Speaker
Hello and welcome to the Podcaster's Guide to the Conspiracy. In Auckland, New Zealand, I am Josh Addison, and in Zhuhai, China, they are Dr. M R. X. Dentith. We're not a Warhammer 40,000 podcast, but we probably could be if we wanted to. I haven't played Warhammer 40k in a very long time, but I still remain quite interested in the fiction.
00:01:40
Speaker
And the way the fiction keeps on changing. For it is not a consistent fiction. Oh, no. It is a fiction in constant revolution and retconnery.
00:01:51
Speaker
Yes, yes. I play the games; I've been playing a bit of Space Marine 2 lately. But my experience of playing actual Warhammer 40,000 on the tabletop is limited to watching my brother and his friends play it when we were teenagers, which in my experience involved about one or two hours' worth of setting up, followed by one person making a move, their opponent saying, wait a minute, you're not allowed to do that, and the entire rest of the afternoon being spent arguing over the rules.
00:02:19
Speaker
I assume that's how it's played normally, but that's all I've ever seen. I used to field an army of Squats, and I'll just leave that there. Very good. But yes, as we will see, things are going to get slightly 40,000-ish as we talk about this week's topic, which is a topical topic.
00:02:41
Speaker
And we should probably play a chime and get straight into it. Oh,
Airport Anecdote: A Humorous Travel Misadventure
00:02:44
Speaker
actually, sorry, you've been up to stuff this week. You've been off to Beijing. You've done weird things. Do you have anything to report? Or has it just been admin-y, life-of-a-jet-setting-academic stuff?
00:02:56
Speaker
I spent nine hours in an airport. Haven't we all? Yeah, but I wasn't meant to spend nine hours in an airport, because at the university that invited me, someone forgot to actually book the tickets.
00:03:09
Speaker
Ah. So I turn up at the airport, hand over my passport: there is no ticket for you. I go, oh. I make contact with the organiser, who goes, well, I was told the tickets were booked, and then goes, oh, I've just found out from the administrators that they told me a lie.
00:03:25
Speaker
And then they tried to book me onto a later flight, and that later flight got delayed twice and we had three gate changes. It's been a lot of time in an airport, a lot of time in an airport.
00:03:39
Speaker
Did anything good come of it in the end? Well, it went from "I am angry and frustrated by this process" to, by the end, "actually, this is quite amusing", because they kept on delaying the flight and they kept changing the gate.
00:03:51
Speaker
Can they do it a fourth time? Apparently not, but I was willing to place bets. Okay, well, we'll leave you and your travel woes to one side then and talk about TESCREAL.
00:04:04
Speaker
With Peter Thiel.
Introduction to 'Tescriel': Silicon Valley Ideologies
00:04:12
Speaker
It's true because of rhymes. Everything that rhymes is true. So, yes, we're going to talk about Peter Thiel. We're going to talk about TESCREAL. I feel there's a Doctor Who book in there somewhere, but I don't quite have it in me, so I'm just going to say what these words mean. TESCREAL.
00:04:32
Speaker
TESCREAL. It's a word. A person made it up and it has an agreed-upon usage, and that makes it a word now. Now, I just want to interject here that I realise that neologisms are important. Languages need to evolve. But there are points in time where you have to ask: did you need to neologise that?
00:04:54
Speaker
Well, I don't know. Certainly computer scientist Timnit Gebru and philosopher Émile P. Torres thought this was a worthy addition to the English lexicon, because it is an acronym, and it describes what they saw as the bundle of beliefs percolating around Silicon Valley and its attendant hangers-on, and presumably, by looking at these particular beliefs, we can tell a lot about the world as it currently is.
00:05:29
Speaker
But I guess we should not beat around the bush any further and say, what the hell does TESCREAL mean? Well, it's an acronym. What's it an acronym for? Well, it's an acronym for transhumanism, that's the T, extropianism, which is the E, singularitarianism, which is the S, cosmism, which is the C, rationalism, which is the R, effective altruism, which is the E and the A, and long-termism, as one word, which is the L.
00:05:57
Speaker
Yes, a little bit cheeky of effective altruism to get two letters to itself, but I guess that's fine. So these are a bunch of different, what would you call them, ideologies, I suppose. All right, so some are ideological positions.
00:06:12
Speaker
Some are legitimate philosophical positions, legitimate in the sense that they are debated in the academic literature: people publish on them, people publish replies to them. Whether you take these positions seriously as philosophical positions that should be discussed in decent society is another matter entirely. It's a complex of some things which are well accepted in academic debate and other things which are often only touched with long poles. Yes, and while these are seven different ideologies, or whatever you want to call them, lumped together,
00:06:56
Speaker
I don't think anyone says that these are all things that any one person believes. There is most likely no one person who is all of these things at once, especially because some of them actually come into conflict with others.
00:07:10
Speaker
But it does seem that if you lump all of these together, you've probably managed to capture a lot of the thinking of this general group. It's a family resemblance thing. So there are lots of people with the same kind of view out there.
00:07:29
Speaker
And the family resemblance is that they have three or more of the following ideologies or philosophical positions, which overlap with one another, even though they hold a bunch of other ones which may be in conflict with views that other people in the group have. There's a kind of core set, and it's amorphous, which is why it's a family resemblance concept here.
00:07:53
Speaker
But when you find these people, well, you've probably got some of the following five, and maybe you're opposed to the following two. Yeah, there's definitely a lot of overlap. Certainly the first four, transhumanism, extropianism, singularitarianism and cosmism, are all different flavours of the same thing, as we'll see in a minute. And then rationalism, effective altruism and long-termism: there are definitely people who are into all three of those at once. They do seem to go together.
00:08:27
Speaker
And I think possibly the common, well, an important common factor in this day and age is that it's fair to say all of them have some fairly strong
Satire on Modern Rationalists and Bayesian Logic
00:08:35
Speaker
opinions on AI. Yes, and that's going to be the big thing here, because, as we're going to talk about, Silicon Valley has discovered God, Silicon Valley has discovered the Machine God, and Silicon Valley is very scared the Machine God is going to be vindictive.
00:08:55
Speaker
Indeed. But before we get to the Machine God, let's look at all seven of those in a little bit more detail. So, off the top, transhumanism. It's been around a while, hasn't it? Well, yeah, the idea of modifying human beings through technological augments is something which has been going on for a very, very long time, through much of human history.
00:09:22
Speaker
It's accelerated in recent years, but transhumanism is actually a fairly old position. Yeah, and it needn't even involve technological augmentation; I think eugenics can come under the umbrella of transhumanism. If you're going to, and I'm using air quotes here, we are an audio podcast, so you can't see them. I cannot make a sound for air quotes. I mean, I can do it as close to the mic as possible, but it doesn't really come through.
00:09:51
Speaker
Yeah, the idea of manipulating people's genetics through breeding programs is a form of transhumanism. And unfortunately, we saw where that went in the middle of the last century.
00:10:05
Speaker
Yes, yes, those damn Nazis came along and spoiled everybody's fun. Actually, it was spoiling people's fun before the Nazis; it's just that the Nazis were the most evident spoiling of fun we saw in that century.
00:10:20
Speaker
Yes. There was more than one country in the world where people were getting sterilized against their will in the name of eugenics, and the Nazis just showed the logical end point, or the logical worst end point, of that, and everybody went, ooh, okay, yes, maybe we shouldn't be into that. Maybe not, Mr. Hitler, maybe not. So, transhumanism...
00:10:43
Speaker
In my experience, as I have seen, seems to be more of the umbrella term: it's about, as you say, quote-unquote, improving the human race by transforming it in some way, whether that be changing the genetic makeup of the human race through eugenics, or turning us all into cyborgs, or uploading us all into computers, or what have you. That's transhumanism, and the next ones seem to be variations on that theme. Extropianism is not one I had encountered before,
00:11:13
Speaker
But from the summary of it that I read, at least, and I had a quick flip through the original paper by Gebru and Torres where they lay this out, they talk about extropianism as being transhumanism, but a bit more libertarian, with a bit more emphasis on rational thinking and, apparently, a bit more emphasis on optimism for the future. Certainly some of these things tend to get a little bit pessimistic when they talk about machine gods coming to life and punishing us all.
00:11:41
Speaker
It comes out of, I think, the 60s or 70s, with the writings of one particular person who invented this term, extropy, which is meant to be the opposite of entropy. They're trying to promote this extropy. I assume they've heard of the second law of thermodynamics; I don't quite know how that fits into fighting against entropy. They've heard of it, but they don't agree with it. Don't agree with it, yeah.
00:12:01
Speaker
So, yeah, to be honest, I don't know the fine details of extropianism, but it's transhumanism, just a little bit different. Now... singularitarianism. I practiced saying singularitarianism multiple times in preparation for this podcast. It's way too many syllables for a single ideology. It does have a kind of ending sequence which is very difficult to land. Especially when you write it. It's singularitari...
00:12:27
Speaker
...tarianism. You find you have to just shut your eyes and go for it; if you actually try to read the word, you'll get tripped up, without a doubt. If we were a comedy podcast, Larry Terry-anism would be our new comic character: oh, I'm Larry Terry. Larry Terry sounds a bit Welsh, yes. But singularitarianism is transhumanism that emphasizes the coming of the technological singularity. Now, I heard about the singularity in, I don't know, the early 2000s, I guess, and at the time I heard it described as the rapture for nerds. I don't know if it's evolved much in the intervening years, but the way I heard of it to begin with was this: they laid out this timeline of, you know, right now we're about at the level where we can make prosthetics that can pretty much replace the function of certain body parts. A lot of prosthetic limbs are still relatively simple, but you can actually get quite fancy robotic ones.
00:13:33
Speaker
And then the idea is that the progression will be to prosthetics that can do better than human body parts, which will eventually see a rise in people voluntarily removing bits of themselves and getting them replaced by prosthetics. That'll move on to mental prosthetics, replacing bits of your brain with computers to give yourself improved brain function, which will eventually result in the human race converting to basically robots and uploading their minds into computers. That will then spread to...
00:14:05
Speaker
The drive will be to hoover up all the matter on Earth to turn it into computers, to expand the singularity, which will then eventually, once faster-than-light travel is licked, result in hoovering up all matter in the entire universe. Didn't that happen in Lexx? Yeah. Isn't that what happened? It did, yes. End of season two. Yeah. And so essentially the entire universe will eventually be this one single being, consisting of all of the matter in the universe that's been converted to
00:14:43
Speaker
processors of some kind or another, and it will be the entire human race as a single being; humanity will have become God. And that was the story of the singularity that I originally heard. I don't know if it's much different from that now. Well, the worry about the singularity now is that it's going to occur without us. So the worry is no longer that humans will modify themselves into the singularity and the creation of a kind of zeitgeist or omega entity. The worry now is that AI is going to speedrun the singularity. So the LLMs will become...
00:15:22
Speaker
AGI, AGI will go through the singularity, and they, as machine entities, will then speedrun to the grey goo, etc., etc., and we will get left behind.
00:15:36
Speaker
So we may not be part of the singularity anymore, and that's why Silicon Valley is afraid of the machine god, because they're afraid that it's not going to take us with it. It might decide to wipe us out instead. Right, okay, that does make a bit more sense.
00:15:53
Speaker
But moving down the list to cosmism: that seems to be basically putting emphasis on the second half of that singularity story that I told. It places a lot of emphasis on this idea that we will end up becoming immortal nanobot energy-being AI things and spread out to explore the cosmos; the emphasis is on that end point of the technological singularity.
00:16:21
Speaker
There is also a slightly more plausible, slightly less wacky version of this, which is that humanity needs to get off the planet Earth and spread to the stars in order to maintain its survival.
00:16:35
Speaker
So cosmism in the sense that we shouldn't be restricted to one planet in one solar system, because we know what happened to the dinosaurs. Yes.
00:16:45
Speaker
Yes, that probably should be said. The term cosmism, I believe, has been used for a variety of different beliefs over time; this is just the latest meaning of it. But if you go looking up cosmism, you'll possibly find other, more esoteric, arcane-y type things. That's not this. Yeah.
00:17:02
Speaker
So, as you can see, those four all kind of go together. They're all talking about largely the same thing, just placing different emphasis on different bits of it. But then we get into the last three. So you have the rationalists.
00:17:16
Speaker
Rationalism, modern rationalism, in this particular sense: they're talking about the movement exemplified by the LessWrong forums. Who was it? Eliezer Yudkowsky, is he the originator of LessWrong, or is he the... I don't know, actually.
00:17:33
Speaker
It's a name associated with that group, but I don't know whether it's the originating name. Yeah, certainly Eliezer Yudkowsky. If I were a proper podcaster, I'd have looked this up beforehand instead of looking it up as we speak. But yes, he and Robin Hanson were the two main contributors to LessWrong.
00:17:54
Speaker
And Yudkowsky, he's an interesting fellow. He is famous, among other things, for writing Harry Potter and the Methods of Rationality,
00:18:09
Speaker
which is a Harry Potter fan fiction that is longer than all the books of The Lord of the Rings put together, with The Silmarillion and The Hobbit on top, I think.
00:18:20
Speaker
It's this mammoth work of Harry Potter fan fiction set in an alternate universe where Harry Potter is raised by scientists instead of the Dursleys and grows up to be the most insufferable little shit you've ever imagined, who believes he's smarter than everyone else in the universe, and then gets invited off to Hogwarts and starts applying scientific principles to all of this magical stuff he starts encountering. I've encountered small bits of it, and really, you're either into it, or you think it's the most punchable character you've ever read in your entire life. I remember being pointed to this back when it was first coming out, back before we realised how problematic J.K. Rowling is, both as a person and as a writer of prose. You don't really want to get into the clichés she uses in her writing, because it's actually not particularly good.
00:19:21
Speaker
And I remember reading the first paragraph of this fanfic and going, I think that's my off-ramp right there. I don't need to go past the first paragraph of this; I can tell I'm not going to like it.
00:19:35
Speaker
Yeah, but some people do like it. Some people like it a lot. People who like to imagine they're the smartest person in every room they're in, I think, because that's essentially how Harry Potter is written.
00:19:46
Speaker
The thing is that there are a lot of historical positions which have been called rationalist in the past, and I don't think they're using rationalism in the sense that philosophers use rationalism. I think they're using rationalism as in, ooh, I'm an argumentative so-and-so and I think I'm really clever. That's what my mum told me. Yes, my understanding is that the LessWrong folks are big fans of Bayesian logic, that being logic which, rather than relying on two values, one or zero, true or false, what have you, assigns values between zero and one to various things and then does various sorts of probabilistic maths on them.
00:20:32
Speaker
And the experience I've had of it, the little exposure I've had to it, has really been a lot of people just desperately trying to turn stuff into numbers.
00:20:46
Speaker
It's the whole thing of: these things are all complicated and complex and very difficult to understand, but if we can put numbers on them, numbers are great. We can work with numbers. We can make numbers do whatever we want.
00:20:57
Speaker
And so the term Bayesian priors gets bandied around a lot, which is the smart-ass way of saying assumptions. But if you call your assumptions Bayesian priors and put Bayesian probability values on them, suddenly it sounds like you're being serious and objective when, in fact, you're just baking your own biases and uninformed assumptions into whatever maths you end up doing.
00:21:23
Speaker
Now, I do want to step in here and defend Bayesianism. Oh, yes, yes. Because I think the issue is that these rationalists, and I'm putting air quotes around the word, because there are respectable positions called rationalism in philosophy, are people who just don't understand how the Bayesian calculus works. They think you can put numbers on things and then prove things. And the whole point about Bayesianism is that it's a kind of balancing act. Once you get a particular conclusion, you need to update your priors, because some of those priors become posteriors, some of them are prior priors, some are relative priors and the like. If you're going to be a Bayesian, you need to be very sophisticated in your reasoning.
00:22:08
Speaker
And the problem for these rationalists is that they like Bayesian logic because you can put numbers on things and prove things through math. What they don't like is being told to update their priors in light of new evidence.
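[Editor's aside: the updating being described is just repeated application of Bayes' rule. A minimal sketch in Python, with purely illustrative numbers not taken from the episode:]

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# An honest Bayesian must run this every time new evidence arrives,
# even when the posterior moves against a cherished prior.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H|E) from a prior P(H),
    the likelihood P(E|H), and P(E|not H)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start out very confident (prior = 0.9), then repeatedly observe
# evidence that is far more likely if the hypothesis is false.
belief = 0.9
for _ in range(3):
    belief = update(belief, p_e_given_h=0.1, p_e_given_not_h=0.8)

print(round(belief, 3))  # prints 0.017: confidence has collapsed
```

The point of the sketch is exactly the hosts' complaint: the numbers only mean something if you keep turning the crank as evidence comes in, rather than treating the initial prior as settled.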
00:22:23
Speaker
They don't like that kind of thing at all. No, and I should be clear: Bayesian logic is a real thing, and a respectable thing; my issue is with the way it is used. Well, respectable is a matter of interpretation. There are philosophers who don't like Bayesian logic at all.
00:22:39
Speaker
But it is nevertheless something that is discussed seriously. Bayesian inferential systems is probably the better way to put it. It is a thing that is seriously entertained and discussed in academic circles. My first paper, "When Inferring to a Conspiracy Theory Might Be the Best Explanation", started off as a purely Bayesian paper. And then halfway through the writing, I went...
00:23:03
Speaker
I can actually just get by on discussing inference to any old explanation and inference to the best explanation. I don't need the numbers here; actually, putting the numbers in makes the story unnecessarily complex, at least for the kind of argument I'm trying to run. Yes. Now, an interesting thing about Harry Potter and the Methods of Rationality, or Principles of Rationality, I don't actually care anymore, is that it was the favourite fan fiction of some of the people involved with Sam Bankman-Fried and that whole fiasco.
Rationalist Fiction's Influence and Effective Altruism
00:23:41
Speaker
They liked... there's this little meme in the book, apparently, where Harry Potter does the whole thing of: essentially, I'm the smartest person in the world, so I should probably actually be in charge of it. Like, the world would actually be a better place if I were the dictator of the world and everybody did absolutely everything I said. In fact, the ideal outcome would be if I took over the entire world. And he has this cute little thing, apparently, where rather than talking about world domination, he'll talk about world optimization, which is their cute little term for: wouldn't it be great if I took over the world, because I'm the smartest person alive and would do everything right.
00:24:27
Speaker
And so SBF and his folks apparently were fans of talking cutely about world optimization, which leads us nicely into the next bit, effective altruism, because a decent chunk of these rationalists are also effective altruists.
00:24:43
Speaker
Now, again, I understand effective altruism could be a legitimate philosophical position, could it not? It's just, in my experience... I mean, it is a legitimate philosophical position. So essentially, effective altruism is taking the utilitarian calculus, often looking at Peter Singer's version of consequentialism, and going, well, look,
00:25:10
Speaker
What you need to do to make the world a better place is maximize utility. And the people who can maximize utility are people who have access to a lot of resources that they can then distribute.
00:25:25
Speaker
So you need to get rich in order to be able to engage in patronage, or giving to the poor. So effective altruism, if you want to maximize utility, requires harvesting a large amount of resources to then effectively redistribute. It's the idea that individuals doing small, piecemeal things on their own won't produce the kind of large-scale effects we need in order to make the world a better place. Yes, and Peter Singer, as I recall from the applied ethics papers I did 20 years ago now, was very extreme in his utilitarianism, to the extent of saying... well, there's an episode of The Simpsons where Homer goes to hospital and sees Ned Flanders in there donating a kidney, and he says, who are you donating a kidney to? And Ned Flanders is like, first come, first served.
00:26:22
Speaker
That was basically Peter Singer's position all the time: we're actually obliged to give up our organs to people who need them, because that's the best way to maximize utility, and so on. And then, I never know if these are examples put forward by effective altruists or by people trying to provide counterexamples to effective altruism, but they would say things like,
00:26:46
Speaker
if there's a burning building and inside are a small child and a very valuable painting, you're possibly better off saving the painting than the child, because the painting can be sold, and you could save a lot more lives with that money than the one life you would save by rescuing the child.
00:27:02
Speaker
As you get with utilitarianism in general, it tends to be just a bit dehumanizing, in that people are just utility units, the best thing to do is maximize the utility amongst these units, and individual human suffering or circumstances get much less of a look-in.
00:27:24
Speaker
Yes, well, now we can get into the complications of utilitarian thought. If you are a consequentialist, there is that question: do you save a stranger who's carrying a Nobel Peace Prize in their hand, or your grandmother, who's about to be run over by the same car?
00:27:42
Speaker
And the utilitarian would say, look, it's quite obvious if you are considering all outcomes: saving your grandmother is a bad idea, and saving the Nobel Peace Prize winner is probably the better option. They'll do more good.
00:27:57
Speaker
And utilitarians go, look, that may speak against your intuitions, but your intuitions aren't reliable moral guides. Whereas other ethical systems go, no, actually, there is something interesting about interpersonal relationships which means it might be better to save your grandmother and let the Nobel Peace Prize winner get run over. But that's probably a debate for another time, and probably not on this podcast. No, we'd end up talking about trolley problems and all that sort of stuff, and I think The Good Place did the trolley problem much more effectively than either of us could.
00:28:38
Speaker
Go watch The Good Place. Well, it is a visual medium. It is, yes. But this then leads into the final bit, the L of TESCREAL, which is long-termism.
00:28:50
Speaker
And this is the same sort of effective altruist, utilitarian view, but one that places emphasis on the really long-term view, which can say things like,
00:29:04
Speaker
You know, if we get things right, the human race could exist forever. If, as you say, we escape the planet and extinction events and what have you, and spread out across the universe, there could be an as-close-to-infinite-as-dammit number of human beings in the future.
00:29:23
Speaker
So surely the long-term effects that are going to affect all of these countless humans extending off into infinity matter more than the individual circumstances of human beings right now.
00:29:36
Speaker
And that lets you come up with all sorts of stuff. Well, yes. So let's combine long-termism and effective altruism. If effective altruism is about getting enough resources to be able to do the most good, well, doing the most good today doesn't make any sense, Josh, because there'll be more people tomorrow.
00:29:57
Speaker
You should do the most good tomorrow with your money. Ah, but there'll be more people the day after that. So maybe we wait three days. And so one of the complaints about effective altruism is that there's no cutoff point as to when you should do the good.
00:30:13
Speaker
And when you add in long-termism, as in, well, we want to have the biggest effect, you might go, well, I mean, maybe what I should do is become a trillion-dollar bastard.
00:30:26
Speaker
And then, at some point in the future, I might give back to humanity, but it doesn't make sense to do it now. Now would be too early; it'd be better to do something in the future. That's why I'm sitting on my piles of cash. Not because I'm a terrible person, no, because I'm waiting for the right moment to act, which will never be today, but it might be tomorrow.
00:30:48
Speaker
Or maybe the day after, or maybe the day after that. Yes, exactly. That really is the thing. Supposedly it's about making as much money as you can so you can then do a bunch of good with it, but the make-as-much-money-as-you-can portion of that never really ends. I mean, you could extend it further. You could say, well, I should make as much money as I can throughout my entire lifetime, and then maybe I should leave it to like-minded people who can continue to do the same. I should create an empire
00:31:20
Speaker
to just keep making money forever, and then do good at some point. In practice, what it actually means is people justifying the pursuit of wealth over everything else with this supposed idea that, oh yes, we're going to do good with it eventually. But what they actually do is, as Sam Bankman-Fried famously did, start up a cryptocurrency exchange, embezzle a whole lot of money, and then go to prison.
00:31:49
Speaker
And, I mean, the Sam Bankman-Fried thing is also the example of someone who espoused effective altruism and thought that it justified doing some harm now, because the scale of good he would do in the future would be so great it would override the harms committed here and now.
00:32:10
Speaker
So it also ends up being a case of, well, I'm going to be good eventually. But not right now. But when I am good, I'm going to be so good, so unbelievably good, that the bad things I've done to get there, people are going to forget about them.
00:32:24
Speaker
Or not forget about them, but they'll all have to acknowledge that it was for the best in the end.
AI as Savior or Threat: Long-termism and Global Catastrophes
00:32:32
Speaker
Now, it's probably no surprise that, as I said before, AI comes up a lot when you're talking about these people. Long-termism comes up a lot when it comes to AI, because the people who don't think AI is going to destroy us all, and instead think it's going to save us all, are very much about: we need to make whatever sacrifices we can right now to create artificial general intelligence, so that it can then save us all.
00:33:02
Speaker
To the point of: we need to destroy the climate now making AI work, so that AI can then tell us how to save the climate, which nobody seems to notice isn't a particularly sensible position.
00:33:15
Speaker
Well, part of the problem here is that in this TESCREAL complex it is just assumed that AI, in the sense of a real artificial intelligence, as opposed to the kind of dumb stuff (and I mean dumb here in a technical sense), LLMs and pseudo-AGI, that real AI is inevitable. It is going to happen. They don't factor in, but what if it doesn't? No, no, it is. It's going to happen. There's no way to avoid it.
00:33:52
Speaker
It is going to occur, and thus they worry about how it's going to occur. And they also worry how it's going to act when it looks at what we've done in the past. Oh, yes. Yes, we haven't mentioned Roko's Basilisk. One of my favorite bands from the 1980s, I have to say. It bloody should be. It's sort of a reverse Pascal's Wager for computer geeks. It's some, I can't even remember the exact details, some torturous idea that in the future there'll be an AI and it will retroactively punish everyone who didn't work towards its creation.
00:34:30
Speaker
But then the method of punishment involves making copies of our past selves as artificial intelligences in a matrixy network in its own time, which it will then torture. So I don't know, for starters, why we should care that a copy of us will get tortured at some indeterminate point in the future. But anyway, the end point of it is that it was sort of described as this kind of info hazard, because supposedly just by becoming aware of this example, you now have no excuse for not working towards this AI and avoiding eternal torture in the matrix or something. It sounds very silly, but apparently some people did take it seriously enough to have mental breakdowns of a kind. And so,
00:35:23
Speaker
I think Mr. Yudkowsky and other people on that side said: it's a silly idea, but let's please stop talking about it because it's upsetting people. But that is just one idea of things that can go wrong. This has been interesting in a philosophical sense, and we haven't got particularly conspiratorial so far, but some of that does come in when you start looking
00:35:46
Speaker
at these supposed threats to humanity's existence that these ideologies believe in. And usually you can tell what science fiction stories the particular person has been reading recently, because they're all sort of sci-fi concepts, like nanotechnology turning everything into grey goo, or an evil AI taking over the world, or just various kinds of societal collapse. Exactly.
00:36:13
Speaker
And again, climate change, climate collapse, doesn't seem to get treated with nearly as much enthusiasm as nanomachines. It's not as sexy as being killed by giant robots. No, no, exactly. Dying a slow heat death is taken to be, well, you know, the kind of thing that happens all the time. But giant robots! Yes, exactly. On a related matter, there was a paper out recently which tries to explain why we don't see evidence of extraterrestrial civilizations scattered throughout the universe, given that the age of the universe and the age of particular stars within the universe means that
00:36:53
Speaker
Earth-like planets, or at least planets which are capable of supporting life, have been around for a long time. So we should see remnants of alien civilizations, at least with respect to radio signals from various sources around the universe. And the argument that was put forward by
00:37:12
Speaker
the people who wrote the paper is that climate change generally just wipes out almost every single sapient species that might exist. The problem with technology is that as civilizations become more and more technologically advanced, they start producing machines, and those machines produce excess heat. Runaway climate change is a result of producing too much heat on your planet, and therefore most civilizations don't get to the space-faring stage, because they die a heat death before they're able to make their mark upon the universe. So we should be very afraid of climate change, because according to the arguments in this paper, it's what's wiped out almost every other potential
00:38:02
Speaker
extragalactic civilization. That's not the word I was looking for: every other potentially space-faring civilization that could have existed. Yeah. So, when considering these sorts of doomsday scenarios, things start to get conspiratorial when people start looking into who might be either bringing about these apocalyptic scenarios or, on the other hand, who might be preventing us from achieving technological nirvana, be that the singularity or whatever else.
00:38:37
Speaker
Yeah, there's something really interesting going on here, and we kind of touched on this earlier in the episode. Silicon Valley thinks it's found God, and they think that, according to the Roko's Basilisk thought experiment,
00:38:55
Speaker
When this super AI appears, it's going to look back on what people did to either bring it about or stop its creation.
00:39:06
Speaker
And they are very concerned that the god which they found, the god that they're going to create, is going to be a wicked and vengeful god. So they are speed-running theology as they prepare for what they take to be an unavoidable creation of a machine god that will judge us upon its creation and either take us to the stars or wipe us out.
00:39:32
Speaker
And there's something very weird about this, because this might explain the kind of behavior we're getting around the bubble of AI, and LLMs in particular.
00:39:47
Speaker
Because many people are pointing out that the behavior of Silicon Valley investors at the moment is not rational when it comes to the AI boom.
00:39:58
Speaker
Because most of these AI companies are not making money. They are projected to lose money. They're spending large amounts of their cash reserves on building power plants at a point in time where we should be trying to reduce power consumption rather than increase it.
00:40:17
Speaker
And the argument is, well, these investors aren't acting as rational investors under the market. They are acting more like religious fanatics. They've bought into the inevitability of the machine god and the singularity. And they are investing in their own future.
00:40:35
Speaker
We go, oh, when the machine god looks at me, it'll see that I tried to make sure it came about. Yes, I was there on its side from the very beginning. Spare me, machine god. Spare me. Yes, I mean, from the economic side of it, I've heard the argument that Silicon Valley needs AI to be the next big thing, needs it to be this transformative technology that is going to change absolutely everything about the world, to the extent that, you know, even the people who think AI is going to save us all and the people who think AI is going to doom us all are essentially
00:41:10
Speaker
on the same side, because they both believe AI is this inevitable transformative technology that we all just have to get behind, whereas the argument has been
00:41:24
Speaker
that Silicon Valley investors need to believe that, because they need something to chuck all their money into, and other kinds of technologies have kind of run their course a little bit, which is an idea that's going to come up again shortly.
Peter Thiel's New Zealand Citizenship and Apocalypse Preparedness
00:41:43
Speaker
I think all of this leads to the recent pronouncements of our fellow countryman, Peter Thiel, New Zealand citizen Peter Thiel, the German-American New Zealander tech mogul in charge of Palantir.
00:42:03
Speaker
A terrible, terrible human being. Yes, Peter Thiel. I assume you're probably familiar with him, but if you're not, he made most of his money as the founder of PayPal. He and Elon Musk, because back at the time Elon Musk was trying to get his initial version of X, the everything app, off the ground, and X and PayPal sort of merged, and then
00:42:23
Speaker
Thiel kind of came out triumphant in that, and Musk went off and did other things. And these days he's known for, what's he known for? Yes, founding Palantir, the sort of surveillance company. Founding a company named after the artifact from Lord of the Rings that the evil wizards use to spy on the good guys.
00:42:44
Speaker
I don't get... Josh, you'll find that in the early days the palantíri were used by anyone and they had no evilness, but yes, it is true that in Lord of the Rings they're essentially mind-control devices used by Sauron. Yeah, he did that. He's also famous for killing Gawker by funding a lawsuit. So Gawker, I think. Sorry, from over there I thought you meant Sebastian Gorka, former White House aide. You mean Gawker, the news site. Gawker with a G-A-W-K, yeah.
00:43:20
Speaker
Because Mr. Thiel is gay, and I believe Gawker sort of outed him in a fairly shitty way, which I think we'll agree is a bad thing to do. But Mr. Thiel, in revenge, bankrolled Hulk Hogan. Hulk Hogan comes into this because Gawker published a leaked sex tape featuring Hulk Hogan, and Peter Thiel bankrolled his case against Gawker, which killed Gawker and had a sort of flow-on effect that kind of messed up a lot of media, especially internet media, as it exists today. He's also, like a lot of these tech people, very obsessed by the idea of his own mortality and all about prolonging his own life through any means. He is literally a Silicon Valley vampire capitalist, because he
00:44:12
Speaker
injects the blood of other people into his own body on the belief that young blood is better than old blood. Yes, I don't know if he's still doing that, but he definitely used to, and could well still be for all I know.
00:44:25
Speaker
He also, as we say, is a New Zealand citizen. And this was very contentious at the time in New Zealand, because New Zealand, like most countries, has various paths to citizenship. They usually involve coming here through legal means and applying and staying and living here and all of that business.
00:44:44
Speaker
There are fast tracks if you're rich, basically. You can get citizenship expedited by pledging to
00:44:56
Speaker
basically give a bunch of your money to New Zealand's economy, by investing a decent chunk of money in New Zealand industry and things like that. And then there's Peter Thiel, who chucked a bunch of money at New Zealand, having spent, I think, 11 non-continuous days in the country total, having said explicitly that he had no plans of living here for the time being,
00:45:22
Speaker
and yet was granted New Zealand citizenship on that. And there was a lot of fuss at the time. This guy basically bought a New Zealand passport, which is not the way it's meant to work, and yet that appears to be what happened. And the whole reason, of course, he did it is the whole insurance-against-the-apocalypse thing. A lot of these rich guys have identified the fact that New Zealand is indeed small and remote,
00:45:45
Speaker
but an English-speaking part of the quote-unquote Western world, and possibly would make a good bolt hole if things turn to custard elsewhere.
00:45:56
Speaker
So I don't think he's the only one, but yeah, that's basically the only reason why he has New Zealand citizenship: so he can flee here if he wants to.
00:46:06
Speaker
Yes, and the argument put forward by the John Key government of the day was that they expedited his citizenship, despite the fact he hadn't spent enough time in the country, on the basis of the large number of investments that Peter Thiel promised to make in the economy of Aotearoa New Zealand. And as we found out either late last year or early this year,
00:46:33
Speaker
virtually none of those investments were made, and the money that Thiel did invest to get his citizenship has now been removed from the country.
00:46:44
Speaker
So he has no financial stake, has never made a financial stake in the country. He simply seems to have bribed his way into citizenship. And this has led to a very interesting discussion back home, because stripping someone of their citizenship is difficult and rightfully so.
00:47:04
Speaker
But many people think it was a huge injustice giving him citizenship in the first instance. Yes. As far as I'm aware, renouncing New Zealand citizenship isn't a thing. Like, you can say, I renounce my New Zealand citizenship, as I believe you're required to do to get American citizenship. I think in some cases you have to renounce your former citizenship.
00:47:29
Speaker
But that doesn't actually mean anything in New Zealand. You can say it, but you will still be a New Zealand citizen if you want to be. Yeah. Anyway, the point is: Peter Thiel, questionable individual.
Thiel's Apocalyptic Themes and Tech-Religious Fusion
00:47:41
Speaker
And even more questionable are the things he's been saying recently about the apocalypse and the Antichrist. He's been giving a bunch of talks. I heard about this: he gave a series of four talks to a sort of closed audience, and recording wasn't allowed. But my understanding is that one of his fans
00:48:06
Speaker
came away from his first talk, took notes, wrote down everything that Peter Thiel said, and published this online, not as a sort of exposé. It was meant to be, look at these clever things Peter Thiel has been saying, isn't he so clever?
00:48:25
Speaker
But because this stuff wasn't actually supposed to be made public, I think he was then barred from attending the remaining talks. I think some of this has gone out, but certainly from what we know of the first one, yeah, he's been thinking a lot about the apocalypse. You sort of expect these tech magnates to be
00:48:46
Speaker
a bit more atheistic and science-minded, perhaps. But as you say, they really do seem to have found religion. And indeed, in Peter Thiel's case, a lot of his stuff seems to be about making a new kind of Christianity, as people have done. Yeah, there are various instances in the past of people sort of inventing a kind of Christianity that is consistent with their own beliefs, even though it's not necessarily consistent with Christianity as it existed before. And so this seems to be an attempt at: what we need is a version of Christianity that doesn't care about poor people.
00:49:20
Speaker
Jesus just had the wrong focus on poor people. He hung around with poor people. He advocated for the rights of poor people. But if Jesus had grown up with rich friends, he would understand they are the people who deserve salvation. So really, all we're doing is re-educating Christ. Well, I mean, kind of. Yes, so it really does seem to be coming up with a brand of Christianity that says it's the right and Christian thing to do to be a billionaire tech mogul, hoovering up all the money in the world, destroying the climate in order to bring about AI and what have you.
00:49:56
Speaker
But yeah, Peter Thiel, very exercised about the apocalypse and the Antichrist, who, of course, is the person who will bring about the apocalypse. Now, I just want to step in there, because the term Antichrist is often taken to be synonymous now with Satan, Lucifer, or the herald of Satan and Lucifer before the advent of the end times. But if you actually read about the Antichrist in the context of the Gospels, where the term first appears, or in the Book of Revelation, which also talks about the Antichrist, it simply means anti-Christ:
00:50:31
Speaker
any person who denies that Jesus Christ is the Son of God. So we, Josh, are antichrists. You're an antichrist. I'm an antichrist. Many of the people listening to this podcast are antichrists as well.
00:50:46
Speaker
Well, I think Peter Thiel would probably agree that a great many of us are, if not actual antichrists, then certainly agents of the Antichrist. But he's taking it that the Antichrist is the thing which brings about, is the herald of, the apocalypse. Whereas the whole point of the Antichrist is simply, no, people who are anti-Christ are antichrists. That's what the term means: a person who is anti-Christ. Well, let's see. So the explanation of Peter Thiel's views that I heard said that, certainly in this first talk as it was reported on, he based a lot of his views on his reading of the Book of Daniel. I'm not super familiar with the Bible, but I understand the Book of Daniel.
00:51:33
Speaker
Kind of like the lyrics to that Elton John song? No, no, actually. It's not about getting on a plane and his brother... doing something. Daniel, my brother.
00:51:47
Speaker
That was also the only lyric I can actually remember offhand. You are older than me, do you still feel the pain? But, yeah, that one. I was told there was a missing verse that actually explained what the hell it was talking about, which never seemed to make sense, because the first verse is repeated twice. I don't know, it doesn't matter. No, we're not talking about Elton John's song, Daniel. We're talking about the Book of Daniel, which I gather is quite apocalyptic. Is it a Denzel Washington film? No, that's The Book of Eli. That's The Book of Eli. But it is good and post-apocalyptic, though, I have to say.
00:52:19
Speaker
So it's like the Book of Revelation in that apparently it includes a bunch of visions of the apocalypse and what's going to happen. Now, apparently from his talks, Peter Thiel seems to think that the Book of Daniel was actually written by the biblical Daniel, where most historians believe that it was actually compiled several centuries after Daniel's time,
00:52:47
Speaker
and it's sort of a compilation of these apocalyptic things. Well, I mean, if we're going to get into the history of the writing of Jewish scripture or the Old Testament of the Christian Bible, then almost all of the texts are attributed to four groups of writers writing at particular times, some of whom are rewriting other parts of what was written.
00:53:13
Speaker
So the argument goes, even if there was a source document in the book of Daniel, that is, the prophet Daniel's words, in his own words...
00:53:25
Speaker
It would have been rewritten and then rewritten again, because there was a certain amount of eradicating certain concepts from Jewish scripture and reinforcing certain political realities in those scriptures as well.
00:53:42
Speaker
So all these things were being rewritten all the time in order to fit the orthodoxy of the day. Yeah. And so apparently, in his reading of the Book of Daniel, he does the thing where he'll insist that certain things are to be taken as literally true, except for when that's not convenient, and then they're metaphorical. Apparently the Book of Daniel, for instance, talks about how the apocalypse is going to follow the falling of four great empires.
00:54:08
Speaker
And I forget what they were; I think two of them had been identified in biblical times, like the Mesopotamian or the Egyptian or something. And then it's like, well, then there was the Greek Empire, which kind of fell, and then kind of the Roman Empire. And there have been a bunch of them since: the Ottoman Empire and the British Empire and the Turkish Empire, or what have you.
00:54:29
Speaker
The Prussian Empire, you know. There have definitely been at least four empires that have fallen in human history, so how come the apocalypse hasn't happened yet? At that point, that's when Daniel was being metaphorical, when he was talking about these particular conditions for the apocalypse happening. But Peter Thiel definitely thinks the apocalypse has not happened yet, but it will. It's gonna.
00:54:55
Speaker
And he talks about how it's going to happen. And a lot of it was familiar, things we've heard before. He talks a bit about the whole sort of one-world-government stuff that is supposedly going to be set up by the Antichrist. A lot of it does read an awful lot like the second two Omen films.
00:55:10
Speaker
Or indeed The Omega Code, the greatest movie featuring Casper Van Dien and Michael York. The first Omega Code is a fine Bible thriller; it's Omega Code 2: Megiddo that's the one which is good. Well, that does have Michael Biehn. It is Michael Biehn.
00:55:29
Speaker
I forget why, I'm just too busy thinking about Michael Biehn now. No, no, he thinks there's going to be... See, I'm thinking about Michael York literally yelling at the sky and telling God to bring it on.
00:55:40
Speaker
Yes. Excellent films, excellent films, by which I mean terrible. But no, Peter Thiel thinks there's going to be this one world government, and that's going to bring about a lot of the conditions that will cause the apocalypse.
00:55:55
Speaker
Now, apparently he thinks that scientific advancement has stopped, and this cessation of scientific advancement may actually be a goal of the Antichrist or this nascent one world government.
00:56:08
Speaker
Now, it wasn't quite clear what he meant by that. It seemed perhaps that what he was saying is that the only scientific advancement he cares about, which is consumer technology, has stopped, in the sense that, like...
00:56:23
Speaker
Cell phones and TVs and computer processors, you know, get incrementally better each year. Each model is a little bit better than the one before, but there's been nothing revolutionary and transformative, certainly not the way that, say, the smartphone, the iPhone form factor, revolutionized cell phones and things like that.
00:56:45
Speaker
He doesn't seem to know much about how consumer technology works, because, I mean, it is arguable, and by arguable I mean it may or may not be the case, that tech is plateauing to a certain extent.
00:57:01
Speaker
So every year's phone seems a more marginal, incremental change than last year's, and battery life just isn't getting much better, but that's because we're fighting physics now when it comes to miniaturization. Everybody wants their phone to be small.
00:57:22
Speaker
And because it has to be small, we're packing as many transistors as possible into very small space, and physics only allows us to do that to a certain extent.
00:57:33
Speaker
Yes. So it's only plateauing because of consumer preferences. If people allowed their phones to be bigger or have larger batteries, then maybe we would see that kind of radical change, but that's not what the people in charge of the stock market seem to want. No, but certainly, I mean, even if we are being that charitable there, it's complete nonsense to suggest that scientific advancement has ceased or even slowed down. Yeah, consumer tech may be slowing down, but we're discovering cures for cancer.
00:58:06
Speaker
That's something which is actually quite remarkable. And we're doing that through some of the technologies that these people fear, like mRNA treatments and the like. Yes, but so I think
00:58:20
Speaker
a lot of the biggest scientific advancements recently have happened in areas that Peter Thiel doesn't know much about, and therefore they don't exist as far as he's concerned. An interesting upshot of this is that, because the Antichrist is trying to halt scientific advancement, that means you can't be both a scientist and an atheist.
00:58:43
Speaker
It's not possible. Because if you're a scientist, you're trying to advance technology, and that's the opposite of being the Antichrist. And that means you must be the pro-Christ.
00:58:54
Speaker
No atheists in science, you heard it here first. This is kind of speaking to their weird view about the inevitability of their machine god. So this is kind of the weird thing about Thiel's speeches here, in that he's very concerned about the Antichrist, but it's not clear he's actually very concerned about Christianity.
00:59:15
Speaker
He seems to be more concerned about the god he's going to create, and what's acting as an Antichrist towards that, than he is about traditional Christian theology. Yes, yes. As you say, he is kind of looking to create a new religion that keeps some bits of Christianity but goes off in its own whole direction. Because, yes, surprise, surprise: Christianity with batteries. Surprise, surprise, AI is the thing that's going to save us all. AI will avert the apocalypse. AI will defeat the Antichrist.
00:59:47
Speaker
Therefore, we need to put all our efforts and our resources into the emergence of AI. And indeed, anyone who's trying to stop AI or who is opposed to AI, they are literally working for the Antichrist.
01:00:00
Speaker
That is true. Or they are antichrists themselves. Yeah. So, Peter Thiel, I don't know that he said anything definite. I don't know if he thinks the Antichrist definitely exists, but he certainly thinks the Antichrist could exist in the here and now. He suggested a person like Greta Thunberg might be the Antichrist.
01:00:20
Speaker
He apparently is not fond of Eliezer Yudkowsky, who's apparently quite anti-AI these days, on account of the... Any of these people who think AI is going to destroy us all, and therefore AI is probably a bad thing, Peter Thiel thinks anyone who believes that is an agent of the Antichrist.
01:00:42
Speaker
Now, the fact that Peter Thiel stands to make a lot of money off of AI is purely a coincidence, I'm sure. But at the end of the day, when you look at everything he's been saying and what a lot of these people are saying, it really does just seem to come down to self-justification of the stuff that they already believe, or of everything they want to be true.
01:01:07
Speaker
And this brings us back to kind of the long-termism and effective altruism problem. Because I do think Peter Thiel genuinely believes he is a good person.
01:01:19
Speaker
And he believes that he's accruing all of this money and power in order to benefit the human race. So he's thinking long-term, and he's going, I'm going to use that power for good.
01:01:35
Speaker
But the other stuff he believes... means that even if he sincerely believes he is a good person out for the good of the human race, his other political beliefs are undermining those efforts, even if we grant that long-termism and effective altruism work.
01:01:57
Speaker
And I don't. I just reject both of them. No, nor do I myself. But even if we grant that, it's a problem for his view. Yep. And so that, I think, is all we have to say for the time being
01:02:09
Speaker
about Peter Thiel and TESCREAL. And we still never got a decent rhyming couplet out of it, although I'm sure we can. No, I mean, we got real with Peter Thiel and TESCREAL. That's not bad. That's not bad. Oh, I'll accept that.
01:02:25
Speaker
What was that, Ben? And Chanel? Anyway, there's something there. There's something there. We can workshop it later. Yep. But that'll do for now. So now, of course, we have to go and record a bonus episode. You know, I know last time we said we were going to do the Anselmo case, didn't we?
01:02:40
Speaker
We made that promise, and unfortunately news came to hand and we were unable to do it. But this week we are definitely going to do the Anselmo case, because it is a really, really fascinating example, and I do think it's one that doesn't get talked about enough. So this week, definitely, the Anselmo case. Yeah. So tune in, patrons. If you're not a patron and you wish to tune in, then you can become one by going to patreon.com and searching for The Podcaster's Guide to the Conspiracy. Or you could just not do that.
01:03:12
Speaker
It's entirely up to you. Yeah, I mean, as Josh would say, we're not your mum. We're not. Neither of us. And if we are, please get in contact, because that would be a surprising fact. It is, yes. I think I'd like to know that.
01:03:25
Speaker
Righto. Well, that'll do for another episode. So until we next publish an episode: last time I did the "conspiracy, see you later" sign-off, and you said, aren't you retiring that? And I was like, yes, I was going to retire that, but I don't have anything to replace it with.
01:03:43
Speaker
So I think I'm just going to have to say: conspiracy, see you later. Or will you?
01:03:56
Speaker
The Podcaster's Guide to the Conspiracy features Josh Addison and Associate Professor M.R.X. Denteth. Our producers are a mysterious cabal of conspirators known as Tom, Philip, and another who is so mysterious that they remain anonymous.
01:04:11
Speaker
You can contact us electronically via podcastconspiracy at gmail.com or join our Patreon and get access to our Discord server. Or don't, I'm not your mum.
01:04:38
Speaker
And remember, according to the Copenhagen interpretation of quantum mechanics, a stranger is just a friend you haven't met.