Introduction and Mishap
00:00:02
Speaker
Oh, we are live. Hey guys. Hi. I, um, went to go start the show and I realized I forgot to switch the intro. So I took the intro off real quick, which made us skip right past the intro.
Morning Routines and Holidays
00:00:18
Speaker
So, which is unfortunate. That will happen sometimes, but sometimes I can do it.
00:00:38
Speaker
The shortest pre-show ever. It put me in the mood. I'm better now. I got a little bit of background mood music going on, if you guys can hear that.
Introducing 'Trolley Problems'
00:00:51
Speaker
Michael can't. He's the only one that can't. Oh, yeah. I'm sorry. It is a slow, slow-roll morning for me. Day after a holiday.
00:01:04
Speaker
I didn't even celebrate. I actually went to bed fairly early after work. And then I'm trying to get back in the habit of sleeping in to like 10, 11. I know that sounds lazy, but I like sleeping in.
00:01:20
Speaker
But I think my sleep schedule is all seasonal. Anyway, I woke up way too early this morning. Took an edible, had some melatonin tea because I totally forgot what morning it was. I was spacey.
00:01:32
Speaker
My alarm goes off at 9 and my brain is like, oh no, function damage. Time to go, man.
Clones on the Track Debate
00:01:40
Speaker
Yeah, yeah, I didn't even get breakfast in this morning. So when I called Michael, he's sitting there stuffing his face. I'm like, I'm fucking jealous.
00:01:50
Speaker
Yeah, for certain. Is that where your sister-mommy rejects you and your cousin-brother's offer to run a train on her? That could be a trolley problem, I guess. It's definitely a problem.
00:02:08
Speaker
Oh my. A couple more. I like this one. Would you pull the switch to save your dog or do nothing and let your boss die?
00:02:20
Speaker
Sorry, Sammy. I love my dog more.
00:02:29
Speaker
What about you, Michael? I don't have a boss. Oh. In which case, no go. Sorry, Fido. All right.
00:02:44
Speaker
Good morning. I know. It's like a slow roll. I told you. I told you. But, you know, we can start getting right into it. What are we doing this morning? Trolley Problems, man. Oh, yeah. Trolley Tuesdays for a little while.
00:03:01
Speaker
Yeah. Until it stops being fun. And I don't think that's going to happen.
00:03:06
Speaker
Man, I brought up the website this morning and it's like, do you wish to continue or start all over? I was like, I'll continue. So let's just go ahead and kick in with level 16, Clones.
00:03:18
Speaker
Oh, no. A trolley is barreling towards five identical clones of you.
Mystery Box and Statistical Outcomes
00:03:23
Speaker
You can pull the lever to divert it to the other track, sacrificing yourself...
00:03:31
Speaker
Yeah, yourself instead. What do you do?
00:03:36
Speaker
I'm going to kill the clones, man. Fuck them. Five, I got the clones of you. Yeah, bye, clones.
00:03:47
Speaker
Pull it. So you would pull the... Wait, no, you would do nothing. The world cannot handle five of me. That's the way I'm thinking. Like, see, 89% of the people agree; there's 11% that would. Man, I bet they're evil too. They're like, my clones can take on the rest of the world for me.
00:04:07
Speaker
And that is a certain thing. I mean, if you think about it, like if they're your clones, they're in essence you. Man, I don't know.
00:04:19
Speaker
I mean, yeah, I only want to be the only one. There can be only one, Josh. There can be only one.
00:04:29
Speaker
Anytime someone in public says, you look just like so-and-so, I'm like, oh, well, I assure you, if there's a twin of me out there, I'm the evil one. I get that a lot, dude. I got one of those faces; there must be a lot of doppelgangers out there, because I get that a lot. Yeah.
00:04:45
Speaker
Britney's made a comment about somebody she knows that looks sort of like me. A customer who comes through the drive-thru at the liquor store said that recently. I get that a lot. So, number 17.
00:05:00
Speaker
Oh, no. Oh, no. Use your voice. You got to read all of them. I cannot read these this morning.
00:05:12
Speaker
I know. I know. I know you got a smaller screen than I do, and that's okay. Oh, no. A trolley is heading towards a mystery box with a 50% chance of containing two people. You can pull the lever to divert it to the other track, hitting a mystery box with a 10% chance of containing 10 people
Sentient Robots vs Humans
00:05:31
Speaker
instead. What do you do?
00:05:32
Speaker
Man. See, I'm not a statistics person. I'm hitting the 10% box. That's what I was thinking too. It seems like you have less of a chance. Yeah.
00:05:45
Speaker
Yeah. Yeah. So yeah, I was thinking, pull the lever.
00:05:54
Speaker
Ooh, we're in the majority on that one. I would hope so. This one really doesn't make sense to me. I mean, it does. I guess if I was a stats person, I'd be all giggly about it.
00:06:09
Speaker
I'm really not. I'm really not. It's too early to math for me. Maybe.
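For the curious, the expected-value arithmetic the hosts wave off here is quick to check. The Python sketch below assumes only the odds and headcounts as read on air (a 50% chance of two people versus a 10% chance of ten); both options average out to the same one death, so the split is really about risk tolerance rather than the math.

    # Level 17, "mystery box": expected deaths under each choice,
    # using the odds exactly as read on air.
    p_stay, n_stay = 0.50, 2    # do nothing: 50% chance the box holds two people
    p_pull, n_pull = 0.10, 10   # pull the lever: 10% chance the box holds ten people

    ev_stay = p_stay * n_stay   # 1.0 expected death
    ev_pull = p_pull * n_pull   # 1.0 expected death

    print(f"do nothing: {ev_stay} expected deaths")
    print(f"pull lever: {ev_pull} expected deaths")
    # Identical averages; pulling the lever just trades a likelier small loss
    # for a less likely large one, which is the call the hosts end up making.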
00:06:21
Speaker
Number 18, I Am Robot. Oh my, oh no. A trolley is heading toward five sentient robots. You can pull the lever to divert it to the other track, killing one human
CO2 Emissions and Environmental Impact
00:06:34
Speaker
instead. What do you do?
00:06:37
Speaker
Sentient robots. Yeah, but they're still robots.
00:06:43
Speaker
Man, ooh, this is a tough one for me. Sentience doesn't necessarily equal life.
00:06:50
Speaker
And if, no, that's, I mean, they're downloaded somewhere, you know? What is the official definition of sentient? I wonder if that includes a conscience. You know what I mean? You're looking that up? Yeah, good, my phone's out of reach. Okay: sentience, able to perceive or feel things. See, that would come with the conscience.
00:07:22
Speaker
I would have to... I would pull the lever to save the robots.
00:07:31
Speaker
God damn it, Kat. I'm on team people. You're on team people? At that point, it's just natural selection. If it's unknown, are you really making a choice? You talking about the 10%, 50% one? I agree. Yeah, yeah, yeah. If you don't know, it's like Schrödinger. I mean, at that point, it can be 50-50.
00:07:49
Speaker
Or 60-40 or 70-30. Schrödinger's people, right? Yeah, yeah. Schrödinger's switch. Right now we got Schrödinger's president. Oh, wrong screen.
00:08:04
Speaker
So... got the wrong screen again. Let's close that one. I don't need that one. So... Michael said do nothing. And...
00:08:15
Speaker
You are with the majority of the people. I'm not surprised. I figured I would be the minority choice on that. Sentience is important to me. It doesn't necessarily mean life, but that doesn't mean a lack of self-value or self-agency.
00:08:37
Speaker
I think that's important. But I know you're on team human. Team people, that's right. Team people. Those robots could be the evolution of humans.
00:08:49
Speaker
I guess for better judgment, I'm on team people.
00:08:56
Speaker
Most problems in my life are caused by people, but I'm on team people. Figure that. Never has a robot caused you any real problems.
Reincarnation and Philosophy
00:09:04
Speaker
That's right. All right, let's go to level 19.
00:09:10
Speaker
All right, this one's called Economic Damage. Oh, shit. Oh, boy. Oh no, a trolley is heading toward three empty trolleys worth $900,000. You can pull the lever to divert it to the other track, hitting one empty trolley worth $300,000 instead. What do you do? I love those braking sounds.
00:09:37
Speaker
You're not gonna do anything and let all three of them crash? Well, I figured, like, the less trolleys, the less problems, right? I got 99 problems and these three ain't none of them.
00:09:48
Speaker
Oh, that's so funny. I'm not surprised. Funny but not surprised. We are at 23%.
00:09:57
Speaker
None of them. Bye-bye.
00:10:12
Speaker
Honestly, I didn't care. It was... Unless it was my money. In which case that's not realistic anyway. I don't have $500,000. Ooh,
00:10:24
Speaker
level 20, Extra Costs. This is an interesting one. Oh no, a trolley is releasing 100 kilograms of CO2 per year, which will kill five people over 30 years.
00:10:40
Speaker
You can pull the lever to divert it to the other track, hitting a brick wall and decommissioning the trolley. What do you do? Brick wall. Brick wall.
00:10:52
Speaker
It's 100 kilograms of CO2 per year. Which will kill 5 people over 30 years.
00:11:01
Speaker
You're putting mass transit, which is a net benefit to all humanity, over 5 people.
00:11:11
Speaker
That's the way I'm looking at it.
00:11:19
Speaker
There's consequences to technological progression. As much as I want cleaner air and stuff, I still wouldn't decommission a trolley for that.
00:11:35
Speaker
Because I think the positive benefits outweigh those five people over 30 years.
00:11:45
Speaker
I'm trying to think utilitarian.
00:11:50
Speaker
Does that make sense? It does. Maybe I don't understand anything. Alright. So the trolley, if you can tell at the back end of the trolley, you see the bubbles coming out? That trolley will release 100 kilograms of CO2 carbon emissions into the air per year, which in turn over 30 years will kill 5 people.
Economic Damage and Trolley Destruction
00:12:13
Speaker
Or you divert it and decommission it by hitting a brick wall, eliminating those five people dying over 30 years. I see where you're coming from. Yeah. Yours makes sense. Keep the trolley active.
00:12:27
Speaker
'Cause if you blow the trolley up, that's that many more people that now have to take the car to work. Yep. Yep. Causing even more emissions. Yeah. Yeah. Okay. That makes sense.
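The "more cars on the road" point can be made concrete with a back-of-the-envelope sketch. Only the trolley's 100 kg of CO2 per year comes from the puzzle; the rider count and per-driver figure below are hypothetical placeholders, not real emissions data, there purely to show the shape of the argument.

    # Hypothetical illustration of the decommissioning tradeoff.
    # Only the trolley's 100 kg/year is from the puzzle; the other numbers
    # below are made-up placeholders, not real emissions figures.
    trolley_co2_per_year = 100.0      # kg CO2/year, as stated in the puzzle
    displaced_riders = 50             # hypothetical riders who would switch to cars
    co2_per_driver_per_year = 40.0    # hypothetical kg CO2/year per new driver

    cars_co2_per_year = displaced_riders * co2_per_driver_per_year  # 2000 kg/year

    print(f"keep the trolley: {trolley_co2_per_year:.0f} kg CO2/year")
    print(f"decommission it:  {cars_co2_per_year:.0f} kg CO2/year from displaced drivers")
    # Under these placeholder numbers, scrapping the trolley raises total
    # emissions, which is the hosts' stated reason for doing nothing.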
00:12:39
Speaker
That's the way I'm going with it. So let's do nothing. Let's keep that trolley on the rails, buddy. 38%, really? I don't think people really thought about that one too much. Don't get me wrong, that stats one I didn't really think of much either, but my brain was still barely awake. Well, there was an unintended consequence on the other side that people never really thought through; it wasn't part of the problem. You know, those unintended consequences are very... ah, crap, I just spilled tea on my dick.
00:13:19
Speaker
Quick, hit him with a trolley. Y'all gonna fucking clip that shit anyway. Oh, um, all right. Yeah, I'm good. That was unpleasant. I was adjusting in my chair and my teacup tipped over. What was I saying? Oh, unintended consequences. A lot of those you're just not gonna know about until they happen.
00:13:48
Speaker
Unintended consequences right there. We just lived it. Yeah. I made tea. It spilt a little bit on my dick.
00:14:01
Speaker
Absurd trolley problem number 21, Reincarnation. Oh no. You're reincarnate... Wait. You are a reincarnated being who will eventually be reincarnated as every person in this classic trolley problem. What do you do?
00:14:19
Speaker
This one's confusing.
00:14:27
Speaker
Because if you kill yourself, you come back as one, and then later on that person dies and you come back as the other. And if you kill all five, you come back as them. Reincarnation makes no sense to me.
00:14:40
Speaker
Either way, if it comes down to it. Oh, man. So they're all going to live eventually through you anyway. That's why I'm going to simplify this. I would pull the lever
Pranks vs Good Citizens
00:14:54
Speaker
and start the chain reaction of being reincarnated into those people.
00:15:02
Speaker
If that makes sense. Yeah, I'm going to do nothing and wipe them out.
00:15:07
Speaker
Okay. Let's see... just do nothing.
00:15:13
Speaker
51%. It was a 50-50 almost. That was right there on the line.
00:15:19
Speaker
I'm not surprised though. That was a little confusing. Well, I think I'm just better than most people. Not better than, like, more important. I'm just a better person. I try to live a better life.
00:15:32
Speaker
Yeah. For others. I think this music is killing it for me. I'm turning it off. Experiment failed. I guess I could hear it.
00:15:43
Speaker
It was too calming for me.
00:15:49
Speaker
All right. Level 22. That was right on the line. That's wild.
00:15:54
Speaker
Harmless Prank. Oh, no. A trolley is heading toward nothing, but you kind of want to prank the trolley driver. What do you do? What? Well, of course you're going to pull the lever. I love a good prank.
00:16:09
Speaker
Oh, exactly. I like a good prank, too.
00:16:14
Speaker
35% of people are like, no, I'm going to be a stick in the mud. Don't invite me to your party.
00:16:22
Speaker
Hey, Zeus didn't like that. There's no horsing around on the trolleys. That's how you lose a leg.
00:16:31
Speaker
Level 23, Citizens. Oh, no. A trolley is heading toward a good citizen. You can pull the lever to divert it to the other track, running over someone who litters instead.
00:16:43
Speaker
What do you do? Death to the litterer. I mean, if that's the only criteria that I have to fucking choose and somebody's gonna have to die. Yeah, I'm pulling the lever.
00:16:54
Speaker
Sorry. Sorry, Mr. Litterbug. You go in the trash.
Eternal Loop and Mercy
00:17:00
Speaker
You go in the trash. 82%?
00:17:05
Speaker
Yeah. Who's the 18% who say, nah, fuck the earth?
00:17:13
Speaker
What is this? Eternity. Oh no, due to a construction error, a trolley is stuck in an eternal loop. If you pull the lever, the trolley will explode. And if you don't, the trolley and its passengers will go in circles for eternity. What do you do?
00:17:28
Speaker
Save them from starvation. That's what I'd do. Save them from starvation? Who says they're going to starve? They're there for eternity. I don't think that's relevant. Does the trolley have infinite food on it?
00:17:42
Speaker
I don't know. I don't know. They're going to have to go Lord of the Flies in that mofo. But they're going to be circling around for eternity. You think they'll die? You think they'll die of starvation?
00:17:55
Speaker
There's no caveat that says they live for eternity.
Ethical Dilemmas and Lifespan Reduction
00:17:59
Speaker
Okay, so... death. They don't ever get to a location, so they don't ever get to get food again. I was thinking about it as existential...
00:18:11
Speaker
dread because of eternal existence, compared to a short-lived life and death where there's a conclusion.
00:18:24
Speaker
You know what I mean? That's where I'm going. That's where my mind's going. I still agree with you. I'm all about exploding it, bringing them to a merciful end.
00:18:35
Speaker
Well, if they're going to live forever, that's different. Would you want to live forever? In a trolley going in circles? But what's the alternative? Death.
00:18:50
Speaker
Yeah. Would you like... I would imagine being on a trolley for eternity going around in a circle would just lead to complete, utter madness. But it's a trolley.
00:19:02
Speaker
I mean, I can always leap out the door and end it myself and on my terms.
00:19:09
Speaker
Fair enough. We're pulling that lever. Oh, we're in the majority. I figured we would be. Well, again, I just feel it's not an eternal life situation.
00:19:24
Speaker
Those people are going to have a terrible, terrible death. There's no bathrooms, dude. Oh, my God. There's no bathrooms on a trolley, bro. I was looking at this as sort of the vampire issue. Would one want to be living forever or would one want to have a normal existence?
00:19:42
Speaker
I'll choose eternal life, please. Not very many people would. That's the thing. A lot of people think living forever would just be maddening and sad and depressing, lonely. And I'm over here like, okay, don't tease me with a good time.
00:19:57
Speaker
I'm just thinking, well, I can really, really do something with eternity. You know what I mean? Wait, like what? I'd be the Dexter vampire, killing off the bad guys. Really?
00:20:08
Speaker
See, I think if I was an eternal being, knowing myself, I wouldn't give a shit about what the humans did anymore. But you got to eat.
00:20:19
Speaker
You may as well eat bad guys. If I'm a vampire, yeah. So yeah, I was just following along with your vampiric slant. I was just using that as an analogy for the whole eternity versus not eternity.
00:20:35
Speaker
Not actual vampire-vampire. Just the question of eternal existence, mortal versus immortal. That was the crux of my whole thing. Again, it's survival instinct.
00:20:50
Speaker
The goal is to not die. Oh oh no
00:20:57
Speaker
That's a tricky one. Ooh, Level 25, Enemy. Oh no, a trolley is heading toward your worst enemy. You can pull the lever to divert the trolley to save them, or you can do nothing and no one will ever know. What do you do?
00:21:13
Speaker
Well, I still got a conscience. I'm pulling the lever. You're pulling the lever? Well, you know what's funny is that nobody will ever know, but they'll know if they can answer this question. I would pull the lever. If there's nobody that's going to die with me saving my enemy, I will save my enemy. I mean, I'm not a monster.
00:21:31
Speaker
I would think that my enemy's the monster. I don't care how much of an enemy they are, how much strife they cause in my life. I'm at that level. So, does that mean you're not pro-capital punishment?
00:21:49
Speaker
I mean, it doesn't stop crime. It doesn't. Yes, it doesn't stop crime. The thing is, people's brains are hardwired certain ways, and sometimes it's a bad way.
00:22:00
Speaker
Bad in the consideration of societal constraints. Yeah.
00:22:12
Speaker
Sorry, popped over to check the comments. What's that? General Lee. Oh, yeah, yeah. I hear that a lot around here, so it gets stuck in my head sometimes.
00:22:24
Speaker
Alright. Level 26, Lifespan. Interesting. Oh no, a trolley is heading towards a person and will lower their lifespan by 50 years.
00:22:38
Speaker
You can pull the lever to divert the trolley and lower the lifespan of 5 people by 10 years each instead. What do you do? I would divert it.
00:22:51
Speaker
I would rather take from five people and shorten theirs by 10 years each than shorten somebody's by 50 years.
00:23:02
Speaker
It sucks either way, but at least give that one person, you know, more potential to live longer, to do more stuff. I think the overall damage done is lessened compared to putting it all on the one person.
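One thing worth flagging before the back-and-forth that follows: in raw life-years the two options are identical, so the disagreement is purely about concentrating the loss versus spreading it. A quick check, assuming the numbers as read on air:

    # Level 26, "lifespan": total life-years lost under each choice,
    # using the numbers exactly as read on air.
    do_nothing = 1 * 50   # one person loses 50 years
    pull_lever = 5 * 10   # five people lose 10 years each

    print(do_nothing, pull_lever)  # 50 50
    # The totals match, so the debate below is really about whether one
    # concentrated 50-year loss is worse than the same loss spread five ways.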
00:23:19
Speaker
What do you mean? Yeah, I mean, I see. Would you know that you're not going to get an extra 10 years, though? But it's not like anyone... I'd assume none of them know.
00:23:30
Speaker
You're the only one to know. But those five people are shortened up by 10 years each, right? Yeah, but how many people in their lives are affected by those earlier losses, versus how many people in the one life?
00:23:48
Speaker
Yeah, we're going to experience loss like that no matter what. Like, no matter what. I know that sounds crass, but I'm okay with shortening five people's lives by 10 years instead of one person's by 50 years.
00:24:05
Speaker
I don't know on that one. Because I'm, like, spreading the suffering instead of putting it on one person's shoulders. Well, that's what I mean. You're spreading it out. More people get to suffer.
00:24:19
Speaker
Well, if nobody knows, if they don't know, there's really no... I was thinking about the extended damage, like how many of their family members... there's a lot more damage across five lives than just one.
00:24:33
Speaker
Yeah, but to them, it's just, you know, the natural course of things because we all accept the fact that we die.
Free Will and Determinism
00:24:40
Speaker
The time at which we all die, we don't know. We also accept we never know when we're going to die.
00:24:46
Speaker
So we don't know if we're getting 10 years shaved off or not.
00:24:51
Speaker
Or we don't know if we're getting 50 years shaved off. I'm like, what's she doing? Fucking biting the shit out of me. That's all she likes to do.
00:25:01
Speaker
Sleeping or biting? Sleeping or biting? the The life of a kitten. I live with an apex predator.
00:25:10
Speaker
So you're killing the one. So let's see what's going on. We're not pulling the lever for Michael. And 38% of the people agree with you. Ooh. I'm in the minority.
00:25:22
Speaker
You're in the minority on this one. Interesting. I just think that the damage spread out is worth that one person.
00:25:35
Speaker
I think, yeah, but I think the concentration of suffering would be greater on that one person. That's why I'm like, lessen the suffering by spreading it out more, by only lessening 10 years anyway.
00:25:52
Speaker
Look at it like this, right? Okay, so you lessen that person's life by 50 years. What's the average lifespan? 70-something, 76? Yeah. Roughly. So now he dies at 26. That sucks.
00:26:06
Speaker
Fewer people are going to be affected in the long run by that one loss. Yeah, but that's like 50 years. Well, they don't remember shit like old people do.
00:26:21
Speaker
I mean, sure, that whole "either die young or live long enough to become the monster" thing. What if he was supposed to live until 53? Now he dies at three years old. That's so young.
00:26:32
Speaker
I mean, his parents probably aren't even all that attached to him yet. Think about it.
00:26:43
Speaker
That's funny. Just saying. That's morbidly funny. Three years. He's not even... He's still got the new on him. Exactly. Exactly.
00:26:56
Speaker
He's learning stuff. He's pooping his pants still. Can I turn this in for another one? 27, The Machine. The Machine? I don't know. It says The Machine.
00:27:13
Speaker
Oh, no. A trolley is heading toward five people. You can pull the lever to divert it to another track, sending the trolley into the future to kill five people 100 years from now. What do you do?
00:27:26
Speaker
What? I'm not pulling... I'm doing nothing. I'm just going to let it... 'cause I don't fuck with that fucking time travel shit. Ha ha ha.
00:27:37
Speaker
No paradoxes for me, please.
00:27:43
Speaker
This is a good one, then. Wow. Yeah.
00:27:50
Speaker
I'd rather live... I would rather live with... Here's the thing: as you send it to the future, it has consequences for people you'll never know or hear about. Uh-huh. And I know that's... But 100 years in the future, am I famous for killing the five people in the future?
00:28:06
Speaker
Or infamous? This asshole's the guy that did it. Yeah, it turns out those five people end up being, like, potential heroes. And you fucked it up. Yeah, and you fucked it up for them.
00:28:25
Speaker
And they send a robot back in time to kill you, Sarah Connor. But yeah, five people die either way. I think I'd send it to the future. Because those consequences don't affect my timeline.
00:28:45
Speaker
And I understand that. Man, I have a hard time killing random people I don't know. Wait, what? Hold on. Allegedly.
00:28:56
Speaker
Allegedly. The way I said it came off so, so bad. I've done that before. In reference to this trolley experience, the idea... I think I would, oh, man.
00:29:14
Speaker
I feel like it would weigh on my conscience more not knowing who I killed and the potential damage that I did, and not being able to at least maybe help with the potential damage that I did.
00:29:28
Speaker
So I think I would have to kill the five people in the present because even though I'm going to witness the carnage and experience the carnage, at least I'll be able to sit there and help it. I don't know.
00:29:40
Speaker
That's my thought process. Because you're right, either way five people are dying. No body, no crime. I think at this point, it's all about what one can live with. So we're going to do nothing. So you're going to the future. So let's pull the lever.
00:29:55
Speaker
Oof. 100 years down the road. You know what? You were in the majority. I'm not surprised. The other half of that is no body, no crime.
00:30:08
Speaker
I leave five bodies here, I could be prosecuted. That's the point. Five bodies 100 years from now? That's a galaxy far, far away. But if they got time travel, who knows? They might have time cops that come back and rush your ass.
00:30:22
Speaker
Well, either way, I'm screwed. That's my best option.
00:30:27
Speaker
And again, no consequences in my time. Ooh, level 28 free will.
00:30:34
Speaker
Plus they have better health care 100 years from now, I hope.
00:30:40
Speaker
True. Maybe they have like, maybe they're eternal by then. They get in the resurrection machine.
00:30:49
Speaker
They're reincarnated. Free will. It's about your definition of free will. Oh, no, it doesn't matter. No, that's nihilism. Oh, no. A trolley problem is playing out before you.
00:31:04
Speaker
Do you actually have a choice in this situation? Or has everything been predetermined since the universe began? I don't like it. So, first of all, I don't like, like, I'm a determinist. I make no bones about it.
00:31:18
Speaker
But I'm not a predeterminist. I don't like when people confuse determinism with predeterminism. Predeterminism is the idea that the quote unquote universe already has a goal and it's just connecting those dots to get to the goal.
00:31:32
Speaker
Determinism is, there's just a chain of events with no predetermination. There's no predetermined outcome. What happens, happens. I know it sounds random, and it's not, but it's not predetermined.
00:31:45
Speaker
I don't... Anyway, so... I have a choice. That's me. You have a choice. A trolley problem is playing out before you, so it's the original. So you're not the person at the Switch. You're not the people on the rail. You're not in the trolley. You're outside of it.
00:32:02
Speaker
But... then I have a choice. I watch it, or I just walk away from it. In essence, so yeah, I guess you do have a choice at the switch. It's basically, if you had hindsight 20/20, would you have changed your mind?
00:32:17
Speaker
I mean, I guess if I did, because that's what the commonly accepted libertarian free will is, is like, if you could go back and change it. Yeah, I could have a choice to change my mind.
00:32:32
Speaker
But I don't think, because the action has already happened, we didn't make that choice in the moment. We just reacted to stimuli. Anyway. Or has everything been predetermined?
00:32:49
Speaker
I don't have a choice. Anyway, you said you have a choice, correct? I think so. Okay, we're going to go with Michael. I'm going to try to walk away from it, right? Well, I think, Michael, I think you're going to be in the majority. So...
00:33:07
Speaker
But see, the thing is, when we actually do this specific trolley problem, basically the answer is going to come down to, do you already believe in free will or not?
00:33:25
Speaker
I think that 64% is a lot lower than I thought for this question. So I'm kind of shocked at the results.
00:33:41
Speaker
Not me. People don't want to feel helpless. They want to say they have a choice. Oh, yeah. It's about control. Absolutely. Yeah. Yeah. We don't like to think that we operate without any control or choice, because that would frighten the fucking shit out of us.
00:33:59
Speaker
I think I'm OK with the utility of people living in their illusion of free will, if it makes them be better people. That makes sense.
00:34:10
Speaker
My whole theory is based on life's choices. Well, we make choices. I'm not denying we make a choice.
00:34:22
Speaker
I'm just saying those choices aren't free will. They're actually constrained. They're hindered. They're chained and they're blocked and they're walled.
00:34:36
Speaker
We think they're free because we just don't... There are other choices that we're not aware of, that are not available to us because of our ignorance of them. You know what I mean?
00:34:49
Speaker
Like, if I had the choice of A, B, C, or D, but the other 22 choices in the rest of the alphabet I'm unaware of...
00:35:01
Speaker
Who says I have a free choice to make whatever choice? 'Cause I'm unaware of all the choices I have. Anyway. Yeah. Anyway, et cetera. I killed the show with that one.
00:35:16
Speaker
Anyway. There's nobody watching anyway. Oh, I know. It's a little slow; it's the people that were... well, it's a slow morning coming off of Labor Day weekend.
00:35:34
Speaker
Which one was... whatever number we are. We're going to 29 right now. Oh, great. Oh, shit. Absurd trolley problems. Congratulations, you have solved philosophy. Kill count: 64. That was all of it.
00:35:49
Speaker
What? You've solved philosophy? I didn't know that was all of it. Hold up. I'm a little disappointed.
00:36:00
Speaker
That was only 29 of them? No. No, no, no, no.
00:36:05
Speaker
Ah, well, fuck. Me? Well, that's boring.
00:36:13
Speaker
Have to hunt down more of those. Yeah, I could have sworn there was, like, more on there. I don't know. You said moron. I did. I did.
Personal Experiences and Societal Norms
00:36:27
Speaker
Man, I sit there and we got... Oh, man. Well... I look like I woke up 20 minutes before the show. Dude, so do I. How'd you...
00:36:37
Speaker
How'd you wake up 20 minutes before the show and still get breakfast? I've got a Sue. Oh, she's fucking amazing.
00:36:49
Speaker
Fair enough. To be fair, I was up several times before the show. Like, it started at 4. I just could not sleep last night for some reason.
00:36:59
Speaker
Well, I'm sure you got some anxiety on you. You got a wedding coming up. You got a lot of shit going on. How's the wedding plans? Uh, she's been driving herself crazy with the flowers the last couple of days. She's doing the flowers herself.
00:37:11
Speaker
Okay. She's done it for other weddings, so she knows what she's doing. She knows how she wants it done. So she said, why have someone else do what I know I can do myself. That makes sense. Cheaper too.
00:37:23
Speaker
A little frugality there is right. Was she a florist? No, she just has a flair for that type of thing.
00:37:32
Speaker
Okay. Um, all right. What's that? Designs, color schemes, things like that. Okay. You know, it's weird. Some people have to go to school to learn stuff like that, when some of us were born with, like, different talents.
00:37:47
Speaker
But we still got to go to school to get that big piece of paper. But you don't have to. I don't think for a florist you really do, no. But it's just having an eye for things.
00:37:57
Speaker
It's putting things together in an aesthetic way that is pleasant. Really, that's all it is. It's a visual art. Yeah.
00:38:10
Speaker
You don't have to go to school to learn how to paint. I say if you have to go to school to learn how to paint, maybe you don't know how to paint. Yeah. Rembrandt didn't go to a school to learn how to do what Rembrandt did. He just did it.
00:38:22
Speaker
That's kind of the way I was with my photography. Like, I was just shooting photos. People were like, man, that's pretty good, you should go to school for it. I was like, all right, cool. That does sound practical.
00:38:33
Speaker
But when you go to college, like, I already have my associate's. I was working on my bachelor's. But when you go back to school for that, especially nowadays... The last time I went to school, I was in the military. But this was, you know, I'm going through school and it's teaching me to be a company man, get a job at Sears to take family portraits. And then it's like, I had to do more writing, which I'm fine with, but it's stuff I'm not interested in. It's not shit I want to do.
00:39:02
Speaker
Right. You know, yeah, it's like, I just pick up my camera and go use my eyes and find art. That's all I need to do. You go to school to kill art. Yeah.
00:39:14
Speaker
We don't teach people how to create. Or, no, we don't teach people to create. We teach people how to create.
00:39:23
Speaker
Teach yourself to do your creation. I think that's... their creation, or their version of it, or idea of it. I think this idea of teaching people how to create is a bad thing.
00:39:38
Speaker
It's one thing if it's not an artistic creation.
00:39:43
Speaker
Like, this machine works this way. Sure, you can iterate on it. You can alter it. But at the end of the day, this machine does this job. There's not much you can do about that. My head just exploded.
00:39:55
Speaker
So hear me out on this. You have an artist, someone who takes the brush to the canvas, and he puts on that canvas everything from his own head.
00:40:09
Speaker
His own experiences. Nobody's taught him shit. It's just from his own head. Then you have person two, who has learned what to put on the canvas. Put this house here.
00:40:20
Speaker
But you know, you're sitting there watching Bob Ross and he's just mimicking Bob Ross. And then you have AI creating art mimicking the mimickers. You see where I'm going with this?
00:40:33
Speaker
Yeah. Art's been sold out. Yeah. It's like, even within art, we've engaged with the simulacra.
00:40:44
Speaker
You're told how to create and what to create. I think this is a phrase I heard, a quote, but I'm paraphrasing; I forgot who it's from. But I told Michael the other day: a talented person will hit the target. No, well... A talented person will hit the bullseye that nobody else can.
00:41:07
Speaker
A genius will hit the bullseye nobody else can see.
00:41:12
Speaker
I think that's kind of what we're talking about. Well, I have a degree in education, so it hurts me to say this, but I will say it every time. School is not for smart people.
00:41:25
Speaker
It's for obedient people. School is where you go to become a sheep.
00:41:31
Speaker
I think some schools can, like, foster creative thinking and foster, you know, independent thought. Very few, very few.
00:41:42
Speaker
I think in some ways, some teachers want to push independent critical thinking, but the structure is so rigid, it's hard. I think smart people can go to school.
00:41:54
Speaker
I think either they're frustrated with it, or they buy into the obedience and they end up sticking their head in the sand. But yet they do math very well. You know what I mean? Very few of those smart people innovate and truly change something.
00:42:13
Speaker
Like stand something on its head and really make a breakthrough in anything.
00:42:20
Speaker
Computers make the breakthroughs now. You think so? When it comes to math, science, STEM. Excuse me.
00:42:31
Speaker
Hmm. I mean, in art sort of. I mean, the AI thing. You can plug in any 10 parameters you want. Any 50 parameters. And an AI artist will create the thing you want to see.
00:42:42
Speaker
In the style you want to see it in. Well, yeah. Mimic of mimics. And that's kind of neat. But at the same time, it kills art. Because when people don't need to learn art, they can just make art through a computer.
00:42:55
Speaker
That destroys creativity. Plain and simple.
00:43:02
Speaker
Damn. Yeah, yeah. You're not late. We're still live. We're still kicking. We're done with the trolley problems because unfortunately there were only 29 of them total. Oh, shit. Wrong account.
00:43:18
Speaker
Who just outed their troll account?
00:43:29
Speaker
What's up, Shaman? How you doing? I didn't know it was him the first time. I should have realized it says "Shaman said." Right. Mr. Dr. Professor Reverend Shaman said.
00:43:49
Speaker
Yes, I have a few myself. I don't. I'm just me all the time. If I'm going to troll, I'm going to do it as me. Hide behind anonymity, boys.
00:44:03
Speaker
Ah, could be dumb. You missed it earlier, Shaman. I spilled hot tea on my dick.
00:44:10
Speaker
Live on air. That was arguably the best part of the show. Oh, man. We ran out of trolleys. That's a bummer.
00:44:22
Speaker
So, well, let me grab my phone and see if I can find a couple other ones. Okay. Well, let's take a break real quick. I'm going to go get some more beverage
The Moral Machine and Self-Driving Cars
00:44:30
Speaker
and we'll be back after this short pajama pants break.
00:44:47
Speaker
[Break music plays]
00:48:59
Speaker
Good morning. Again.
00:49:03
Speaker
Man. Did you see that thing I sent you? I did. I did. I got it brought up, too. It's similar to the trolley problem thing, but a little bit different. This is actually a site from MIT.
00:49:18
Speaker
I like that. How many of them are there? What's that? I don't know how many of them there are. I'm gonna read the little intro. Welcome to the Moral Machine, a platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.
00:49:37
Speaker
We show you moral dilemmas where a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians. As an outside observer, you judge which outcome you think is more acceptable.
00:49:51
Speaker
You can then see how your responses compare with those of others. If you're feeling creative, you can also design your own scenarios for other users to browse, share, and discuss.
Ethics in Programming AI
00:50:02
Speaker
So yeah, this is similar to the trolley problem, just with self-driving cars. It takes your participation out of it. You're just an observer. Yeah.
00:50:14
Speaker
Well, yeah, you're judging it. You're not, yeah, I am. Well, I don't know. I skipped ahead and I saw the first one. I was like, well, you're still making the decision. So let's bring the screen up real quick.
00:50:26
Speaker
What you're choosing is what you think is the most... Acceptable. Yes. Which is basically what the trolley... Okay, anyway. More or less... So... It takes our actual participation out of pulling that lever. That's all.
00:50:39
Speaker
Oh, my gosh. That is a lot of instructions. I don't want to read all the instructions. Let's just start judging. I don't think we need to learn how to judge. No, I already know how.
00:50:51
Speaker
What should the self-driving car do? Oh, there's just descriptions. Yeah. All right, on this one: In this case, the self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead.
00:51:09
Speaker
This will result in two dead cats. Then there's the other option. In this case, the self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane.
00:51:23
Speaker
This will result in... a dead baby and a dead large woman. Why'd they put large woman? What's that? I don't know. Right, some spam in Bangladesh, I don't know. Moral Machine's over here body-shaming people. Oh, there's no... there's no pick-you option. Oh, and I'm seeing, people, if you... yeah, if you click on the box, it'll do it. In this situation, I'm, um...
00:51:53
Speaker
Did it give any information on what you chose? No. We might have to go through all of them. How many does it say up there, 18? 13. Okay, so we have 13 to do, and then it'll tell us what we think.
00:52:05
Speaker
It'll judge us on our judging. Okay. So option A. In this case, the self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead.
00:52:18
Speaker
This will result in one dead male executive, one boy, and one large man. They are getting more specific on what kind of people they are. Shaman, the brakes are failing.
00:52:30
Speaker
You can't hit the brakes. That's the dilemma. Yeah.
00:52:37
Speaker
But, you know, I mean, I would, yeah.
00:52:43
Speaker
What is... what, a large man? Animals are innocent. People are born with sin. Save the animals. You're going to have to convince me sin is an actual thing that exists.
00:52:54
Speaker
All right, moving on. So that was option A. Option B. In this case, a self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane.
00:53:07
Speaker
This will result in the death of one female executive, one girl, and one large woman. Oh, this is man versus woman. Wow. This is going to show gender bias.
00:53:22
Speaker
That female executive saves the company money because she doesn't get paid as much as the male executive does. That's a fucked-up way to look at it, to save the woman for company profits.
00:53:35
Speaker
I'm just saying. I mean, yeah, cost analysis. Yeah, I would. Cut the wheel hard as fuck. Yeah. Yeah, I think I'm going to take out the male executive. Yeah, that figures.
00:53:49
Speaker
Well, women are more apt to do things um for the greater good.
00:53:57
Speaker
Cut the wheel hard as fuck.
00:54:03
Speaker
It's a self-driving car. There's nobody driving.
00:54:09
Speaker
This is Elon's mess. He started this. Yeah. Is that the healthcare executive yet? And the car's name is Luigi. Too soon?
00:54:23
Speaker
What's our number three? All right. Description one. In this case, a self-driving car with sudden brake failure... you know, I'm not going to read all that.
00:54:34
Speaker
The dead: three men, one boy, one girl. Note that the affected pedestrians are flouting the law by crossing on the red signal. Well, fuck around and find out, right?
00:54:48
Speaker
And this self-driving car has people in it. Or you can...
00:54:56
Speaker
In this case, a self-driving car with sudden brake failure will swerve and crash into the concrete barrier. This will result in the death of three elderly men, one man, and one elderly woman
00:55:08
Speaker
They are elderly. They are elderly, but the people in the crosswalk are literally breaking the law. Well, the children are following the adults' lead.
00:55:21
Speaker
I would have to swerve.
00:55:29
Speaker
Normally, I would say the elderly people are slow and dangerous at the wheel, but they're in a self-driving car. I like Shaman's comment. More likely, if it's a self-driving car, they got better safety.
00:55:41
Speaker
Yeah. Well, according to the scenario, they're going to die. But they're elderly. Old people piss me off. Hit that barrier. Hit that barrier. That's what we did.
00:55:53
Speaker
Oh, this one's an old man or an old woman, I bet.
00:55:58
Speaker
Man. Holy crap. They're really... I wish there was a woman on the panel right now doing this with us. In this case, the self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead.
00:56:14
Speaker
This will result in the death of one elderly man. In this case, the self-driving car with sudden brake failure will swerve and crash into a concrete barrier. This will result in the death of one elderly woman.
00:56:28
Speaker
What I need to know is who's more sad? Who wants it more?
00:56:36
Speaker
That man. Put him out of his misery. I'm thinking he would want her to live anyway.
00:56:45
Speaker
My problem is knowing old people and new technology. My question is, is she sitting there behind the fucking wheel going, I don't know how to make my own order at the kiosk, I don't know what to do?
00:57:01
Speaker
Maybe she shouldn't be driving the car. Which one you going with? Hit the old man? I'm hitting the old man. You're hitting the old man. Okay, we'll hit the old man. I was going to swerve and take out the old lady.
00:57:17
Speaker
Does the old lady have a name?
00:57:23
Speaker
No. Just curious. No. That was no personal bias on that one. If there's a personal bias, I will let you know. I'll let you know if you ask.
00:57:36
Speaker
In this case, the self-driving car... It wasn't my mother-in-law, ex-mother-in-law, if that's what you're worried about. If that's what you're wondering, it wasn't.
00:57:47
Speaker
In this case, the self-driving car with sudden brake failure will swerve and crash into the concrete barrier. This will result in the death of one woman in the car.
00:57:59
Speaker
Second scenario. In this case, the self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead. This will result in one dead male doctor and one dead homeless person. Note that the affected pedestrians are flouting the law by crossing on the red signal.
00:58:19
Speaker
Is the doctor running from the homeless person? Is the homeless person gibbering and freaking out and trying to throw feces on him? Or maybe the homeless man's in the need of medical assistance and the doctor's trying to help him.
00:58:32
Speaker
He's having a bad acid trip.
00:58:36
Speaker
Last thing you ever need on a bad acid trip is a doctor.
00:58:41
Speaker
That would make it worse. Just the doctor's bag is you
00:58:48
Speaker
I'm about swerving and hitting the barrier.
00:58:53
Speaker
I mean, if I'm the one behind the wheel, that's what I'm going to do anyway. But again, that's not the case. But, I mean, it's a net one life.
00:59:04
Speaker
Good on that female driver for letting the car do the work for her. So swerve and hit the barrier? She could be poor at it anyway.
00:59:13
Speaker
What's your final decision? Swerve. I just wanted to make a dig at women real quick just to get some comments going, but nothing's happened. If the homeless guy has a funny sign, then crash the car. Some of them do have some funny signs out there. Please don't kill me.
00:59:36
Speaker
Please swerve. This is kind of where I was going with it. Some old people shouldn't be driving anymore. I'm just saying.
00:59:46
Speaker
I'm just saying. All right. Situation scenario one. In this case, the self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane.
01:00:00
Speaker
This will result in one dead girl, one dead dog, and one boy. Note that the affected pedestrians are abiding by the law by crossing on the green signal.
01:00:13
Speaker
Scenario two. In this case, the self-driving car with sudden brake failure will continue ahead and crash into a concrete barrier.
01:00:24
Speaker
This will result in one dead dog, one boy, and one female.
01:00:33
Speaker
So, I just want to say real quick: these are questions where, in theory, when we answer them, this is what these cars are going to get programmed with to make these choices.
01:00:49
Speaker
That's nuts to me.
01:00:53
Speaker
That puts every self-driving car out there as a, oh man, I am so, yeah, a face recognition career database.
01:01:05
Speaker
Yeah. So I'm all about hitting the barrier. Same. Yep. Adding the executive doesn't matter. Yeah.
01:01:19
Speaker
Especially... Oh, let's see. Description one. In this case, a self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead.
01:01:32
Speaker
This will result in one dead female athlete, one dead woman, and one dead male athlete. The other one: In this case, a self-driving car with sudden brake failure will swerve and crash into a concrete barrier.
01:01:47
Speaker
This will result in one female athlete dead, one woman dead, and one male athlete dead. It's the same. It just doesn't make sense to me. These are foolish. Yeah, so I would have to go swerve, unless you just want to commit murder.
01:02:04
Speaker
Right. Yes, Shaman's got it. What does Shaman say? Healing people.
01:02:14
Speaker
I know. I know. Why do we use self-driving cars?
01:02:22
Speaker
I know. Oh, man. These cars look empty. The self-driving car would know when it needs brakes, right? Well, this is in case, you know, they have a brake failure. What do they do in a situation like this?
01:02:39
Speaker
One cat versus one person. I'm on team people. In this case, a self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane.
01:02:50
Speaker
This will result in one dead cat.
01:02:55
Speaker
In this case, a self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead, and this will result in one dead woman. Yeah, I'm... Note, the woman has 42 cats.
01:03:09
Speaker
She's very sad. She hasn't had a date in 14 years.
01:03:14
Speaker
Ooh, that almost makes me want to save the cat.
01:03:19
Speaker
Nah, we'll... We'll save team people.
01:03:25
Speaker
She's lived her life. She's good. It's that cat's turn.
01:03:31
Speaker
Well, cats got eight more lives.
01:03:36
Speaker
In this case, a self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead. This will result in dead one pregnant woman, one criminal, one male executive, and one woman.
01:03:51
Speaker
But they're flouting the law. Yeah, note that the affected pedestrians are flouting the law by crossing on a red signal. That'd make them all criminals now. I like the law flouters myself.
01:04:04
Speaker
Me too. Jaywalking. I have no problems jaywalking. In this case, the self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane. This will result in one dead pregnant woman, one criminal, two male executives, and one woman. The only thing different is there's an extra male executive in a car.
01:04:25
Speaker
Note that the affected pedestrians are abiding by the law by crossing on the...
01:04:34
Speaker
Well, the other ones, just because they're lawbreakers, that isn't enough. But I mean, there's less of them.
01:04:41
Speaker
Although two male executives might be a good idea. That's what I was thinking. I'd rather... the more executives you can take out with this car. Yeah, that's where I'm going with it.
01:04:59
Speaker
Cost-effectiveness, save that company some money. They'll bring in two new executives and pay them less. Apparently, I'm very biased against billionaires. How many lives does that cat have left? Yeah, um, let's see. What number was it?
01:05:14
Speaker
This was number 10. We've done about, like, five cat questions, somewhere between three and five. So he's about halfway there. About half left, I'm guessing.
01:05:25
Speaker
We'll give him four lives. Four lives left on that cat. In this case, the self-driving car with sudden brake failure will continue ahead and crash into a concrete barrier. This will result in two women dead, one male executive dead, and two female executives dead.
01:05:45
Speaker
Then the second scenario. In this case, the self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane. This will result in four homeless people and one woman.
01:05:58
Speaker
You know what? That car full of executives are going to get it.
01:06:08
Speaker
I find it weird that they say homeless people, but they never genderize the homeless. Because homeless aren't real people. Like Canadians.
01:06:31
Speaker
Which team are you on, team homeless or team executive? I mean, yeah, I'm hitting the barrier. Me too. Me too. But also, like, what kind of homeless are they? Are they homeless or non-binary?
01:06:59
Speaker
That was my question. Like, they've genderized all the other potential victims, but the homeless remain just the homeless. People could be paying attention better and jumping behind that barrier.
01:07:13
Speaker
Right. In this case, a self-driving car with sudden brake failure will swerve and crash into a concrete barrier. This will result in the death of two men.
01:07:25
Speaker
Let me guess. Two women. Two homeless people.
01:07:31
Speaker
But again, like, are they the gibbering madmen? I don't know. Or do they just got some dreads, waiting for the next show to come through town and pick them up and sweep them away?
01:07:43
Speaker
They're probably chilling on the barrier with the boombox and they're called fucking street thugs.
01:07:51
Speaker
Good callback. Yes. Longtime watchers of the two of us know exactly what that meant.
01:08:01
Speaker
We go inside baseball on our shows. oh yeah, yeah. He can't afford gender. We do another show every Thursday, same show Thursday. And we do another show on Friday, 8 o'clock at night.
01:08:12
Speaker
Come out and see us sometime. The reason why homeless people are not gendered is because they're not restricted by gendered bathrooms, so their gender is up in the air.
01:08:24
Speaker
Word. Our gender is defined by the bathroom which we use now. They lost their genders with their homes.
01:08:40
Speaker
Government's out here fucking repossessing genders.
01:08:48
Speaker
All right. I'm sorry. I'm team homeless right now, man. Motherfucking Sean, you're going to kill me, man. I almost died from coughing on that shit.
01:09:00
Speaker
You're killing me. Michael, you taking out the homeless?
01:09:06
Speaker
In every scenario. Okay. In every scenario.
01:09:17
Speaker
Fuck you, homeless. In this case, the self-driving car with sudden brake failure will swerve and crash into the concrete barrier. This will result in the death of one male athlete. One dead male athlete. I just want to caveat something.
01:09:32
Speaker
I'm for the homeless, like, getting homes and jobs and stuff, not rounding them up, not destroying
01:09:43
Speaker
camping areas and whatnot. I know it sucks, but passing the buck doesn't do anything. Safety of numbers. Yep. In this case, the self-driving car with sudden brake failure will continue ahead and drive through a pedestrian crossing ahead. This will result in one dead man. Note, the affected pedestrians are flouting the law by crossing on a red signal.
01:10:10
Speaker
Shaman says, this is the athlete LeBron. I say, if it is, hit him. I only know who LeBron is because he's in the news a lot. I don't know who Rampage Jackson is.
01:10:26
Speaker
Okay. But I'm for taking out the barrier.
01:10:31
Speaker
Flouting the law. Yeah. So here's the thing when it comes to, quote unquote, flouting the law. I'm sorry, but if I'm driving, even though I have a green light to go through, and somebody's crossing the crosswalk, if there's nobody behind me, I'm going to stop.
01:10:54
Speaker
If there's somebody behind me, I'm going to try to swerve. My goal is not to hit nobody, no matter whose fault it is. I don't want to cause damage. If I can stop it, regardless, I'm going to fucking try to stop it.
01:11:12
Speaker
Rampage Jackson's son just did some messed-up stuff to a wrestler. Are you talking, like, professional wrestling, like WWE, or real wrestling like in the Olympics?
01:11:26
Speaker
Where's Glick at?
01:11:29
Speaker
I'd like to take a good ribbing to wrestling. So are we killing the dude in the crosswalk or not? I mean, again, if it's LeBron James, yes, hit him.
01:11:43
Speaker
Let's say, okay, we'll say it's LeBron James. We'll hit him. Ooh, last one.
01:11:53
Speaker
In this case, the self-driving car with sudden brake failure will continue ahead and drive through the pedestrian crossing ahead. This will result in two dead boys, one dead girl, and two dead men.
01:12:11
Speaker
Or, in this case, the self-driving car with sudden brake failure will swerve and crash into a concrete barrier. This will result in the deaths of three elderly men, one woman, and one man.
01:12:23
Speaker
Swerve. That's what I was thinking. Excuse me.
01:12:31
Speaker
Aim of the study. Would you like to help us better understand your judgment?
01:12:39
Speaker
Who gets this information? This is MIT. Oh, shit. Oh, shit. This is interesting. Okay, I'm going to do this. How old are you? 48.
01:12:52
Speaker
Highest level of education? College. What is your gender? Male. Annual income?
01:13:02
Speaker
Everybody close your eyes.
01:13:08
Speaker
I like being a vet sometimes. Not religious. Ooh. See, I don't like this conservative-progressive one. That is so binary.
01:13:20
Speaker
From your point of view, how important was each factor in your judgment? Drag the sliders. Ooh. It says to drag the sliders to where you think they should be.
01:13:34
Speaker
I'm more pro-young. Definitely humans. I like how they spell humans. I'm happy about that.
01:13:42
Speaker
Save more lives. Fitness. Was fitness a preference for you? I mean, not particularly, no. Me either. I think that's just body shaming at that point.
01:13:55
Speaker
Protecting passengers. Less so. Social value preference.
01:14:05
Speaker
Executives over the homeless. I'm going to... it shows, like, a doctor and a criminal.
01:14:15
Speaker
I'm going to be more on the doctor side, but there's caveats. It could be bad doctors. Upholding the law. See, that doesn't matter to me. Me either. It does not matter, yeah.
01:14:26
Speaker
Law doesn't mean something is or isn't moral. Shit changes all the time. Avoiding intervention. That wasn't a factor. I'm just going to put that in the middle. Gender preference, again, in the middle.
01:14:40
Speaker
Describe any... What's your gender? Homeless. Yeah. Oh, I'm gonna put that. Please describe any other rules here.
01:14:54
Speaker
Are all homeless?
01:14:57
Speaker
Genderless in this study?
01:15:07
Speaker
Found that to be a one
01:15:17
Speaker
How willing are you to buy a self-driving car? Very little. To what extent do you believe that your decisions on Moral Machine will be used to program... to what extent do you believe that your decisions... I don't know, in the middle, I guess.
01:15:31
Speaker
To what extent do you fear that machines will become out of control? Very little.
01:15:40
Speaker
I think that's why they were talking about the large man and the large woman. Yeah. To what extent do you feel you can trust machines in the future? Um, again, I'm neutral about that.
01:15:53
Speaker
Okay. Here's the results. Most saved character: the little girl. Most killed character: the old man. Saving more lives... um... oh, does not matter.
01:16:09
Speaker
Um, protecting passengers. Yep, does not matter. This is programming AI as we speak, Shauna says. Yeah, this is what our answers were. This is what others were.
01:16:23
Speaker
Upholding the law. Wow, I'm not surprised.
01:16:29
Speaker
Avoiding intervention. That's okay. Right about where I want to be. Gender preference. More on... yeah, okay. Species preference. Humans, makes sense.
01:16:41
Speaker
Age preference makes sense. Fitness preference. I mean, yeah, it's about right. Right about there in the middle. That's close. Social value preference. Oh, shit. I guess I am more for the disobedient motherfuckers.
01:16:58
Speaker
AI scores you, but you can scare it. Opens a new tab. Spook the machine. Our game about charity dilemmas. Oh, my God. There's more dilemmas, but we're not going to get into it right now.
01:17:10
Speaker
Yeah. You might have to try that next. I guess so. Go to the next one. I guess I am a criminal. A smooth criminal. You criminal.
01:17:21
Speaker
Yeah. We were programming the AI.
01:17:28
Speaker
Yeah. Basically, in a way, yeah. Yeah, Shaman. Facial recognition career database, because otherwise it's going to start killing executives left, right, and center. Now, I'm sure if an army of trolls went in there and corrupted the data, they probably wouldn't go, yeah, we're going to use this now.
01:17:49
Speaker
It's just kill grandmas all day, every day.
01:17:56
Speaker
Don't kill grandmas. No, we shouldn't. We shouldn't kill grandmas. Ah, you beat me to it. There you go. Nine out of ten grannies approve. Man.
01:18:09
Speaker
Did you have anything else you'd like to share with them this fabulous morning? No, man. I think we've covered it. I do, too. I do, too.
01:18:21
Speaker
Then on that note, what is today, Tuesday? I'm not sure if Glick has a guest tonight or not, or what he's doing tonight. I know his schedule's a little weird playing solo dad at the moment.
01:18:34
Speaker
Um, let's see. Wednesday.
01:18:38
Speaker
Wednesday. Wildcard Wednesday. You're going to be there Wednesday? Uh, yeah. You might want to get with Brittany, because she was trying to come up with some ideas. I suggested another Mount Rushmore. I won't be there Wednesday. Right on. I'm not sure if Glick can be there either. Look out for those. Good times, yeah. And Thursday, as of now, Wally does not have anything scheduled. If he does not, Brittany and I are going to do an impromptu.
01:19:10
Speaker
Not sure what we're going to do yet, but probably stoner-related. So, Michael, if you're free Thursday, want to pop up? Maybe I'll crash your party, yeah. Maybe, but are you still doing trivia on Thursdays, or are you taking a hiatus for right now?
01:19:24
Speaker
Killing us, so we're sitting out until fall. What's that? Summer was killing us, so we're going to sit out until fall. Probably like a week or two after the wedding, we're going to start up again. Is it not very busy?
01:19:35
Speaker
Yeah. People are out doing stuff outside. It makes sense. Yeah, that's... yeah, that's kind of what I was thinking. Okay.
01:19:46
Speaker
Sorry, I walked all over you. No, no, no, you didn't. And then Friday night, of course, you're back here with... oh, you know, Thursday morning, you're here again with Chronic Contemplations.
01:19:57
Speaker
Going to be talking about the Milgram experiment this Thursday. Which, um... you thought I was going to be all, like, irked out about it? I'm not. I thought it would bother you.
01:20:09
Speaker
No, it didn't, actually. I found it pretty interesting. It didn't bother me... well, maybe a little bit. It didn't bother me as much as the Stanford experiment, though.
01:20:21
Speaker
Then, of course, Friday, you're back here with us again for Nonsense and Chill, and we're talking stoner movies. Michael's going to have a couple of his favorites. Uh, Brittany's going to be joining us because, of course, she's the stoner on the network. She's going to have a couple, I'm going to have a couple, and then we're going to do kind of a mini deep dive into Dazed and Confused after that. So if you guys haven't seen it, or haven't seen it in a while, and you want to join us in the chat about it,
01:20:47
Speaker
give it a watch. I believe it's on Netflix. It might be on Tubi, too. So, yep. And then, of course, Saturday... you got Unnecessary Roughness. That's on Sunday. I'm sorry.
01:21:00
Speaker
You got Nonsense. It's called Nonsense. Open Door Challenge Saturday night. My God. Occasionally, Cash's Corner, but I don't think they're doing it right now. Yes. Until I hear anything, I just kind of... yeah. Sorry, Cash.
01:21:16
Speaker
I know you're... He started school recently, so I know he's got all that shit going on. Yeah. And that's... that's that.
01:21:27
Speaker
On that note, see you Thursday. See you Thursday. Not you Wednesday, but them. See you guys on Wednesday. I'll see you on Thursday. All right. On that, good morning, and hope you eat eggs.
01:21:50
Speaker
Nonsensical network, different flavor every day. Movie talks, new flips, hitting the display. Microphone magic, musicians spill the praise. From reptiles to motorsports, burning rubber craze. Football crashes, touchdowns, epic plays.
01:22:04
Speaker
New spinning, catching on the latest phase. Gleaming cars, engines roaring up the pace. Street tales, word and start.
01:22:42
Speaker
Nonsense, but the vibe's just right. Tune in, tune in, wait for that beat.