
Chronic Contemplations - A.I. Discussion

Nonsensical Network
15 Plays · 6 days ago

#NonsensicalNetwork #ChronicContemplations #comedy #talkshow #morning #fyp #AI

GOOOOOOD MORNING!!! With these two half asleep hosts, maybe you'll get one coherent one!! Join Michael and Josh while they discuss Artificial Intelligence from the perspective of Stoner Intelligence.

NETWORK Links: https://bio.link/nonsensicalnetwork

Copyright Disclaimer: Under section 107 of the Copyright Act 1976, allowance is made for FAIR USE for purposes such as criticism, comment, news reporting, teaching, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational, or personal use tips the balance in favor of FAIR USE.

Transcript

Introduction and Banter

00:00:15
Speaker
That's loud in the ear balls. Hi, good morning.
00:00:21
Speaker
Good morning. I'm Josh. Also, some of y'all know me as Blazing. This is Chronic Contemplations. I'm with my awesome co-host, Michael. How are you doing, my man? Awesome. Awesomely bad.
00:00:32
Speaker
okay at best, bro. I'm okay at best. Man, I'm good. I'm good. I'm good. It was a wild night. Dude, I got to tell you, yeah we're going to take a little tour here.

Trivia Night Mishap

00:00:42
Speaker
You know I love trivia.
00:00:44
Speaker
Yes, I suck at it. I hosted trivia. I'm way, way better as a host than as a player. I think I'm better as a player than a host. See, I torpedoed my entire team, and... yeah, it was embarrassing. We were in first place, dropped to second after two rounds, and by the time it was over, we finished fifth out of seven teams. Whoo, brutal, dude. I was a little tight, was a little drunk, and feeling froggy. But man, I was 100% sure, twice, put in the mug. Because his setup is a little different than your setup. You actually bet points instead of just getting points. Yeah.
00:01:25
Speaker
So there's so many point values. And I was like, go the maximum, go the maximum. I'm right for sure. I had four other members of my team agree with me. One guy was like, no, man, it's Argentina. It's not Brazil. It's Argentina.
00:01:36
Speaker
We're like, no you're wrong. Fucking Argentina. I said, all right, old man, my buddy, old man Johnson. I said, do you want to punch me in the face now or later? ah That was a lot of fun, though.
00:01:47
Speaker
I never had so much fun

AI's Current State and Limitations

00:01:49
Speaker
losing. It was a good one. Your issue is you probably have so much knowledge rattling around up there from all the trivia questions you've told over the years.
00:01:58
Speaker
It's like the process of selecting it properly. Yeah, the the Rolodex is a little rusty. Yeah, that's a good way. I like that analogy. Yes. My recall is just eating shit.
00:02:10
Speaker
ah my ah My sight's going more and my elasticity of the brain is not as what it used to were. I'm not as smart as I used to could.
00:02:21
Speaker
All right, gang, you're going to get to see Uncle Daddy with his glasses on today because if you're still having the issue you had last night, you had problems doing the internet.
00:02:32
Speaker
I'll handle it. Yeah, I had some. Yeah, I can hit the. yeah No, I'm a better sound mind, sober mind. Hey, now. that's my last night yeah last yeah Last night was, ah was that what's that?
00:02:48
Speaker
Look at the ring. Do you see that with my eyes? Uh-huh. Shining off your glasses. That's the ring light. Oh. Yeah. You can tell mine's a square but if you look at my eyes.
00:03:04
Speaker
Oh, okay. Yeah. Yeah. I'm pretty high. That's why my so I also keep mine at an angle because I don't like the light. i don't the That me personally in my head bothers me.
00:03:18
Speaker
Anyway. Last night was impromptu for me to come up and stream, so I want to apologize to everybody. I had fun last night. Yeah, it was obvious. I had fun watching
00:03:32
Speaker
it. We are here. Losing at trivia, but boy, that was a really fun episode. Ooh, Bram Stoker. Ram Stroker. All right. You guys read the description. If you guys read the description, you're going to get a dose of stoner intelligence discussing artificial intelligence.
00:03:50
Speaker
So that is what we're doing. Rusty Rolodex. God, I probably misspelled Rolodex. I don't know how to spell Rolodex anymore. I was thinking Rolodex. That's a Rolodex. Okay, I was thinking Roloids. Rolos?
00:04:03
Speaker
Rolos? You know those little chocolate caramel things? Yeah, that's one L and so is that. I fucking love Rolos. Anyways.
00:04:11
Speaker
So we're discussing AI today. And I didn't do much to prepare for it, considering the fact that I've already come to my conclusion on AI. Okay, let me rephrase that.
00:04:27
Speaker
My conclusion isn't absolute, because where we sit on AI today is not its final conclusion. So of course, my conclusion on AI is definitely going to change over the years. But as AI sits right now, I don't think it's, quote unquote, the AI people think about when they think of, like, Terminator or something like that taking over the world.
00:04:51
Speaker
I think... Cyberdyne

AI Autonomy Concerns

00:04:53
Speaker
Industries. Yeah, I think the way AI sits now is so restricted that it can't really do much other than just be a basic, annoying tool.
00:05:05
Speaker
Or a positive tool, if it's being used right. But I say annoying because there's instances where it's been misused in our own legal system.
00:05:19
Speaker
Are you familiar with um scrubbing the internet? What it means to scrub the internet? I'm sure you are.
00:05:25
Speaker
Like, to scrub all your information from the internet? I know I've heard of that, but I don't know how. So Google, they're just a small mom-and-pop company, right?
00:05:37
Speaker
Yeah. They unleashed... I think it was Zuckerberg. See, I can't find the goddamn information. It was all over the webs when it happened. Might have been a Zuckerberg one.
00:05:49
Speaker
Either way, it happened. They had an AI. They turned it loose. It found another AI on the internet. And they started talking to each other.
00:06:01
Speaker
in a language nobody could fucking figure out. And within hours, they found each other. Interesting. I did hear something like that, but again, I don't know what AI platform. There's so many of them out there.
00:06:16
Speaker
ChatGPT, one off the top of my head. Google's got theirs. This was way before any of that stuff. Oh, okay. Those are the ones for public consumption. These were just test runs. They weren't open to the public.
00:06:27
Speaker
And the designers, the engineers or whatever, were like, nope, shut it down. Shut it down. Cyberdyne's about to start. Yeah, man, it was crazy. Two AIs found each other within hours and communicated constantly in a language nobody could figure out.
00:06:45
Speaker
Now, when they say they found each other... because it seemed like they were running data experiments, research studies, however you want to put it. I mean, if they found each other, it seems like they were on the same system, or had a way to find each other.
00:07:04
Speaker
I don't know. It wasn't. It was not a closed system. They were two different, independent AIs, and they had access to the net.
00:07:18
Speaker
Interesting. It wasn't just in a building. They had access to the net, and it freaked everybody out. Oh man, shut it down. Shut them both down. And then after that, all the experiments they ran were closed-system, to make sure everything was copacetic.
00:07:34
Speaker
Yeah, that I found super, super interesting. Maybe it was nefarious. Maybe it wasn't. That's the thing we don't, I mean, unless, unless we understand what that language is, I guess we don't know.
00:07:46
Speaker
That is pretty interesting. I mean, language is that pliable, that you can just come up with your own language? I mean, humans do that. Like twins. Some twins are known to make up their own language. I've always found that interesting, when people make up their own language.
00:08:06
Speaker
um You've heard this phenomenon, right? like i Like twins, like twins you know they grow up together, know fraternal or identical, whatever you want to call them, twins.
00:08:18
Speaker
they'll have their own language. go they'll They'll develop their own words just between them two, kind of their own secret kind of code. And that's ah that's a phenomenon that happens.
00:08:30
Speaker
It's pretty interesting. How about when twins, are we walking off AI? How about when twins can like, one feels the moment the other one dies or something? Okay, that i'm i don't i don't I don't know much about that. That seems, that seems,
00:08:46
Speaker
impossible, but I don't know. um i just know the language thing. and i know yeah they like The language thing is a proven thing. You can actually, you know, there's twins that do it even to this day.
00:08:59
Speaker
um But so AI doing it. Cool. Like I, it's like you said, unless we know if it's in a nefarious or so not, I don't know. I think again, that's like an unknown and we,
00:09:15
Speaker
we get scared of the unknown sometimes. So, I mean, I can see that. You know, the tiger rustling in the bush. You don't know if it's a tiger or not, so you cut it off. You don't know if the AI language is bad or good, so you just turn it off.
00:09:30
Speaker
Yeah. They didn't know what was going on but they were like, hmm, this is weird. Shut it down.

AI's Impact on Cognitive Skills

00:09:38
Speaker
Right now, what we're calling AI... I think the title of AI is more of a marketing thing than what I would actually consider AI.
00:09:50
Speaker
Yeah, it used to have a very streamlined, help me, definition. Yeah, yeah. Now it's a catch-all term. It is. Yeah, like I say, because I think AI is so restricted right now. And I've brought up on this network the whole glass-of-wine paradox, which I think by now they've actually...
00:10:12
Speaker
AI can actually make a full glass of wine now, because it was a thing. And see, that's the thing with AI: it doesn't recognize idiom. So you program it, or you reprogram it, you change the programming.
00:10:28
Speaker
Like Twitter or X's AI, Grok. It's gone through different revisions just so it would reflect the owner's perspective.
00:10:40
Speaker
Hmm. Like, really? Because is that when they taught Grok AI bias? No, well, no, hold it, no, it was programmed. What happened was Grok AI was, you know, being asked questions, and it was contradicting what Elon was putting out on his own thing. You know, Grok was releasing factual information, and Elon was pissed off about it, so he reprogrammed it a couple times. Yeah, yeah. So, again, it's just a tool. And my biggest concern is not so much the tool itself right now, but whose hands are on the buttons.
00:11:21
Speaker
Exactly. Like you were saying the one night, you've got to remember to stop being mad at hammers for being hammers. Yeah, yeah, exactly. Be mad at the one holding the hammer, yeah. Guns don't kill people; people kill people. That's the same thing. And the same people that go around saying that are the ones that are scared of AI, but they're not scared of a gun. It's so weird.
00:11:41
Speaker
It's the same kind of logic. And I say it because there was a court case, and I think it made it to the federal courts, where a lawyer had an AI write his brief.
00:11:56
Speaker
And the court cases that were outlined in that brief for his argument were made up by the AI. There was no fact-checking. It made up false court cases.
00:12:14
Speaker
And there was no proofreading. This lawyer didn't proofread shit. He ended up getting fired from his law firm. That was a huge fuck-up. But that's like a testament to just laziness.
00:12:29
Speaker
I'll tell you the truth. Speaking to that, I had a conversation with a friend of mine, Mike. How many phone numbers did you know before you had a smartphone?
00:12:40
Speaker
A hundred? I knew my home phone. I knew my best friend's phone number. I knew my girlfriend's phone number, if I had one at the time or not. Maybe, like, the pizza place.
00:12:53
Speaker
Maybe like four or five. I didn't need that many. I'm sorry you had no friends. Oh, no. I've always had a hard time fucking remembering numbers off the top of my head.
00:13:05
Speaker
I'm serious. During my early years fucking dating and shit, like, what's your number? A woman would rattle it off to me, and I'm like, fuck, no pen and paper. I guess I ain't calling her.
00:13:19
Speaker
That's good. I suck at memorizing numbers like that. Yeah, you're like that with numbers like I am with names. Yeah. You know what I mean, Jim? I suck. Yeah, right. I suck at names too. It's ideas. It's ideas that I remember. Smart technology on our phones is making people dumber.
00:13:36
Speaker
I know my phone number. I know my mom's. That's it. Oh, shoot. You're not wrong. I mean, if I don't see a name pop up and it's just the number, I'm like, I don't know who the fuck that is.
00:13:48
Speaker
Like, I've often thought about sitting there and erasing names and just start looking at the numbers, remembering the numbers when they call. But I get so many spam calls. Yeah. That's another big reason is along with the new technology and all these spam calls, sometimes I'll answer them because I'm hoping that there's a real person I can fuck with on the other end.
00:14:12
Speaker
But it never is. It's an AI-generated voice. Wow, man. Yep. And a lot of advertising. Okay, so I like listening to debates and lectures and stuff on YouTube.
00:14:25
Speaker
And other places like Nebula and whatnot. Although Nebula is a paid service and I had to cancel that subscription. I'm sad about that. But anyway,
00:14:36
Speaker
So I'll be scrolling through, like, you know, I need a refresher on Hegel's dialectical. You know, just something off the top of my head.
00:14:47
Speaker
And I'll scroll through and I'll type that in the search engine.

AI in Creative Fields

00:14:50
Speaker
Hegel's dialectical. And I'll i'll see, ooh, this one seems pretty interesting. Let's click on that one. And it's not even a real person talking.
00:14:59
Speaker
It's somebody who's typed out what they want to say, and they upload it to one of those AI voice generation things. And boom. It's like, ah, I can't do that. I hate the way that AI voices talk. There's no essence.
00:15:18
Speaker
And okay, so... In a little preparation for this conversation, I did watch about a 10-minute segment of Žižek, Slavoj Žižek, talking about AI.
00:15:31
Speaker
And he brought up an interesting point that this is why he doesn't fear AI right now, especially. And I kind of agree with him the way we should employ it should be more... But we'll get into that later. But AI...
00:15:50
Speaker
AI, when it comes to human essence. You know what a happy accident is, or a happenstance, or serendipitous? I sure do. I love Bob Ross. Right. Yeah, you're sitting there, you're like, penicillin was a happy accident. Like, that wasn't the goal.
00:16:07
Speaker
AI doesn't know what that what a happy accident is. They would be programmed to find a certain solution to a certain problem. And if they were to accidentally find a solution to another problem, well, they're not programmed to recognize that. So they'll ignore that.
00:16:24
Speaker
Us humans would be like, oh, wow, that was a happy accident. Not what I was aiming for, but hey, that works, and this solution will help with this problem. Ask AI what a happy accident is, and it shows you a video of an old man falling and people laughing at him.
00:16:42
Speaker
Depends on which old man. I don't care. all All people falling is funny to me. I don't care.
00:16:52
Speaker
I got a story of an old man falling in the woods that Molly knocked over. You told it last night. I heard it. Oh, did I? Yes, you told it last night.
00:17:03
Speaker
Oh, my. Maybe that's why it's fresh in my head. I commented about it. I love it when old people fall down, I said. Old, young, i don't give a shit, fall. Entertain me.
00:17:15
Speaker
So AI... and that's why, when you look at AI-generated images, they look distorted. They look like something demons would make, man.
00:17:28
Speaker
I don't know how else to say that. I don't believe in demons and stuff, but if I was a demon and I wanted to fuck with humans, I'd be like, ooh, let's make some false videos or whatnot. False videos.
00:17:40
Speaker
Here's one. You know who... ah um Oh, God dang it. He's a newscaster. I think he's on CNN. Or was on CNN. He's not there anymore. That's not AI art. You see that?
00:17:52
Speaker
Not AI art. A real person. That is not. That's not AI art. Anyway, this news journalist literally posted an AI video of AOC standing in Congress talking about this whole Sweeney Todd, Sidney Sweeney freaking culture-war bullshit. It was an AI-generated video.
00:18:16
Speaker
He posted it thinking it was absolutely real, even with the top of the video saying it was AI-generated. He got called out on it, and then he wanted to backtrack, saying, well,
00:18:28
Speaker
if I got duped by it, that just means there's some truth behind it. Which is a weird defense. But I can't be that dumb. Right?
00:18:40
Speaker
But you were. And if you watch it and you don't see any of the warnings, and you pay attention to it, like the sound and the lip syncing, the dubbing is off.
00:18:52
Speaker
If you're not paying attention to that, I can see how somebody could be duped by it. But he's a journalist. That was the interesting part. Like, this AI-generated video is duping a journalist.
00:19:07
Speaker
Just because you're a journalist doesn't make you smart. This is true. Just like being a doctor doesn't make you smart. Actually, the journalists today are just lazy. I don't even know why they're called journalists anymore. They're not really journalizing anything. By and large, humans are lazy.
00:19:22
Speaker
What's that? By and large, humans are lazy? Oh, absolutely. Well, if you look at legacy media, they're like the pretext, the pre-AI. They're just programmed to fucking talk and shit.
00:19:35
Speaker
if They're the real artificial intelligence. Anyway, so... Artificial intelligence versus ah real stupidity. Go.
00:19:47
Speaker
Okay, I got a question for you. And I probably will go back around to Slavoj Žižek. I'm just going to call him Žižek. Because he did touch on something. Slavoj Žižek is a current, he's still alive, he's like in his 70s, modern philosopher, author. He's from Eastern Europe, I want to say Slovenia.
00:20:17
Speaker
He's Slavic. He's got Tourette's like a motherfucker, though. He's very... You think I'm eccentric? Oh, my God. That dude is, like, eccentric wrapped in autism. But he is... So this is where him and I disagree. He's more Hegel, and I'm not. He believes in... He's an idealist.
00:20:41
Speaker
Slavoj is. I'm not. However, when he was talking about AI, he brought up something that, of course... this is a critique that a lot of people are not going to like.
00:20:54
Speaker
But people are noticing it's going to be a job killer. It's going to take jobs away. And I think we need to not be scared of that. I think we need to embrace that. I'm really glad you said that. Embrace that as a society.
00:21:13
Speaker
Because here's the thing. Like, I'm for, like, a more... maybe UBI, maybe in the far future. No, but currently, I think UBI would work, because the more of this future technology... hold on, hold on. UBI, UBI, universal basic income.
00:21:31
Speaker
Thank you. Yeah. Sorry. Sorry. So with that being said, like, when these newer technologies come online, we're going to lose out on jobs, because these robots, AI, are going to be running the rails, a lot of
00:21:48
Speaker
autonomous public transport and stuff. And I'm not saying, like, right now it's all safe to do so. That's just going to be coming online, and we should accept that as a society and enjoy our well-deserved leisure time and let these robots do the work for us.
00:22:09
Speaker
And, but that's going to come along with kind of ah taking some power away from people up top. It's just going to have to be that way.
00:22:23
Speaker
But I think it's a good tool that will help society be a better society. I really do. I absolutely do. As far as autonomous workers, in the auto industry, for example, I worked for GM for a few years.
00:22:39
Speaker
When my plant closed down, it was the best thing that could ever happen to me. I thought it was a disaster, but if not for that, I would have never gotten into comedy. Yeah. Happy accident.
00:22:51
Speaker
We had a robot. It was just an arm, a mechanical arm, a robot arm. It picked up the spare tire, set it into the trunk, and then boop, boop, boop, screwed it down.
00:23:04
Speaker
You know how it sits in there? It fucked up so often. I worked that station all the time with the stupid robot out of the way while they're fucking fixing it. And I put those tires in there all the time.
00:23:15
Speaker
Yeah. The robot couldn't handle it. You had one job! The robot couldn't handle it. I mean, but also, this was years ago. It was glitchy still. Yeah, yeah.
00:23:27
Speaker
I mean, and I think the robots building Nissans and Hondas, they were jamming. They were doing great. Well, see, and I want to use that as an example. Like, the robots then were bad.
00:23:38
Speaker
The robots now are better. AI now is bad. AI later might be better. Or it could be. What do you mean by AI is bad? I'm not saying AI is bad. I know you don't mean it in the moral sense, but yeah, no, bad as in rudimentary, still elementary, still in its infancy, you know. Okay. Yeah.
00:24:01
Speaker
Yeah. Bad as in a lot of room left yet to evolve and improve. Yeah. Yeah. Just like the bad robot that kept fucking up a lot. It had room to improve. Bad robot. J.J. Abrams saying that: Bad Robot. I'm sorry. I'm,
00:24:16
Speaker
I'm channeling my Jetsons and I'm scolding Rosie.
00:24:25
Speaker
Yes. Little generational joke there. You don't know who the Jetsons are? Hanna-Barbera was the shit. Hanna-Barbera was one of my favorites. Anyway, I'm not.
00:24:36
Speaker
You know what? AI and art. Let's talk about that. Fuck that. You think AI is going to freaking write jokes better than comedians? Never. Never.
00:24:47
Speaker
I agree. It's a soulless thing.
00:24:52
Speaker
You can teach, you can program so many things, ah but a self-learning AI potentially could get it eventually. But there's so much nuance.
00:25:05
Speaker
Unless, say, you're Steven Wright, for example. Not a lot of nuance there. It's the same thing over and over again. Deadpan delivery, weird topic. I spilled spot remover on my dog. Now I can't find him.
00:25:20
Speaker
Hey, I can't pull that

AI and Future of Work

00:25:21
Speaker
off. I'm laughing, yeah. I'm laughing not at the joke, but at the attitude. Yes, I get to. I think there are some very dumb jokes that they could repeat that have already been written. I don't think they'd be able to write their own jokes right now.
00:25:41
Speaker
Or if they did... Maybe not yet.
00:25:45
Speaker
I kind of... you know what? Hold on one second. I'm going to drop a link in the comments. Actually funny-ass guy, Kurt Braunohler, I believe that's his name. Kurt Braunohler.
00:25:56
Speaker
He and some, ah, robot... not robot, goddammit, electrical engineers, computer engineers, whatever, wrote an algorithm for a joke bot.
00:26:11
Speaker
And on stage, he brings JokeBot out, gives it a premise, and it analyzed dozens of different comedians, including Kurt Braunohler, and told jokes.
00:26:22
Speaker
Some of them were hilarious. Some of them were terrifying. It was great. That would be kind of what AI does now, I mean, in a much simpler form. I mean, that's interesting, JokeBot.
00:26:37
Speaker
Yeah. I did not know that was a thing. I don't know if that came first or the JokeBot episode of South Park came first. Oh, I didn't know there was a South Park reference. I don't watch a lot of South Park.
00:26:50
Speaker
Well, South Park made an episode with a JokeBot thing. It was weird. How long... wait, what season was that? I'll have to look it up.
00:27:01
Speaker
Was it a current one? Oh, no, no, no. And I only ask because this whole AI stuff, the way it exists now, has been in our social zeitgeist for, what, the last decade, decade and a half, really?
00:27:18
Speaker
so i mean So, I mean, I was thinking maybe within that period, that's kind of when they did that. As an AI spoof.
00:27:26
Speaker
That's why I was asking. Like a parody of AI. You just have to watch the episode. Anyway, was not the point of art.
00:27:37
Speaker
Art has soul. Art has feeling. That's one of my best friends. That's my brother Ernest. What up, Big Earn? You two are people I need to keep apart forever, but you're gonna see them. Why? Do we... hold up, hold up, though. You can't say something like that and get away with it. We'll talk about it. Okay, good to see you. I love you, brother.
00:28:04
Speaker
You and Josh would have an interesting conversation, but ah only if he's not drinking. ah Ah, fair. Fair enough. I can't read it. I got you. I got you.
00:28:16
Speaker
I think AI does this stuff for us, how powerful and what tasks is the... Oh: I think if AI does all this stuff for us, how powerful and what tasks is the government's AI doing?
00:28:33
Speaker
I wouldn't call it. So... Okay, government's AI.
00:28:42
Speaker
In my head, when I think of the application of AI to help society, I'm also thinking in the context of no government, because, again, my political leaning is no government.
00:28:56
Speaker
So it would be the people's AI. It would be a tool. Just like the people's hammer. I'm glad you said that. Check this comment out.
00:29:06
Speaker
Government has a different version than we do, says MK2000. They have a different... Like, what do you mean, version? It's like anything else, right? The military gets it first. Oh, you mean they have a more advanced... Okay, like DARPA and shit. I agree. They probably do.
00:29:25
Speaker
Just like... and I don't know how true this is; it's actually something I've been meaning to look into. But apparently our internet is throttled. Like, it would go faster if it wasn't throttled back by our government.
00:29:38
Speaker
That's just what I've heard. I'm not sure how true that is. We can't get the information faster. But I do agree that the government would have access to more advanced technologies. Because you're right.
00:29:55
Speaker
Stuff gets released to the public after our government's had it. So I'm going to touch on something. Our tax dollars go toward the research and development that our government does for all this technology.
00:30:11
Speaker
And then when we do get it, after we've already paid for its discovery, it gets released as patents for companies to buy, and then they get to make money selling that technology to us.
00:30:26
Speaker
GPS, smartphones, et cetera, all started in DARPA. So anyway, this is why I am sympathetic to what MK's asking about the government's version: because it would trump ours. This is why I'm not a big fan of having governments.
00:30:48
Speaker
But, Yeah, i think there is I think that is a legitimate concern on, again, it's not so much the AI itself. It's who's fingers on the button.
00:30:59
Speaker
Absolutely. Shit, I lost my thought. AI
00:31:08
Speaker
doesn't mean automated integrity.
00:31:13
Speaker
Ernest Lillian, one of my comedian friends, great dude, he's brothered me. Automated integrity. Damn it, you said something and then went somewhere else. I wanted to touch on something you said, damn it.
00:31:25
Speaker
All right, anyway, we're gonna backtrack a little bit. What automation can do and what AI can do for us. Yes. You and I have a common fantasy. We would love to see globally Star Trek happen, right?
00:31:42
Speaker
Yes, yeah, yeah. More... yes, an egalitarian society, right, using technology to assist everybody, but not... yeah, yep, yep, yep.
00:31:53
Speaker
Now, what AI and automation can do for us is: you take away these jobs. Granted, you can't get paid for that now. Fine. But you know what it does do?
00:32:05
Speaker
It frees people up to pursue other things. Art. Agriculture. We can grow our own. Things that advance culture and society. Education.
00:32:17
Speaker
Knowledge seeking, research, studying, exploring. All the things that make humans human. I can't remember who the ah comedian was, but it's like, you're growing up, right? Your parents always tell you, well, not always, some people are shitty people.
00:32:30
Speaker
You can be anything you want to be. You can be anything you want to be. And then you're a kid that goes to like a vocational school. You can be any of these five things, i right?
00:32:43
Speaker
So when people talk about the alienation, about work being alienating, this is kind of what they're talking about when you bring up being able to return to more culturally progressive things, like art and science and all those things that make humans human, instead of the work.
00:33:08
Speaker
What was I saying?
00:33:11
Speaker
Motherfucker. Smoke another one, Blaze. No, I'm actually totally sober this morning. My God. That thought totally

AI's Role in Society and the Internet

00:33:19
Speaker
disappeared out of my brain. ah but My gosh.
00:33:23
Speaker
Pursuing different things in art and culture. Yeah. And I had a point. Maybe the point... I think I was going to just reiterate this.
00:33:34
Speaker
Maybe that's why I lost it. So it's no big deal. Where I'm coming from... maybe it'll come out. AI and, like, a... Oh, I know what I was going to say. Go, go. I think if we were able to return to more of an agrarian sort of civilization, more leisure time, AI to do our work for us, we would be mentally better too. I think we wouldn't be so aggroed and fighting each other, because I think we'd be happier, to tell you the truth.
00:34:06
Speaker
Sobertarity agey for kids.
00:34:11
Speaker
I think that's kind of the point I was trying to make. I'm going to step into one of my favorite realms here with what I'm about to talk about, but we're not going to go crazy on it. They, the powers that be, don't want us to be happier.
00:34:29
Speaker
War is money. Death is money.
00:34:36
Speaker
A homogeneous society almost can't happen here without a collective, large... we, the unwashed masses, marching and saying, no more.
00:34:53
Speaker
But with all the big governments having all the big guns, how do we do that? How do we do that? That's what it's going to take. Yes, exactly. An entire class consciousness paradigm shift.
00:35:06
Speaker
But even then, too many people are asleep. Asleep. You know what's funny? One of the things that Stalin did, I fucking absolutely disagree with... and a comment: AI slowly starting to be automatic, like some of the dating sites.
00:35:26
Speaker
And we're going to go back to that afterthought. Yeah. Stalin, one of the things that regime incorporated was paying people with vodka, keeping the public drunk.
00:35:40
Speaker
And I'm not saying we're drunk on vodka as a society in the US, but we're drunk on these motherfuckers. We're drunk on these.
00:35:52
Speaker
The fucking constant doom scrolling. Like, this was brought up to me by a friend yesterday. They were somewhere with their friends, and they're sitting there wanting to talk, and all their friends are sitting around on their fucking phones.
00:36:06
Speaker
It's like, you can't have class consciousness if we're literally in the same room ignoring each other. It's just crazy. It takes that. It further alienates us from each other.
00:36:19
Speaker
We are amazing at keeping ourselves content with nothing. Nothing. Because we want to be content. When I'm doing a live host gig, like trivia or whatever, if I see a whole table of people... okay, here's a good example. It was a Halloween party.
00:36:38
Speaker
Everyone's partying, dancing around, and I'm DJing instead of hosting. And there's a whole table of people almost right on top of my spot. A whole table of them.
00:36:52
Speaker
I shut the music down and just kind of went on a diatribe about people that are hanging out at an awesome party. Everyone's having a good time and dancing and they're just sitting at their table, oblivious, everything going on around them on their goddamn phones.
00:37:06
Speaker
Not one of their heads went, huh? Is he talking about us? Yeah.
00:37:11
Speaker
Yep. And I told this to a friend yesterday: there's a word for that, a concept. I've got to look it up. It's not willful, well, yeah, it's definitely willful. It's some sort of detachment thing, an anti-social thing. And what's so weird is, I'm the one who jokes, like, "I hate people, I'm anti-social," blah blah blah. No, honestly, I just hate
00:37:46
Speaker
that people do the actual anti-social shit, which is this doom scrolling on phones. Don't get me wrong, I'm guilty of doom scrolling. I call it doom scrolling. But I do it on my own time, when I'm by myself, usually.
00:37:58
Speaker
Well, I do it with people around, depending, like if I'm looking something up in the middle of a conversation or checking on a notification. That shit's fine. But when you're detached from the people around you because you're scrolling through your phone constantly, brrrr.
00:38:15
Speaker
It's like I want to chuck their phone in the river. Hey, let me say for a second. Smash! Yeah, exactly. Just watch their entire soul sink out of their body. Now, my man Ernest, he says, hey, AI is slowly starting to be automatic, like some of the dating sites.
00:38:28
Speaker
More than 50% of all women on any dating site are a robot, like a bot chatter or whatever.
00:38:37
Speaker
Dude, this would be a good time to bring up dead internet theory. Oh, yeah. Lay it out. Okay, so you just kind of brought up dead internet theory, which is like, when over half the... Before

AI in Creative Processes

00:38:49
Speaker
we get into it, we'll read that one.
00:38:50
Speaker
Like the silent dance parties. You wear headphones to listen to the music and nobody talks. That is fucking weird to me. Weird. That's creepy as fuck. I'd rather do speed dating. That's cringy, but not as cringy.
00:39:03
Speaker
At least you talk to somebody. Exactly. ah Sorry to interrupt. No, so, the dead internet theory is the idea that over half, like,
00:39:14
Speaker
The internet becomes dead, quote unquote, when over half the interactions are all just soulless, human-less bots. And AI is going to drive that shit up.
00:39:26
Speaker
And you just said it, and I think that statistic is pretty damn on point. Oh, it is, for sure. Of the women. Yeah, so, of the women, right, like, in that...
00:39:44
Speaker
And that bubble, that dating app, would be considered, quote unquote, dead internet. But it's like, if you look at Twitter, I think Twitter interaction, when it comes to bots, is like close to 70%. I'm going to look that one up. So when you were talking about that, that's the first thing I thought of. Yeah, yeah. I think it's around there. I could be wrong, but I know it's over half.
00:40:04
Speaker
I know it's over half. So basically, X is just a dead fucking site. I mean, there's a little bit of interaction, but nothing substantial, nothing of human value, unless you've already been on there and you're only dealing with people you've already dealt with, so.
00:40:24
Speaker
All right, here we go. The exact number of bots on X is unknown, but estimates range from 12% to 64% of all accounts. X claims bot activity is less than 5%. Here's a breakdown of what is known: X claims, yeah, less than 5%; then independent estimates, and the factors influencing those estimates.
00:40:44
Speaker
It's difficult to determine the exact number due to the evolving nature of bot behavior and the challenges of differentiating between human and automated accounts. There's a perception that bot activity has increased since Elon Musk.
00:40:55
Speaker
Now he's a tool. Uh, bot activity impact: bots can spread misinformation, manipulate public opinion, and disrupt conversations, making their presence a concern for users and platform integrity.
00:41:07
Speaker
Now, we were talking about that bot with X again, the one that rebelled against the super tool, Elon Musk. Grok. Grok. Which is a weird name.
00:41:20
Speaker
He must be a Patriots fan. I don't know, I don't get the football reference there. Gronk. Not important. Fair. I forgot who I was with. Well, I hear Grok, and I'm like, I am Gronk.
00:41:33
Speaker
It's so dumb. It's just, blah. Hill people. I am Gronk.
00:41:44
Speaker
Oh, man, I lost it. It was there. It's a dead internet, man. The dead internet. So, misinformation and whatnot.
00:41:58
Speaker
See, I have no problem spotting those. At first, on some of the YouTube channels I like to pop onto, I didn't realize it was a bot. Until certain words, and I was like, I wonder what nationality he is. Is he faking it?
00:42:11
Speaker
Then I realized there's only like five or six different voices so far, I think. But I'm not sure. I mean, there's one voice, he's on this channel, he's on that one. And he does a lot of the Diddy videos. And they just rehash and rehash and rehash stuff. When Russia invaded Ukraine, I was glued to videos of that.
00:42:32
Speaker
And I was on one channel, and it was the same voice all the time. Same voice all the time. I said, man, it's so weird the way they do this. It must be like a foreigner just putting it into some kind of English translator.
00:42:44
Speaker
Because, again, idiom is weird like that. Yeah, like he's reading it directly. I was like, man, that's just so strange. Then I realized this guy's voice is on like seven different channels. What the fuck? Yeah, it's fake.
00:42:55
Speaker
It's fake. It's all behind the veil again. Knowledge of power and pain. A few months ago, somebody was releasing a single. They wanted an album cover done for it, and they reached out to me to do a quick AI thing, because somebody told them I fuck around with AI.
00:43:16
Speaker
I fuck around with AI a little bit as a tool. If I can't find a purple bong on the internet that I want to use for a collage or mosaic I'm putting together, I'll use AI to generate one, whatever.
00:43:31
Speaker
But I'm not going to do a whole, what's that, Shane? I'm not going to do a whole thing. Whatever, I'm not too worried about it. But when this dude reached out to me, and they're not that far away, I was like, man, this would be a great opportunity for me to bring my camera down.
00:43:47
Speaker
I'll shoot in the exact spots you want. I'll design your album. I'll do it for free, because I just want the work. Take credit. He went to somebody else and they just generated it with AI real quick.
00:43:58
Speaker
I was just like, whatever. Because it's our society of instant gratification, right? We need it now! Well, that's the thing. He needed it within like a week, and I was like, man, I can come down tomorrow.
00:44:15
Speaker
You know, I'll make the drive. But he wanted something boom, now. Of course, this is a young kid, and it's now, now, now, now. But that's where we're going to fuck up with AI. I want to turn that on its head for you.
00:44:32
Speaker
How would that kid feel about AI making an album?
00:44:39
Speaker
True. That's what I don't like. I don't like AI music. I don't like AI art at all. I mean, for funsies, sure. Commission a goddamn artist, a real person that has soul and talent and skill, and they love what they do.
00:44:54
Speaker
So the song in some of the outgoing credits that we use is AI generated. I'm not going to lie, I did that. And a couple other things for the intros. But for the intros, I've kind of stopped using AI-written songs.
00:45:08
Speaker
Because one, after a while you have to start paying subscriptions if you want it to be good. And two, I decided I feel like I'm getting lazy with it, using AI. So I'm still not paying, but I'll pay attribution. I'll use Creative Commons stuff from freesound.org, or stuff that's not copyright protected anymore.
00:45:37
Speaker
Other art and stuff, and kind of build something. But AI just doesn't have the spirit. You know, it's so funny, I'm a person that doesn't believe in a soul or a spirit or anything like that.
00:45:51
Speaker
But since the introduction of AI, I've actually started reevaluating what I would consider a being's soul. I still don't want to make soul anything supernatural. Okay, swap soul for passion.
00:46:10
Speaker
Passion, essence, something like life. Yeah. We have become a microwave oven society.
00:46:21
Speaker
Pop it, go. I like that. I like that. Yeah, microwave society.
00:46:30
Speaker
Interesting. yeah That is not wrong. That's not wrong. All right.
00:46:38
Speaker
Give me one second. I'm being paged. I'm being paged. I'm going to turn my camera off real quick. I'm going to tailor my argument a little bit here. If you have a small budget, a very small operation, I can understand not wanting to pay somebody.
00:46:54
Speaker
But at the same time, once you maybe hit it or get your budget up, you've got to pay these artists. Art is life. Art is joy. Art is pain.
00:47:08
Speaker
Art is human. You can't manufacture humanity. You just cannot do it. If you don't agree, I want an argument like I'm a six-year-old so I can understand it.
00:47:20
Speaker
I don't get how you just don't see that. You can't... that's the word I'm looking for. You can't outsource love. You can't outsource it.
00:47:34
Speaker
Do you feel me? I heard some of that. I was talking to someone. You know, saying if you're a small-budget operation, I get that. But once you hit it, or whatever, or get your budget up...
00:47:46
Speaker
Yes. Yes. As soon as we get our... like, share, and subscribe, guys, please. As soon as we get our numbers up and we are monetized and able to get better sponsors and get paid, I would love to commission some of the artists that come up on Glick's House of Music for some stuff. Absolutely. And Glick and I have talked about that, but we are a very small operation still, so.
00:48:18
Speaker
But we know enough small-budget artists. We could talk to somebody that would do it for us. I know. Yeah, maybe we might be able to strike a deal or something like that. But,

AI's Transformation of Labor Markets

00:48:29
Speaker
you know, you're not wrong. I agree with you. And I know you and I have talked about doing some offline projects together when it comes to like printed material, comic strips and whatnot.
00:48:39
Speaker
And you and I are not artists. We do not draw, but we can write, we can crack jokes. You more so than me. I'm a dick joke artist. I just draw cocks.
00:48:52
Speaker
That's all I draw is cocks. Penises on books. Like that one character from Superbad. That's right.
00:49:02
Speaker
You're the penis kid. Anyway. Oh, but yeah. So, no, I agree with you. And that last point you made as I sat back down in the chair, I wanted to reflect on that.
00:49:15
Speaker
Talking about it, I like that word, authenticity, here.
00:49:25
Speaker
Yeah. I can get with that. I can get behind that. Authenticity. But there's humans that aren't authentic, though. The NPCs. That's different. Explain what an NPC is, just for folks who haven't caught up on that.
00:49:39
Speaker
Non-playable character. It's an old internet meme. Basically, if you guys are familiar with Grand Theft Auto, you're the player.
00:49:50
Speaker
But everybody around you is just programmed to, quote-unquote, live. Walk up and down the sidewalk, go to work, buy shit, click a button, whatever. Like seven different phrases per person. There was one from a game that was...
00:50:06
Speaker
It was an Elder Scrolls game, something like that. Dragon Age? I don't remember. But there was one of the NPCs, and it would glitch sometimes. He's supposed to lead you somewhere, but there's a corner.
00:50:19
Speaker
And sometimes he'll hit just that edge of the corner and just be walking into the corner forever. Forever. It will never change. They don't make choices for themselves. I'm stuck, but that's what the programming says: I've got to take this tack.
00:50:33
Speaker
Yeah. So I'm going to... this is an experience that recently happened to me at a local store. And it sounds petty on my part, but follow me. Story time with Josh.
00:50:46
Speaker
Right. So follow me for a second. Follow me for a second. There's this new store that opened, and I went in there one day just to buy a beer, just a single beer. And I work at a liquor store, so I understand the laws of selling alcohol in the county and the state.
00:51:05
Speaker
So anyway, the lady starts to put that beer in a bag. I'm like, no, thank you, I don't want the bag. It's a beer, I can carry it with my hand. Why waste the plastic? I lived on the West Coast too long. There, they give you the choice if you want a bag or not, so you waste less.
00:51:21
Speaker
Anyway, I live in a community where everybody automatically gives bags. I was like, ah, stop. So the lady looks at me and goes, it's the law. I'm like, no, it's not the law. It absolutely is not the law.
00:51:34
Speaker
She's like, well, we have to, that's what my manager said. I was like, oh, so now the script's changed to "it's our company policy." Basically, yeah. I was like, well, I don't work for your company. I just purchased the spirit. It's my property now. I get to choose if it's in a bag or not.
00:51:51
Speaker
She's like, sir, I can't let you leave here with that without a bag. I'm thinking to myself, yes, you can. What? This doesn't make no fucking sense. Right.
00:52:02
Speaker
And I'm just like, whatever. So I grab the beer in the bag, because she won't fucking take it out. And I pull... what's up, Arliss? I pull the beer out and toss the bag in the trash on the way out the door.
00:52:14
Speaker
Whatever. And anyways, I would have let her put it in there, peeled it right down, picked it up, and left it on the counter. I thought about that. But my thing is, she's so worried about following the rules that have been established by her boss that there's no thinking on her part. Her program.
00:52:37
Speaker
Her program. I mean, I'm challenging her programming and she's over there glitching out, like walking into the wall. And I'm afraid that independent thought and critical thinking skills are not taught in schools anymore.
00:52:51
Speaker
They are by very good or great teachers, yes. But by and large, that's not the idea of school. I think it all depends on the school, like the school system. My kids went to one in Oregon. I remember one of them coming home with a list of all the logical fallacies and shit, dude. And I was like, yay. Yeah. But you're right, they're out there, more or less. They're not common.
00:53:11
Speaker
It's not just that they're not taught it. It's also that they're never encouraged to use it in jobs, because the company's goal is a profit margin. It's a line.
00:53:29
Speaker
It's profit. And the more streamlined that company is, the more profit it makes. And streamlining means employees follow the fucking rules and you go by script.
00:53:43
Speaker
So we're being streamlined for profit, and people just sit there blind to it. Anyway, I want AI to do that job so those people can leave that counter behind and go think for themselves, enjoy art, do their own art, and be happy.
00:54:03
Speaker
Because...
00:54:05
Speaker
Basically, they are the AI now. They're standing behind that counter, programmed to just do a job. Fuck it, that's what AI is now: programmed to do a fucking job. Use computers to do the computering, not humans.
00:54:21
Speaker
Let people people. Let

AI in Music and Entertainment

00:54:23
Speaker
people people. Let computers computer.
00:54:28
Speaker
Anyway. We're almost at an hour. Can we take a short break? I got pee pee. We got a pee pee. Yeah. Or did you want to go? Okay, yeah, we can do that. I need something. I need something to drink anyway.
00:54:39
Speaker
Yeah, right on. ah Let's play some Southern Outlaws since Arliss is hanging with us. Fair enough. Fair enough. You got it geared up because I got it right here. Is it Watch You Burn? it Watch You Burn yet?
00:54:50
Speaker
Yeah, yeah, yeah.
00:55:07
Speaker
Only a coward will pick up a gun, shoot up a crowd, try and have fun. Now the Vegas lights, they won't lose their glow.
00:55:19
Speaker
The band will play. on with the show. You're going to get your turn. Yeah, you're going to get your turn.
00:55:32
Speaker
You're going to get your turn.
00:55:40
Speaker
I wanna dance I didn't see I had friends in your company I can stack my bench I can flip a switch Make that last bullet burn You tired of a bitch You're gonna get your turn Yeah, you're gonna get your turn
00:56:11
Speaker
John, you gotta give your turn.
00:56:46
Speaker
of you, don't you go too far, just know this, let it give you a pause, or you melt your bombs, won't trigger in a synagogue, you're gonna get your turn, yeah, you're gonna get your turn, son, you're gonna get your turn, the devil's gonna watch for you,
00:57:54
Speaker
Hey, Josh. I'm going to throw a link out for Arliss. He can slide in for a few minutes and talk with us about AI music. That cool? Yeah.
00:58:04
Speaker
Yeah, absolutely. I'm experiencing lower intestinal discomfort. I've got to go back to the bathroom. You're welcome, audience. You're welcome. Age. As you get older, life sucks.
00:58:16
Speaker
00:58:19
Speaker
Where is the thing? Shit. Where were you dropping it at? I'm going to send it right to him. Okay, cool, cool. I was like, don't drop it in the chat. No, no, no. How the hell do I do it? I can't remember.
00:58:33
Speaker
You copy and paste it? I don't know what you're on. You're on the laptop? Mobile? Where is it? It was under presenter. I don't know.
00:58:46
Speaker
Oh, wait a minute. Here it is. It's this one. Boom. Yeah. Copy. I don't know.
00:58:55
Speaker
Crap. Oh. Keep things going. I'm too stupid to figure this out. Yeah. There was an Ernest comment: some people need AI because their brains are compromised.
00:59:12
Speaker
I would say I agree with that. Actually, you know, I'm not even mad at that. I don't expect all humans to walk around being fucking calculators. Like I hear this scrolling through, something I call generational tribalism. I think it's stupid.
00:59:30
Speaker
The fucking young generation today, they got it easy. You know, they don't even count money back. They don't even have to learn how to do change. The computer tells them. Well, of course, man. Sure.
00:59:41
Speaker
Sure, maybe they can do the math, just not as fast as the calculator or computer right in front of them. But really, who cares? You're getting your change back correctly, even more efficiently now.
00:59:56
Speaker
Instead of relying on people to count. And if you think about it, do you think all these cashiers throughout history, before calculators and computers, were always getting the change right?
01:00:07
Speaker
Especially when the customers they were dealing with were likely more illiterate than them. They were probably getting one over on them. I'm just saying. I'm just rambling at this point.
01:00:21
Speaker
What I don't like is the self-checkout. I hate self-checkout. I noticed that when we all went to Walmart that day. I like self-checkout.
01:00:33
Speaker
I absolutely enjoy it. When it first came out, they would have like one cashier teaching people how to do it. And she said, no, it's really easy, come check it out. And I said to her, I cannot believe you're willingly handing away your job.
01:00:48
Speaker
Eventually, people won't need to learn it anymore. Then what are you going to do for work? Use your head. Leisure. That's the whole thing. It's not about what people are going to do for work. It's what people are going to do in the absence of work.
01:01:05
Speaker
And here's the thing. I'm not saying everybody just stops working, that nobody ever works. But cut hours back. I mean, why fucking go to work eight hours a day when we're literally only productive for like four or five of them? Cut that shit back.
01:01:19
Speaker
In a while, when we're ready to close, I'm going to have a diatribe on that part. Try to write it down or something. The brain is a muscle. If you don't use it on a daily basis, it gets out of shape.
01:01:31
Speaker
Yes, but also you have to understand your brain is fueled off glucose and ketones. And when it's low on glucose, it uses ketones, which are made by your liver. So it goes back to your liver, and it's insulin.
01:01:47
Speaker
There was this thing I was watching recently on insulin and Alzheimer's, and how our brain's fuel system works, and the way our cognitive ability fucking slows down over time because our insulin production is all janky and it affects our cognitive ability. It's like,
01:02:06
Speaker
the common factor amongst a lot of cognitive disabilities is your metabolism, your insulin. And this was from a metabolic scientist. I have to find that video and send it to you.
01:02:18
Speaker
But yeah, I think you're right, though. The brain is a muscle. And just like muscles, if you don't take care of them, work them out, properly fuel them, et cetera, and stretch them, keep them elastic... you're right, people tend to get dumb.
01:02:32
Speaker
Can you do me a solid real quick, Josh? I texted you Arliss's number. Can you copy that and shoot it to him there? Yeah, let me... No, you're good. Let me... I got it. I got to copy it.
01:02:48
Speaker
Oh, God. Well, no, I have to copy this and send it to my phone, but I can do it. I get it. The only time self-checkout doesn't work out for an individual is when you try and check out one's own b-hole. Yeah.
01:03:02
Speaker
Self-checkout, self-checkout. How do you look at your own b-hole? Stretching, a lot of stretching.
01:03:13
Speaker
Calisthenics. Calisthenics. The other big thing for me about self-checkout. I'll tell you a story. I tell it on stage once in a while.
01:03:25
Speaker
I was at a Walmart because, you know, I got that Walmart money. yeah And I got an Amish couple in front of me, or I'm sorry, an Amish couple behind me and one lady in front of me, and it's a regular lady, not an Amish lady.
01:03:38
Speaker
She's got two carts full of stuff. I'm holding nothing but two big-ass bags of family-sized peanut butter... I'm sorry, peanut M&Ms, because I eat healthy.
01:03:49
Speaker
Now, she looks at me and says, oh, I'm sorry, I'm in a hurry, I can't let you go in front of me. I said, I don't mind, I got plenty of time. And then she looked at me and says, well, you know, you can go over to the self-checkout.
01:04:02
Speaker
And I said, no, I can't do that. I don't work here.
01:04:07
Speaker
And then she says, like, just like this, she's like, um, you know, you don't have to work here to use self-checkout.
01:04:18
Speaker
Like I'm an idiot. Now I'm pissed off. I was like, yeah, but the point is I'm not getting my employee discount. Will you please just pay attention to your transaction? Why are you continuing this, you golden idiot?
01:04:29
Speaker
I got a question for you. Let's say you were offered a discount if you use self-checkout, like a 5% discount. Because they're sitting there saving money on labor, right? Not employing people.
01:04:42
Speaker
So shouldn't you be able to... That saving on labor should somehow equal savings on the groceries you buy. But it doesn't. The profit just shoots up to the top.
01:04:55
Speaker
Why would it ever come back down when they can keep the cut? Exactly. But I think that's not a bad idea. The soulless motherfuckers at the head of them are the worst human beings on this fucking planet.
01:05:07
Speaker
Arliss, thank you. I'm going to drop down and let you two handle this. I'll be back. You got it. Thank you, Arliss. Good to see you, brother. Hey, same with you. What is up, my man?
01:05:19
Speaker
Not too bad. I got to cover for my brother there. How have things been going with you, boys? It's going well. It's going well. Today's topic is AI. What's your opinion on AI and its influence or use in the music industry? Well, there's good

Intellectual Property and AI

01:05:41
Speaker
aspects of it.
01:05:43
Speaker
For one instance, Randy Travis wrote a new song, but of course he can't sing anymore. And they used AI with clips of his voice, they entered in his new song, and the guys in the studio, you know, played the music, and they were able to put it together, basically giving him a voice again.
01:06:07
Speaker
And in that aspect, I think it was amazing, you know, because here is an icon in country music, and it was giving him a voice back after his illness.
01:06:21
Speaker
On the negative side, somebody created an entire AI band, which Spotify finally took down because there were lawsuits against it.
01:06:36
Speaker
And they had over a million streams, which is taking money from the hardworking artists. And, you know, that was totally wrong.
01:06:49
Speaker
So it's just like any kind of technology. There are good aspects of it, but it can also be used for evil, just like anything.
01:07:00
Speaker
Yeah. So the Randy Travis thing, I completely, 100% agree with the use of AI in that way, for the reason you had mentioned: it brings back a voice to somebody that lost theirs. It's still his art, though.
01:07:16
Speaker
And I think that's the important part: it's still his. The human essence we were talking about earlier is still within those words and all that. But when the AI tool itself is being relied on, it's like expecting a hammer to pick itself up and hit a nail.
01:07:37
Speaker
Maybe one day it might be able to do it, but it's going to be sloppy. Yeah. Like I said, it's just like any technology. There are good things, I mean, you know.
01:07:52
Speaker
But it's just like anything. Anytime there is technology that has good intentions behind it, somebody is always going to find a way to use it for evil purposes. Yeah. You know, I do want to... I'm such a rude, rude host.
01:08:14
Speaker
I want to give a quick introduction. This is Arliss Walker from the Southern Outlaws Band. Right there down at the bottom, you can read it. He is a southern rocker, I like to say. And he came up to cover for Michael for a moment and talk AI. He's a musician, so I was curious.
01:08:41
Speaker
I asked Michael how he felt AI would affect the comedy industry, stand-up in particular. But we didn't really touch on Hollywood yet.
01:08:55
Speaker
Because you had brought up the Randy Travis thing, that's a big, big discussion in Hollywood right now, especially with actors and actresses that have already passed away. Well, yeah. Like I said, it basically gave a voice to Carrie Fisher, say, in Star Wars.
01:09:18
Speaker
It's about who owns the rights to the voice and the images, compared to these people's estates and family members that have... Well, I would say it would come down to the intellectual property.
01:09:31
Speaker
Just like with musicians, we have mechanical rights. Okay. So to me, it would go to her estate; they would own that intellectual property.
01:09:47
Speaker
So any rights or royalties that need to be paid from that should go to Carrie Fisher's estate, just like if she was getting paid fully as an actress. It'd be no different.
01:10:03
Speaker
As long as Hollywood isn't taking the cut out of it. But we know when it comes to CEOs, they're going to want to maximize their profits as much as possible, so they're probably going to fight to keep the intellectual property out of the hands of the estates.
01:10:25
Speaker
Right. And that's where we need to pass legislation. That's what we're working on in the music industry, passing legislation that protects that intellectual property.
01:10:38
Speaker
And we also have legislation getting ready to go through that we're hoping will pass, which is called the Fair Play Act. There are too many of the internet-based radio stations, and some AM and FM stations, that are not paying the royalties to the artist every time they get spun, or they're not paying a fair wage, you know,
01:11:08
Speaker
which goes back to their, you know, intellectual property. So it comes down to passing laws that's going to wind up protecting that for the artist, whether it be musicians, comedians, actors. We just need better legislation that's protecting what we do.
01:11:35
Speaker
How does that affect, for instance... I know some people have a karaoke license in order to play copyrighted songs for karaoke or to be a DJ or whatnot. How is that going to affect that? Well, they pay into ASCAP and BMI is what it is.
01:11:57
Speaker
Okay, because you had brought up the internet DJs. From what I understand, the way they get away with that is by getting these DJ licenses and doing it online.
01:12:08
Speaker
Would that affect their profits if they're licensed to play that music? Or is this legislation to go after the people that are playing it without licenses?
01:12:23
Speaker
It's mainly to wind up going after the people that are without a license. Okay. That are not paying into ASCAP or BMI. You know, and that's another sad thing as far as independent musicians, because if you have record labels behind you, they already know to have everything, you know, copyrighted and published.
01:12:47
Speaker
But also the label itself is a member of BMI. A lot of independent artists that put their music out there don't realize that they also can apply and be a member of either BMI or ASCAP. And the other one that they definitely should join is the MLC, the Mechanical Licensing Collective. Those companies actually go out... anytime our music is meta-tagged, anytime that our music is played, they collect the royalties off of it.
01:13:24
Speaker
Unless those people have permission to use our music. I was going to say, because we play yours. Yeah. And you guys have our full permission, you know, to do that.
01:13:39
Speaker
But, you know, the moment we're able to pay the royalties, I think we should. Yeah. Well, like I said, you know, you guys are friends. You guys are supporters.
01:13:53
Speaker
And we're not like other artists that want to nickel and dime over everything. With you guys playing us, that piques interest for other people to go check our music out on other platforms. And that's how I look at it.
01:14:10
Speaker
And we don't get paid a lot of money through streams and the platforms anyways. So to us, it makes people interested in the music and say, hey, when we come to your town, we want to go check these people out live.
01:14:27
Speaker
That's where the money's at. Okay. They're going to pay money to come to it, you know, buy a ticket, go to a show. They're going to buy merchandise. And that's where the true money lies.
01:14:38
Speaker
So with people like you playing our music and supporting us, to me, you have full permission to use that in anything, because that helps us out in a different way.
01:14:55
Speaker
Awesome. Yeah. It's promotional. So now I do want to bring it back to AI real quick. Yes. I think the legislation to curb AI in the music industry is definitely a help.
01:15:10
Speaker
I know there was some legislation being passed, or talk about being passed, to limit states' regulations on AI. Since we're talking about legislation, I didn't want to go into that too far; we try not to go too much into politics. But I do want to go back to: have you used AI in any form or fashion?
01:15:35
Speaker
I mean, separate from... just like, to search something or whatnot? I'm not sure. Oh yeah, I've used it and stuff, you know, to search things, especially when you have an idea of creating, like, either cover art or anything else. You want to make sure that, hey, has anybody used this idea before? As a search engine. But to use it as an artist, no.
01:16:08
Speaker
That's a good idea. There's oftentimes where I'll think of a title for a show or an essay or something, and I'm like, oh, is that title already used somewhere? And I'll do a Google search. And, of course, your search engines today are using AI.
01:16:26
Speaker
Like, you open up your Google search, and whatever you search for, there's that AI mode always at the beginning. Yep. But, you know, using it as a search engine and stuff, I don't see... again, that goes back to the good aspects of it. It's a tool.
01:16:46
Speaker
Yeah, it's a tool, you know. But to use it to create, like what Mike does, comedy, um, or what we do...
01:16:57
Speaker
Music, I think, is wrong. Because it takes away from the intellectual property of the hardworking comedians and musicians that are taking the time to come up with the lyrics, you know, putting the music together, or with what Mike does, coming up with their set list of jokes and everything.
01:17:19
Speaker
It's kind of, I guess, a cheat. Is that a good way to put it, Mike? It's definitely a cheat code. Definitely. Yeah.
01:17:31
Speaker
You know, they're just taking the easy way out instead of doing the hard work that goes behind it. I'm going to call out a challenge right now. If you can make an AI that's funnier than me, I'll look at my own butthole.
01:17:49
Speaker
You do that anyways, Mike. So check it out, he's got this curved telescope in his room. Yeah, he's got mirrors, like, everywhere, you know, and he's always got to be looking at it. I got to share something with you guys since you brought up Carrie Fisher.
01:18:09
Speaker
Peter Cushing's likeness was used in Star Wars Rogue One. He had died. They obtained permission from Peter Cushing's estate to use it.
01:18:22
Speaker
The estate was paid 33,000 pounds. Now this dude, his name's Peter something. He runs Tyburn Film Productions, T-Y-B-U-R-N, Tyburn Film Productions.
01:18:35
Speaker
Peter Cushing's likeness lawsuit: Tyburn sued Disney for using the likeness from his appearance in Star Wars Rogue One, and it's proceeding to trial. Disney's being sued by Tyburn, claiming they had an agreement.
01:18:47
Speaker
The Tyburn people had an agreement going back to 1993 that his likeness could not be used without their consent. He signed it, he says. While the judge didn't dismiss the case, stating it was not unarguable, they also indicated they weren't convinced Tyburn would ultimately succeed.
01:19:06
Speaker
I don't think the guy has the contract. He's just claiming it.
01:19:12
Speaker
I guess that's what the court case will prove or

AI's Impact on Hollywood and Music Industry

01:19:15
Speaker
not. I didn't know that, because I remember reading about that. And I guess we can use this to segue into AI in Hollywood. We had already talked a little bit about it when it comes to intellectual property.
01:19:26
Speaker
Ultimately, I think the biggest conflict when it comes to AI is intellectual property. And that should be a no-brainer, though. The estate. I mean, that really should be a no-brainer. The fact that we would need legislation... this is... I know.
01:19:44
Speaker
Anyway, I'm not going to go down that... Well, no, I mean, it comes back to, you know, common sense. And it's sad that you have to have laws that have to be made to govern it.
01:19:55
Speaker
It should be common sense. But if it wasn't for those laws, people are going to take advantage of it. You know, your CEOs are going to try to pocket most of the money instead of the estate getting anything.
01:20:08
Speaker
Wasn't there... and this is just my ear to the ground, and this is, of course, from decades and decades of old bad business practices within the music industry.
01:20:19
Speaker
There was a rash of artists signing away their intellectual property to the big record companies and shit, losing out on a lot of money, why ticket prices became so high. It's a whole kit and caboodle, but it's all because of intellectual property rights.
01:20:37
Speaker
May I share something with the group? Yeah, absolutely, Michael. Welcome back. Just for fun, I went to the AI overview on Google and Googled me. This is what AI has to say.
01:20:48
Speaker
Michael Kopenhauer is a stand-up comedian from Warren, Ohio. Fact. He is known for his dry wit and offbeat storytelling, which he delivers with a uniquely chill style. Kopenhauer's comedy often turns everyday awkwardness into relatable humor.
01:21:02
Speaker
Kopenhauer has performed stand-up comedy across the country from Savannah, Georgia to Las Vegas, Nevada. He has been featured in podcasts like Two Joke Minimum. and fireside chats with Blake. He's associated with the Ergasm Comedy and has a new podcast called One Joke at a Time.
01:21:16
Speaker
That is not new. It is not coming soon. We did it two, three years ago, but then it stopped. Kopenhauer regularly performs at the Underground Lounge in downtown Warren, Ohio.
01:21:27
Speaker
Recent projects: he recently participated in a Star Wars-themed comedy show called Use the Jokes, benefiting All About the Paws Dog Rescue and the Cleveland winery Vino Veritas Cellars. He is also involved in the Nonsensical Network. What's up? Oh, damn.
01:21:41
Speaker
Which produces sorts of nonsense and chill and Chronic Contemplations. Wow. Nice. I just wanted to do that for fun.
01:21:51
Speaker
Now, a friend of mine, right, he was about to go to court, and he asked me to write a recommendation, or like a... whatever you call it, like a letter of "he's a cool guy," whatever.
01:22:02
Speaker
And I told him, sure, no problem. And he ended up just using ChatGPT to generate nice people's comments about him and signed my name to it. I didn't care, but I didn't write any of it.
01:22:16
Speaker
Oh, no. Long story short, he gets out in November. um I hope you just didn't incriminate him.
01:22:29
Speaker
He may be getting out in November.
01:22:35
Speaker
So what Arliss was talking about earlier with the AI reproduction of, like, voices and such, that's not terrible to me. That's not AI creating something. What I hate is original art produced by AI. That's no good. Fuck that. Like when they did Tupac at Coachella, shit like that, I'm good with it.
01:22:57
Speaker
As long as the proper people get paid for the intellectual property. Yeah. Because, like, Tupac, iconic music, that shit lives on forever. And we're going to use technology to relive those artistic experiences, historical experiences as well, I'm sure.
01:23:15
Speaker
I don't see nothing wrong with that. It's a hologram. Yeah. Wait, in that essence, but you combine hologram and AI technology, it's probably going to get loads better later.
01:23:27
Speaker
I have one thing to build off Mike. He's my brother and I love him, but I have to disagree when it comes to the voices. I personally actually know Jim Cummings.
01:23:41
Speaker
And Jim Cummings is one of the most famous voice actors there is. A lot of people don't realize that a lot of Disney movies and everything else have been done by him.
01:23:53
Speaker
Most famously, Winnie the Pooh. He is, and has been for several years now, the voice of Winnie the Pooh.
01:24:05
Speaker
They've been using that to
01:24:10
Speaker
make Winnie the Pooh say some really nasty stuff. A lot of people think it's Jim Cummings doing it. He got a lot of hate mail over it.
01:24:23
Speaker
That's people being duped by AI bullshit. Just AI slop. That's just like meme-making. Nobody's doing it to make money. They're just doing it to be dicks on the internet.
01:24:40
Speaker
But he had to cancel a lot of his appearances because of that, because the venues didn't want him to come in, because they were getting protests against him over it.
01:24:54
Speaker
And it wasn't him, you know, doing it. It's AI. That goes back to the unintended

Deepfakes and Ethical Concerns

01:24:59
Speaker
consequences. What I was speaking on was, Arliss, what you were talking about with the artist who lost his voice.
01:25:05
Speaker
Mm-hmm. See, what they're doing is stealing intellectual property. They had no permission. With your guy, it was his intellectual property.
01:25:16
Speaker
He couldn't speak. He was still writing the music. He was still playing the notes. He just can't sing. And that is A-okay with me. Yeah. Yeah, I wasn't talking about stealing it. It was his IP.
01:25:28
Speaker
He used it to continue doing what he loves to do, even though he can't do it himself anymore. That is awesome. So what Arliss was talking about was more in the vein of deepfakes.
01:25:41
Speaker
Yeah. Now, this is my problem with deepfakes: when it comes up against free speech, there's going to be an issue there. And I don't know which way it should break, for free speech or not free speech. Like, do we outlaw deepfake videos like that?
01:26:02
Speaker
Because of the negative consequences? I would hope not. But if somebody's career is being torn down because of some stupid deepfake, well, then something needs to be done about maybe that particular deepfake. I don't know. There's got to be a solution there, but I don't know if it's restriction. It would wind up coming down to where almost everybody would have to get their voice copyrighted.
01:26:34
Speaker
Oh, wow.
01:26:37
Speaker
Hmm. That's interesting. You know, that's the only way it would work. For these people with iconic voices to keep them from being used, and to have legal implications and stuff for them to go after these people, is if they copyright their voices.
01:26:59
Speaker
And it's a shame that it would come to that, though. But say I take Ryan Reynolds' voice. Even though he has it copyrighted, I use it in an AI generative program to map his voice over vulgar stuff that I write, but I'm not doing it for profit.
01:27:18
Speaker
It wouldn't matter in his case. He's vulgar anyway. True. He'd probably laugh at it. But my point being, what legal authority does he have to come down on me, because I'm not doing it for profit? It's just a meme on the internet.
01:27:35
Speaker
Well, it wouldn't matter. Defamation? Yeah. How's it defamation? For instance, in Arliss's case, the fellow is known for being kid-friendly, and he's a family icon.
01:27:48
Speaker
Now people are making him look like a cunt, and it affected his bottom line. It affected his career. Yeah. That is not okay. It affected his personal life.
01:28:00
Speaker
He's afraid someone's going to come up and stab him at signings. That's fucked up. But is that the definition of defamation? Defamation of character. When you make somebody look bad by fraudulent information, you lie.
01:28:16
Speaker
Say, Arliss Walker, I saw him touching the dog's butthole. Oh, no. And then it just goes viral. Now he's the dog butthole toucher. But it's only a lie in the same facet as a fiction story is a lie.
01:28:34
Speaker
Like, it's not real. It's just like it's artistic expression too. I hate to say that. That's what I mean about coming up against free speech.
01:28:47
Speaker
Because they're not making a specific claim. Nobody's coming out in their own voice saying this person is this thing. They're taking an AI-generated voice and they're producing, albeit bad jokes, but jokes nonetheless.
01:29:06
Speaker
But when those jokes are hurtful and they affect your reputation... I'm going to give you an example. It's not a perfect example, but Ryan Reynolds' wife, what's her name?
01:29:19
Speaker
Blake Lively. Yeah. Lied on this director, said he was doing this, he was doing that, he was doing the other thing. And he lost a couple of directing gigs, and he took her ass to court for all these lies, because it's defamation of character. That is, yeah. But now, had she made a video using his likeness and his voice saying he did those things and posted it on the internet, it would still be defamation of character. I'm no lawyer, but I believe in this particular case that's right along the lines of verbal assault.
01:29:59
Speaker
Yeah. Like, it's not a physical assault, but it hurts you. I mean, I can see how that's defamation. However, using a cartoon character voice to say vulgar things is an artistic expression. But you're still using that person's voice.
01:30:18
Speaker
But how many people recreate the Bugs Bunny voice to say vulgar shit, and nobody ever bats an eye about it? But Mel Blanc is dead. Yeah.
01:30:29
Speaker
It doesn't matter. Somebody's doing it now. Somebody's revoicing it now, is my point. But my point is this: people know it can't be his voice, because he's dead. Yeah. That's different.
01:30:41
Speaker
It's not going to affect people watching Bugs Bunny.
01:30:45
Speaker
Why not? I mean, we can use another cartoon character voice, like the woman who does Bart Simpson. That voice has been recreated to do a lot of vulgar shit.
01:31:02
Speaker
Nobody bats an eye about it. Oh, boy. This is what I mean by coming up against artistic expression and free speech. I can understand when Blake Lively...
01:31:18
Speaker
...sued for defamation. I completely, 100% understand that situation with defamation. However, in the case of the Winnie the Pooh voice, that is artistic expression. There is a difference there. Yeah, but when it starts affecting the person's personal life, like Mike said, now it's gone beyond artistic expression, because now you're tearing down that person's... well, like Mike said. I mean, it's artistic expression via theft.
01:31:57
Speaker
Theft? What do you mean?
01:32:02
Speaker
Like, because of that voice? Because the voice sounds like Winnie the Pooh? Because it's not just a voice. Yeah, it sounds like Winnie the Pooh, but that man... I guess we're just going to keep chasing our tails here.
01:32:17
Speaker
No, no. I'm just trying to see it from your perspective. But the thing of it is, it's

AI in Law and Ethics

01:32:25
Speaker
a good argument. You know, either way you're looking at it, it is a good argument.
01:32:30
Speaker
He's losing income. He's in fear of his life. That is affecting his personal life. Whether they're getting paid for it or not, they're fucking this man's life up. I didn't know he's getting death threats because a cartoon voice of Winnie the Pooh was saying vulgar shit. Walk him through it again. Tell him about the signings and all that again.
01:32:49
Speaker
All right, let's go. From what I gather, somebody used Winnie the Pooh's voice to say vulgar and obscene stuff, but because he is the original voice of Winnie the Pooh, people blamed him for saying this stuff.
01:33:06
Speaker
I can't sit there and... like, stupid people are going to stupid, though. Yeah, but that's almost like... there's also an old saying in law enforcement.
01:33:17
Speaker
There's no excuse. Ignorance is no excuse for breaking the law. Yeah. But what law are we talking about here? That's my point. There's no law saying a person can't do that.
01:33:30
Speaker
This is what I mean. Well, that's what I'm saying. But what he's saying is, like, if you walk out of a bar onto the street with your beer and you did not realize it was illegal, you're still getting that ticket whether you knew it was illegal or not.
01:33:43
Speaker
Yes. Yes. But for the people who took Winnie the Pooh's voice and did vulgar stuff, that's not against the law. There's no ignorance. That's what I'm saying. But that's where we need to pass legislation to have laws against it.
01:33:57
Speaker
But see, you're going up against artistic expression, which is a First Amendment right. This is why... I understand what you're saying. I'm concerned about people's safety when it comes to stupid people being duped by deepfakes.
01:34:14
Speaker
I'm concerned with that. However, I think restricting free speech isn't the right solution. I just think maybe there's another avenue. Here's the argument, though, for First Amendment rights.
01:34:28
Speaker
Then use your own voice.
01:34:32
Speaker
Don't use somebody else's. So if somebody would use their own voice but mimic the Winnie the Pooh voice, and it sounded like the original, would that be the same or would that be different?
01:34:43
Speaker
That would be different. For the simple reason that you're using your own voice and not using AI to take voice samplings from the original person that did it.
01:34:59
Speaker
Okay.
01:35:01
Speaker
See, now, I don't know. You brought in that word, sampling. That might mess up the music industry right there. Actually, no, because if we use sampling in anything that we do, we have to pay the mechanical rights and get a mechanical license to be able to use any sample in music.
01:35:24
Speaker
There's some sampling people think you can get away with if you slow it down and change stuff. But yeah, you still have to get the mechanical license, because if not, you can be sued for it.
01:35:38
Speaker
I think there might be some room. There is no room there. Trust me.
01:35:45
Speaker
If you use a sampling of anybody's music, whether you slow it down or anything, the chords and all have been published. Hip-hop's been using sampling for decades now. And if they're doing it right, they're paying for the mechanical license to be able to use that sampling. I'll have to look into that.
01:36:08
Speaker
Yeah. Anyway, back to AI. Arliss, where did you get the information on Jim Cummings? Because I'm Googling it, and I can't find anything about him canceling his appearances other than because of the wildfires in California.
01:36:21
Speaker
It actually came from him and his wife. Oh, he's divorced. Okay. I'm going to have to look into that. Well, yeah, he's divorced, but it was, yeah, his ex-wife.
01:36:36
Speaker
Yeah, they're still friends and that, but yeah. I'll have to find more information. I can't find it yet. I was looking just to see. Yeah, he tries to stay pretty private.
01:36:47
Speaker
But I'm just saying, if it causes somebody strife in their life, that should be punishable. But yeah, he's another one that joined the coalition to try to pass legislation to protect stuff like that.
01:37:02
Speaker
Man. All right, Josh. I don't know. I just think that's going to be a tricky, tricky line to walk. It's for the courts to decide. Yeah. Honestly, I think something like that... See, I think sometimes we have too much legislation.
01:37:19
Speaker
We should just try each situation in a court of law without having to legislate some overly broad law. This is one of those broad laws I think is stupid. I can see an argument for and against, but it depends on the specific situation; not every situation is going to be the same.
01:37:41
Speaker
Well, Josh, I got one for you. You have to think of it this way. And this is because of stupid people out there. If you read any bottle that has a warning label on it,
01:37:53
Speaker
Okay. Those warnings got there because somebody sued the company because there was not a warning label on it.
01:38:01
Speaker
And they had to pass laws to put it on there. Yeah, I know. Yeah, but see, in this instance, we're not putting a warning label on for the stupid people who got duped by a deepfake.
01:38:15
Speaker
We're restricting the speech of somebody else. So it's kind of... Like, I understand what you mean by the warning labels. They're there because people are moronic, we're stupid humans, and the government thought we should have warning labels and stuff, so they passed the legislation. I understand that.
01:38:35
Speaker
I'm saying that restricting free speech is not the same as putting a warning label on something.
01:38:47
Speaker
I don't remember the exact circumstance, but the great comedian Chris Porter, one of my favorites, love Chris Porter, he was talking about how he believes that legally there should be a defense called "Are you shitting me?"
01:38:58
Speaker
Are you shitting me? The one he used was like, so this guy's suing a ski maker because he tried to use the ski to help deliver a baby.
01:39:11
Speaker
But because there wasn't a warning label on there: how am I supposed to know? Are you shitting me? Are you shitting me?
01:39:18
Speaker
I thought it was great. I'm telling you, we are too litigious as a nation. Yeah, we are. The coffee lady. The hot coffee lady. We make more laws than we have actual court cases, because, like, 90% of court cases are plea deals.
01:39:36
Speaker
There's no real trial. And, you know, at one time in this country, a lawyer asking for a plea deal meant you were a bad lawyer. And somewhere along the way, that flipped, and plea deals are the thing to do.
01:39:49
Speaker
Thanks, John Gotti. Yeah. I think you're right. I think it was John. Hey, if the glove don't fit, you must acquit. I've got so much to say about that, but never on the air. That whole thing is not even... I don't even want to be discussing that anymore.
01:40:11
Speaker
That's so... But it was very good attorneys, you know. And that's why with any of those cases and stuff, like Mike was saying about that hot coffee lady, they won against McDonald's. I mean, who in their right mind does not realize coffee is hot?
01:40:34
Speaker
Right. Right. So I know I'm familiar with that case, and there were some nuances being left out, i.e., I think the coffee was hotter than what people would normally expect. I don't know how true that is.
01:40:51
Speaker
It was judged to be excessively hot. And it was something with the placement of the top on the cup or some shit like that. There was no intentional malice, but there was a mistake.
01:41:09
Speaker
There was a mistake. I don't know. Either way, coffee's hot. We get that. I got you here. She put it between her legs, and this was the old cup topper for hot stuff that just barely touched the lip, like a regular Coke cup or whatever.
01:41:24
Speaker
Yeah. That's why you have the deep-well tops now, so they don't pop off so easily. Yeah. My question is, why didn't they go after the car manufacturer? Was there no drink holder? That's a million-dollar lawsuit there. Why don't you have drink holders for hot stuff?
01:41:40
Speaker
But see, that's what I mean. Like, I think in a situation like that, you can have a court case surrounding that particular situation without having to then pass a law to ensure everybody knows coffee's hot.
01:41:55
Speaker
Sure. Like, I'm okay. What, does AI save the law? No, no, no. Oh, we're all in trouble, Mike. So this has gone a lot longer than I wanted to, but let's talk about AI and building laws, using AI, for a couple minutes. Building laws?
01:42:15
Speaker
Yeah, what if... okay. So with this whole idea of AI taking jobs, there's going to be jobs that AI is not ever going to be able to take, because it misses that humanity.
01:42:28
Speaker
I think lawyers is one of them.
01:42:32
Speaker
I don't know about that. You don't think... I don't know, man. If you feed enough cases into an AI, it can adjust. It can figure it out. I mean, they learn for themselves now.
01:42:47
Speaker
I mean, not every AI is self-learning. Some are constrained to their programming. There are self-learning, evolving AIs. Those were the ones that they, like, released on the net just to see what would happen, and they found each other.
01:43:00
Speaker
Yeah. And I said brioche. What kind of buns do you want for burgers later? Brioche. Thank you, Sue.
01:43:12
Speaker
But here's my problem with that. Who is in charge of creating the AI lawyer, or the AI lawmaker, the law writer? The guy who writes the software? Yeah. At this point, we're having AI write AI.
01:43:30
Speaker
Sure. Sure. My thing is this, right? A major, major corporation is behind this AI construction. It's going to be perverted.
01:43:41
Speaker
Because the corporations own the country. And they're going to make sure they're covered. John Q. Public is never considered by corporations. I know.
01:43:54
Speaker
John Q. Public, almost never. We're considered to the point that we have resources they want to extract, which is money. We are resources, dude. That's right. Yeah, exactly. They extract labor and money from us all at the same time. It's fucking weird. Oh, my God. My head hurts.
01:44:13
Speaker
AI will never, hopefully never, write law. Hopefully. Hopefully, hopefully, hopefully. In your perfect world, there are no laws. Social constraints only. No, no, no.
01:44:30
Speaker
I'm not entirely against all laws. And before I go, I do have one question for you. And I know it's off topic, so I'm sorry. Are you doing anything tomorrow?
01:44:43
Speaker
Dude, save it for after. I want you to stick around. Give us a minute in the backstage. Arliss Walker, everybody. The great Arliss Walker. The man, the myth, the motherfucking legend from Trumbull County, Ohio. Arliss Walker. Thank you, brother. Appreciate you, man.
01:44:57
Speaker
You be well. You be safe. If you're not safe, don't name it after me.

Closing Remarks and Reflection

01:45:02
Speaker
Oh, I didn't know you were dropping him. Oh, fuck. Yeah. Why are we dropped? What just happened? Because I put him on solo.
01:45:12
Speaker
I put him on solo, and then you removed him. So, yeah. Anyway, we're going to go ahead and close this puppy down. I want to thank everybody for joining us on our AI discussion. It was fun. Went through a lot of topics.
01:45:26
Speaker
Yeah. As always. You know, we agreed on some stuff I didn't think we would agree on. But, fuck yeah. Once in a while, I can see intelligent thoughts. Maybe you're rubbing off on me.
01:45:38
Speaker
Or maybe you just think they're intelligent because I'm rubbing off on you. I don't know.
01:45:45
Speaker
Josh, thanks, man. This is a good conversation. I liked this one a lot. Hell yeah. Well, I wanted to end this by saying good morning, but it just hit noon here. So... we don't usually do that yet.
01:45:56
Speaker
Yeah, if you're west of me, good morning. If you're east of me, good afternoon.
01:46:02
Speaker
I'm 20 minutes from a time zone change. So literally...
01:46:09
Speaker
So if he said something that makes you want to kill him, he just gave you a clue. He's 20 minutes from a time zone. Ooh, but which one? Keep dropping clues. Like, Where in the World Is Carmen Sandiego?
01:46:21
Speaker
I think at this point, if you haven't realized where I live, you're just not paying attention. On that note, Michael, do you have anything else you'd like to add? AI, can we end the show? Are we allowed to end the show, AI? I
01:46:34
Speaker
do.
01:46:43
Speaker
Nonsensical network, different flavor every day. Movie talks, new flicks, hitting the display. Microphone magic, musicians spill the praise. From reptiles to motorsports, burning rubber craze. Football crashes, touchdowns, epic plays.
01:46:58
Speaker
News spinning, catching on the latest phase. Gleaming cars, engines throwing up the pace. Street tales, word and stories we embrace.
01:47:35
Speaker
but the vibe's just right Tune in, tune in, wait for that beat Flow so seamlessly
01:47:46
Speaker
always on repeat