
AI: It's Not Wrong, It's Hallucinating

The Noisecast

It's been a while, and to get things back on track I invited a long-time friend, Dre Bettis, to help me talk about robots, AI-written "newspapers," and gen-AI Minions memes. The reaction was worth it.

Have a listen and, as always, share if you loved it and subscribe if you didn't.

Tell me I'm wrong:

Threads

Instagram

Twitter (I'm not calling it by another name)

Transcript
00:00:04
thenoisecast
All right. Yes, yes. And with that, you already know what it is. Hello. Good morning, good evening, good night.

Alberto's Return and Podcasting Plans

00:00:14
thenoisecast
This is The Noisecast, and as always, it's your host, Alberto. Yeah, it's been a while since we did this whole thing. Life just kind of happened, you know? That's kind of where we've been. But we've had a good break, we're in a better place now, and we're going to just

Introducing Dre and Podcast Goals

00:00:34
thenoisecast
get back into the whole podcast thing. So with me tonight, I have a special guest, a good friend of mine. We go back to high school days.
00:00:46
thenoisecast
My friend Dre here.
00:00:49
thenoisecast
And I brought him on for two reasons. One, I didn't feel like talking to myself. And two, Dre is starting up his own podcast very soon, and I figured this is a good opportunity for him to get his feet wet and just kind of see how it all works. Dre, say what's up to everyone.
00:01:10
Dre
What's going on, good people. Hope everyone's doing well, and hope everyone stays well. Enjoy this listen.
00:01:18
thenoisecast
Yes, sir. Yes, sir.
00:01:20
thenoisecast
All right.
00:01:21
Dre
Did you get it all set up?
00:01:22
thenoisecast
Yeah, man. So, you know, we've been fighting with some, like, technical difficulties. And I'll be honest with you, it's like every time we try to do this... oh, sorry, hold on one second.
00:01:41
thenoisecast
Yeah, it's like every time we've tried to do this podcast episode, something's always kind of happened and we've never been able to connect. So I'm glad that we were able to do this, even if this ends up being a very short intro. So let's see what we can get into.

Privacy Concerns with Surveillance Technology

00:02:04
thenoisecast
We put together three stories that we want to just really look at.
00:02:09
Dre
Okay.
00:02:11
thenoisecast
So the first one we're going to start with is a story that comes from a local privacy group called STOP.
00:02:24
thenoisecast
STOP stands for the Surveillance Technology Oversight Project.
00:02:29
Dre
Fun.
00:02:29
thenoisecast
Yeah, yeah. So, you know, we live in a world where there's literally a camera in everyone's pocket. We're always recording a bunch of shit that we probably have no business recording. And then we don't just record it, we share it with literally the entire world through the internet. And, you know, okay, you want to record yourself? That's one thing. But...
00:02:58
thenoisecast
There's a lot of gray area when it comes to the government and government agencies using these types of technologies for what amounts to spying on the public.
00:03:12
Dre
Yeah.
00:03:12
thenoisecast
So that's kind of where STOP comes in. They're very much against the use of surveillance technology deployed on an unwitting public.
00:03:25
thenoisecast
So some of the things that they fought for and won were things like preventing the metal detectors and X-ray machines at subway stations.
00:03:36
Dre
Mm-hmm. Mm-hmm.
00:03:40
thenoisecast
Never mind how stupid of an idea that would have been, but they sued the city and won. They've also been suing the city over the use of drones for handling 911 calls.
00:03:57
thenoisecast
And yeah, the same drones that you're thinking of, the little ones that you fly, like...
00:04:01
Dre
Yeah, how would that work? So we have an emergency, you know, which, you know, there's always one going on in the city.
00:04:11
thenoisecast
Yeah, look, gunshots heard, right? Reported gunshots, right?
00:04:16
thenoisecast
So instead of having the police show up, they're going to fly their little drone and record your neighborhood to find out what shooting is going on.
00:04:26
Dre
Right, yes. I believe there are setups in certain neighborhoods. I know I have one not too far from me where they detect, I guess, the specific sound.
00:04:39
Dre
And post an alert to the NYPD.
00:04:44
Dre
I've heard them. I'm not sure if, you know, anything beneficial comes from it.

Insurance Apps and Driving Behavior Monitoring

00:04:52
thenoisecast
So it's interesting you bring that up, because the technology costs the city something like, no lie, 200 to 400 million a year to run that specific thing.
00:05:11
Dre
Okay, chump change, right? Yeah. Technology backfire, and that is very surprising. And as I was just...
00:05:11
thenoisecast
And yeah, that money could have gone to, you know, like school lunch or something. But they've realized that it doesn't actually do a good job of knowing the difference between a gunshot and, you know, a truck engine backfire.
00:05:36
thenoisecast
yeah
00:05:41
Dre
You know, I was looking into my car insurance, and, you know, GEICO has an app. You know, I don't want to shame or promote, especially companies that aren't paying us.
00:05:55
Dre
So, but, you know, they have an app where, you know, they detect... you have to keep that app on 24/7, and they're literally monitoring your movements, even if you're not driving.
00:06:11
Dre
Now, how this is beneficial, or how they promote it, is that if you keep a good score, if you're a good little driver, ten and two, you know, you can get a discount on your insurance, okay?
00:06:21
thenoisecast
Mm-hmm.
00:06:26
thenoisecast
Right.
00:06:27
Dre
That's how they promote it.
00:06:29
Dre
But of course, companies don't stay afloat by giving discounts.
00:06:29
thenoisecast
Right, right.
00:06:35
Dre
So what is coming about, and what is being expressed by GEICO members and insurance policyholders who have this app, is that there might be an infraction that has nothing to do with them. And they're getting penalized because their phone, the app, is picking up what sounds maybe like a crash or a speeding car, and hanging it on these people's apps, saying it's them.
00:06:35
thenoisecast
Right.
00:07:07
thenoisecast
Mm-hmm.
00:07:11
Dre
Same thing, you know, going through stories, because this was literally yesterday. You know, insurance is sky high, and I'm like, how can I save? And sometimes certain things sound too good to be true... and it obviously is too good to be true.
00:07:25
thenoisecast
Right.
00:07:26
Dre
What ends up happening is these people are fighting against these infraction claims by their insurance company, which tells them to keep their app on 24/7, otherwise it's a violation of the policy.
00:07:42
thenoisecast
Mm-hmm.
00:07:45
Dre
Now what happens is that if there's an infraction put on, even if it's not your fault, say you stop short because you might hit a car or a car is doing something ridiculous, that gets pinged as well.
00:07:57
thenoisecast
Mm-hmm.
00:08:04
Dre
So every time you get these infractions, there's a chance your insurance goes up. So what happens to a lot of people is, same thing that you were just talking about, you know, the app is not able to differentiate what is your issue and what is the next driver's issue.
00:08:21
thenoisecast
Right.
00:08:22
Dre
And it's now, you know, becoming this big, big, big issue with policyholders. They're trying to get out of it, and they've found that if you try to cancel, then you're breaking that contract, because, you know, when you signed that agreement policy, page 86 told you... and no one reads past page one, if that, so.
00:08:30
thenoisecast
Oh yeah.
00:08:48
thenoisecast
No one reads the thing. They just hit agree.
00:08:51
Dre
Yeah, scroll down, hit X. Okay.
00:08:54
thenoisecast
Yeah.
00:08:55
Dre
Yeah. So, you know, same thing, differentiating the sounds. But then, I guess, this topic is on, you know, technology and how cool it sounds, but, you know...
00:09:09
thenoisecast
Yeah, yeah. So, you know, we pretty much understand that technology can be wrong, no matter how much, you know, the makers or the proponents of that piece of technology may want to claim otherwise.
00:09:29
thenoisecast
So that kind of brings us back to the story.

Facial Recognition in NYC Shelters

00:09:33
thenoisecast
So STOP issued a press release a couple weeks back, you know, condemning New York City Mayor Adams for deploying a surveillance technology that uses facial recognition at city-run shelters, specifically the quote-unquote migrant shelters.
00:10:02
Dre
OK.
00:10:03
thenoisecast
So, you know, no one's asking them if they agree to having their face not just photographed, but kept in a police database somewhere and cross-referenced, without their knowledge, against any crime.
00:10:14
Dre
Mm-hmm. Right. Right.
00:10:25
thenoisecast
So, you know, let's say for the sake of argument you have someone who is staying at a shelter in Brooklyn, but, you know, God forbid there's a rape in the Bronx. So, you know, they speak to the victim, they get a description, and then they just start running the database trying to find matches. Now, to make things even more fucked up, if you're here on any kind of temporary visa or asylum visa,
00:11:03
thenoisecast
you cannot afford to get picked up by the cops for anything, even if you're found not guilty.
00:11:11
Dre
Right.
00:11:13
thenoisecast
So it keeps this cycle of, well, you got picked up as a person of interest because someone said brown skin, five foot eight, you know, wears glasses, low haircut or no hair,
00:11:14
Dre
Right here.
00:11:33
Dre
Yeah.
00:11:34
thenoisecast
And oh, shit, that's like everyone in this shelter right here.
00:11:40
Dre
Yeah.
00:11:41
thenoisecast
Why not? Let's pick one up. You know, and not to throw shade or whatever, because, you know, I have family that works for PD.
00:11:53
thenoisecast
And the truth is, they're not really doing any kind of investigative work. They're just trying to meet their arrest quota, which, quote-unquote, doesn't exist.
00:12:03
Dre
Yeah, you know, God bless all the identical twins out there, because, you know, this type of surveillance, when you have human error
00:12:06
thenoisecast
Yeah.
00:12:14
Dre
all the time, takes people in, you know, just on a very bland and basic description.
00:12:16
thenoisecast
Yeah.
00:12:23
Dre
So, you know, as we're seeing, leaving it up to technology and, you know, quote-unquote advancement, and what is deemed to, you know, keep us residents and citizens of this city and around the country safe... it's not doing that, all right?
00:12:45
Dre
But at the same time, also, you know, you're putting potentially innocent people on a terrible path.
00:12:46
thenoisecast
No.
00:12:54
thenoisecast
For sure.
00:12:54
Dre
And you know it's just a...
00:12:57
Dre
such an easy way to, you know, put a blanket around a situation, to ease either tensions or, you know, worry.
00:13:10
thenoisecast
I mean, most of the time, these are political moves, right? So, like, for example, this whole thing, you know, started out as a bribe. You know, the person who is running this program... hold on, got the name right here, just one second. So, okay, so Remark Holdings is the Chinese fire and police surveillance firm that was given the bid to put in these cameras at the city-run shelters.
00:13:41
thenoisecast
Timothy Pearson is the advisor in the Adams administration that pushed for this and even pressured the FDNY to force them to adopt the technology. The FDNY didn't even want it. You know, the FDNY is who's charged with helping to run the safety at the city shelters. They were like, no, we don't need this.
00:14:07
thenoisecast
Like, we got people who are in the front who are checking IDs, who are making sure that people are where they're supposed to be.
00:14:07
Dre
Yeah. Yeah.
00:14:14
thenoisecast
But somebody was getting paid and they said, no, we're going to put these cameras in here anyway.
00:14:20
thenoisecast
So.
00:14:21
Dre
But we're at where we're at now.
00:14:23
thenoisecast
And we're at where we're at now. So, you know, right now the Surveillance Technology Oversight Project is currently suing the city for that, and indicating, you know, just how this amounts to discrimination and harassment on the part of the government.
00:14:46
Dre
Absolutely.
00:14:47
thenoisecast
And yeah.
00:14:47
Dre
You know, it's something, you know, that's been foretold. We'll get to that topic. And again, marketed as something that is amazing and safe and, you know, dystopian.
00:15:04
Dre
I'm sorry, utopian. But it turned out to be, you know, something dystopian. But, you know, there's definitely violations that are happening
00:15:16
Dre
all around us all the time. And it's being marketed as a technological advancement, something that's amazing and safe. And it really does nothing for anyone; it just lines certain companies' or individuals' pockets. But the overall scheme,
00:15:38
Dre
the broad look of things, is, you know, it's just that technology is faulty, and certain people in power are taking advantage of that power.
00:15:52
thenoisecast
All right. Yeah, no, 100%. 100%. So let's just jump into that, this whole marketing of technologies meant to sound utopian but turning out anything but.
00:16:06
Dre
Yeah.
00:16:08
thenoisecast
So I'm going to get right into it. Like, AI. AI is the biggest thing that's happened since we stopped fucking around with, like, cryptocurrency and realized that that money is pretend money.
00:16:24
Dre
Yeah.
00:16:26
thenoisecast
Yeah. You got hit with the Dogecoin too, huh?
00:16:30
Dre
Oh, I got like 4 million. Nah.
00:16:35
Dre
I should hold on to it, right? Because I love it so much.
00:16:36
thenoisecast
Yeah. So, you know.
00:16:38
Dre
I mean, I only got Zelle and Iverson, but, you know, again, in 2024, you know, where we were told as kids that, you know, we'd have flying cars and robots doing all of our jobs and, yeah, AI, you know...
00:16:54
thenoisecast
Yeah, so it's funny you mentioned that one too, but we're going to get to the robots in a minute. But first things first, AI. So this story comes from Axios, from their AI newsletter, ironically enough.
00:17:01
Dre
Yeah.
00:17:05
Dre
All right.

AI in Journalism: Mistakes and Oversight

00:17:06
thenoisecast
And so, you know, I sent this over to you and, you know, we briefly spoke about it, but it sounds really weird, because
00:17:18
thenoisecast
So there is an AI-powered newspaper, a local newspaper, called Hoodline, where, you know, the whole premise, the whole spin that they're giving us, is that they don't need reporters, they don't need editors or fact-checkers; AI is just going to crowdsource all the news,
00:17:30
Dre
Okay.
00:17:40
Dre
Mm-hmm.
00:17:48
thenoisecast
put it into an article, because AI can write articles now, and make these pictures.
00:17:52
Dre
And make amazing pictures. Yeah.
00:17:55
thenoisecast
So, hold on one second, I'm going to get right to the article right now. But the long and short of it is, this local newspaper was writing about a crime that took place, and they incorrectly said that it was the DA who was arrested for the crime, except that's not what happened. So, where is it? Yeah, so... oh, oh man.
00:18:39
thenoisecast
And I don't have,
00:18:43
Dre
So this is the equivalent to, you know, copying someone's paper in school but accidentally putting their name on your paper, too.
00:18:53
thenoisecast
Oh man, it's even worse than that, because it's like, imagine if you copied the wrong paper from someone, turned it in, and your teacher said, yo, that's a great paper, here's an A for you. Like, didn't even bother looking at it, just said, hey, that looks like he put a lot of effort into it.
00:19:16
thenoisecast
So essentially what happens is, Hoodline's news articles, and I'm putting that in big air quotes out there, start out as tweets.
00:19:30
thenoisecast
So their AI goes to Twitter, which is such a reliable source of information.
00:19:39
Dre
Sorry, what's Twitter?
00:19:41
thenoisecast
Oh, X.
00:19:42
Dre
Oh, oh, okay, that, yeah.
00:19:44
thenoisecast
X, that thing. Yeah, so they go to X, they pull all the local news and they throw it together and turn it into a news article.
00:19:56
Dre
Oh, fun.
00:19:57
thenoisecast
Without a human being ever seeing it.
00:20:00
Dre
Okay.
00:20:02
thenoisecast
So not only did they incorrectly report that the DA of this community was arrested for this crime, they then quoted the same DA's Twitter account saying that they arrested the person that was involved in this crime. So it's basically just fucking up on every account, because it doesn't know what to do with that information.
00:20:32
Dre
So, you know, even though I haven't used the AI on the
00:20:42
Dre
computer, I sound old right now, on the computer systems or whatever. But, you know, like when you ask AI, or whatever you call it, to create a photo of such and such, and it does its best, but then it adds feet as hands and puts two heads and stuff like that. It almost looks good from a distance, but as you get closer, you're like, this is fucked up. This is not good at all.
00:21:08
thenoisecast
Yo, since you mentioned that, have you seen the Christian Minions AI art that's all over, like, Facebook?
00:21:22
Dre
I have not seen it, but I need to look it up.
00:21:24
thenoisecast
Yo, Google that. Like, just put, like, Facebook Christian Minions.
00:21:32
Dre
I'm gonna do it right now.
00:21:34
thenoisecast
Yo, it's wild.
00:21:36
Dre
Check.
00:21:38
thenoisecast
And then there's a whole bunch of, like... there's a whole thing also that's kind of Facebook driven. Or is it Facebook? It might've been Instagram, one of them. But basically there was a restaurant in Austin, Texas that was getting all these likes, you know, like they got real famous for a croissant that was shaped like a hippo.
00:22:05
Dre
Hmm. Okay.
00:22:10
thenoisecast
But the croissant never existed. The restaurant never existed. And today they're a website, which is weird, that also sells t-shirts of the croissant hippo.
00:22:27
Dre
The croissant hippo of a restaurant that doesn't exist. The purpose of a restaurant is to eat. No one has eaten anything, because it's not real.
00:22:37
thenoisecast
And they're getting, like, millions of likes and follows and reshares and the whole bit.
00:22:44
Dre
Well, I'm on the Christian Minions, as you said.
00:22:48
thenoisecast
Yeah, yeah.
00:22:49
Dre
Yeah. Wow.
00:22:55
Dre
Whoa.
00:22:56
Dre
OK.
00:22:56
thenoisecast
Yeah. So as someone who is, I would describe myself as, terminally online, like, this shit just pops up on my feed from time to time, and it makes me do a double take and say, like, what the fuck is going on?
00:23:11
thenoisecast
So to get a normal person's reaction is like amazing to me.
00:23:15
Dre
You know... a different view of Minions. Wow, okay. You know, stuff like that, it's crazy how it could just be done, just like that. And
00:23:31
Dre
you know, it's also another aspect when, you know, photos are created... you know, of real things. But you'll see it in certain news stories now.
00:23:45
thenoisecast
Yeah.
00:23:47
Dre
I'm not going to get specific, but, you know, they'll add people to it. You know, with my eye, I can pretty much tell what's real or not, but these things have been pushed out there and people run with it like it's legit.
00:23:56
thenoisecast
Yeah.
00:24:04
Dre
I'm not gonna get specific, but...
00:24:05
thenoisecast
Yeah, well, no, I mean, like, I'll tell you one example that comes to mind. Remember when everybody was sharing the video of Nancy Pelosi, supposedly she was drunk?
00:24:08
Dre
Yeah.
00:24:16
thenoisecast
And then it was just like, oh, that was just, like, made with AI. And everybody was like, oh. But that one, you could tell that one was fake.
00:24:21
Dre
Yeah.
00:24:26
thenoisecast
And now, you know, I saw something about, and I haven't gotten the calls yet, but, like, be on the lookout. If you're registered to vote for any party, don't be surprised if you get a call from, you know, your party's candidate telling you to not go vote on election day. And it sounds just like the candidate, right? Because everybody's using these AI deepfake things to, like, fuck with people, essentially.
00:25:01
Dre
Really, you know, and no disrespect to any other generation, but those things are really meant for, like, baby boomers to easily fold and
00:25:07
thenoisecast
Oh, yeah, yeah.
00:25:08
Dre
really believe. Like you see with the emails, you'd be like, oh my God.
00:25:11
thenoisecast
It's like, yo, Kamala Harris wants to go and have coffee with me. It's just like, no.
00:25:15
Dre
President Obama is emailing me this one.
00:25:17
thenoisecast
It's just like... yeah, it's wild though. Like, it really is wild, because we're in a place where these things are so easy to do, right?
00:25:28
thenoisecast
Like, you know, before, you'd have to, like, pay a really good actor to be this person. And even then, you know, it was hard to get people to say, oh, yo, Nancy Pelosi is out here drinking.
00:25:42
Dre
Yeah. Oh, yeah.
00:25:44
Dre
It could be so detrimental to, you know, a totally innocent person. On the flip side, no Diddy, but it could help someone, you know, that is fighting a case say, that wasn't me.
00:26:00
Dre
I'd never, why would I do that? You know, that's AI. And you need to prove that it was me, but I'm telling you, that was never me.
00:26:02
thenoisecast
Right, right.
00:26:07
thenoisecast
Right. Yo, not for nothing, that's a good argument now. Like, yo, that's a deepfake.
00:26:11
Dre
Yeah. Because stuff is, you know, getting so real. And it's just like, wow. Like, you know, what's it going to be next year?
00:26:22
Dre
What's it going to be 10 years from now?
00:26:24
thenoisecast
Yeah.
00:26:24
Dre
I came up... growing up in the 80s, you know, I had Nintendo. And, you know, at that point, looking back now, I was like, were our eyes fucked up?
00:26:37
Dre
Because we were like, oh, these are the best graphics ever. Look at this. I want y'all to look how real this looks. And, you know, when you see the stuff that's coming out now, looking back at everything,
00:26:48
Dre
I can't even make out what these 8-bit things are. But, you know, today, you look at, like, a GTA 6 or, you know, one of the Red Dead games.
00:26:52
thenoisecast
Yeah.
00:26:58
thenoisecast
I mean, yeah, think about it. Like, in a few years, that shit is even going to look like garbage to us, because there's going to be some other wild shit that they're going to come out with.
00:27:08
Dre
Things from even, you know, five years ago, I'd be like, oh, this looks dated, you know. But there's a limit.
00:27:13
thenoisecast
Yeah.
00:27:16
Dre
You know, I don't want to play a game... I'm jumping off course here, but it ties in.
00:27:21
thenoisecast
No, you're good. It ties in a little bit.
00:27:23
Dre
Because then that's real life, you know. You know, as much as I have a PlayStation 5, some games I bought, I don't play it that much.
00:27:25
thenoisecast
Yeah.
00:27:34
thenoisecast
Yeah.
00:27:35
Dre
You know, grown-up stuff, but, you know, certain things, when I play, it's so real, like, I have to, like, put it down sometimes. It's a little too much, you know. But, you know, again, back to the last story: as amazing as it can be, as great and advanced as technology could be, whether it's, like, helping
00:27:48
thenoisecast
Yeah. Yeah.
00:28:03
Dre
you know, sick people or whatever, there's also... you got that other side, you know, damage and destruction.
00:28:09
thenoisecast
For sure.
00:28:12
thenoisecast
Yeah, and, you know, here's the other thing with that last story specifically.
00:28:20
thenoisecast
And we're gonna end it on this.
00:28:22
Dre
Yeah.
00:28:23
thenoisecast
My problem with these technologies, and the way that they're being used, is in the kind of
00:28:34
thenoisecast
we-don't-give-a-shit attitude. We're never going to own anything bad that comes out of it. We're going to make some bullshit excuse. So in the case of this completely fabricated story of a DA being arrested for a crime that never happened,
00:28:54
thenoisecast
the, you know, quote-unquote newspaper said, well, it's not really their fault. The AI was having a hallucination. So it wasn't wrong, it was hallucinating.
00:29:08
Dre
Aw, man. That damn AI.
00:29:11
thenoisecast
So just think about that. Like... don't get me wrong, I'm all for making life easier. You know, if there's a repetitive task I don't ever have to do again, I will never do it again. But, you know, when you talk about dealing with people and people's lives, like, I don't need a wrong computer telling me it's hallucinating. I need that shit fixed.
00:29:39
Dre
Yeah, absolutely.
00:29:41
thenoisecast
So.
00:29:42
Dre
You know, when you put the power into technology, AI, whatever it is that's controlling it...
00:29:54
Dre
There's no fault.
00:29:56
thenoisecast
Yeah.
00:29:56
Dre
And at the same time, you can't fire it. You know, oh, you didn't do your job.
00:30:01
thenoisecast
Yeah, you can't even argue it, right?
00:30:03
thenoisecast
It's just like, oh, well, that's not even a person.
00:30:05
Dre
Yeah, so if it's a no-fault situation, you know, you have a non-entity doing the work that you don't have to pay, doing the work where this thing is not taking off or needing vacation or, you know, sick days or anything.
00:30:27
Dre
It can have hallucinations, but doesn't need to take off. So it's just, you know, you cannot...
00:30:35
thenoisecast
Yeah.
00:30:36
Dre
You know, my one point, bringing in a religious aspect, not religious, but spiritual aspect: you cannot try to make something perfect that wasn't created by the higher being, by the higher power.
00:30:52
thenoisecast
Yeah. Right.
00:30:54
Dre
So you can try to emulate, which, you know, I have no doubt is being attempted with the creation of robots and all that stuff. But, you know, you're never going to get there. So there's always going to be, you know, issues.
00:31:10
thenoisecast
Mm-hmm.
00:31:11
Dre
And so, you know, unless there's a looking back on who created this issue, or who was responsible for, you know, looking over this AI news reporter or whatever... unless that's put into effect, then you're going to still have the same problems going forward and, you know, be putting more innocent people's lives in jeopardy.
00:31:39
thenoisecast
Yeah, for sure. So, you know, we're going to kind of wrap up on that point right there. You know, just one last thing.
00:31:51
thenoisecast
I did share with you a video that was on, what you call it, Instagram, of an Optimus robot at a Tesla event,
00:32:03
Dre
Oh yeah.
00:32:05
thenoisecast
serving drinks. So, you know, number one, it turns out that they weren't really speaking to people like it seemed. It was actually a human somewhere hearing the conversation and then speaking through the robot. So it's basically like they took, you know, this bullshit robot, put a Ring doorbell on the front of it, and then had somebody on the other side, like, talking to people.
00:32:31
Dre
Mm-hmm.
00:32:32
thenoisecast
So, you know, it was a bullshit publicity stunt, which, you know, worked, because we're all talking about it now. But more importantly,
00:32:44
thenoisecast
You know, one of the things that was said about that bartender robot was, oh, I just had my drink poured by a robot and they didn't ask me for a 25% tip.

Robots in Social Settings: Risks and Reality

00:32:59
thenoisecast
This is the future. And I'm like, yeah, you can pretty much go fuck yourself, because, like...
00:33:05
Dre
Seeing stuff like that, and seeing people get excited and, you know, trying to, I guess, not make an excuse, but validate, you know, the atrocities of what's going on...
00:33:20
Dre
Like, I don't know... you know, I'm sure we can agree on this, but, you know, going out, going to a bar, going wherever, you know, it's about, you know, being social.
00:33:31
thenoisecast
It's about people. It's about, like, talking to people.
00:33:34
Dre
You know, it's not... there's nothing like bullshitting with a bartender and, you know, whatever it is. I remember, like, I had, you know, some friends of ours, we would go out.
00:33:46
Dre
One of my friends, he was very shy, you know. It was his birthday, and he didn't know how to, you know, pick up women or whatever. We were cool, you know? We just chopped it up, not all night, but a while.
00:34:00
Dre
Ended up getting a number, said it was for him, and it was just fun, you know? It was just like, all right, we still got it. But, you know, when you try to, again, say, oh, we don't have to tip this person...
00:34:06
thenoisecast
Yeah.
00:34:13
Dre
People tip robots all the time.
00:34:15
Dre
You know, it's called Grubhub. It's called, you know, all these things where, yeah, yeah, you're tipping.
00:34:15
thenoisecast
Yeah.
00:34:20
thenoisecast
Yeah.
00:34:23
Dre
It shouldn't be about a monetary thing, it's about the socializing.
00:34:29
thenoisecast
No.
00:34:32
Dre
Yeah. You're lacking that one-on-one human connection. And that's a big issue today. People, you know, you
00:34:43
Dre
dress up a robot to make it, whether it's supposed to look pretty or it's supposed to look, you know, manly, whatever type of robot. Inside of it is just wires, okay, sensors. And the more sinister situation is, someone's behind there... not literally, but there's someone behind that robot, controlling it, taking notes.
00:34:57
thenoisecast
Yeah.
00:35:10
thenoisecast
Oh, let me tell you, if I were to walk into a bar and see robots, number one, I'm walking out. But number two, if for some reason I did stay, I'm not saying a word the entire time I'm there.
00:35:22
Dre
Oh, yeah. And I have a real feeling that a lot of people that enjoy this stuff will look forward to it.
00:35:24
thenoisecast
I'm like,
00:35:29
Dre
They've never seen I, Robot.
00:35:31
thenoisecast
Yeah.
00:35:31
Dre
Back when Will Smith was cool, that I, Robot... he was one of the few that was like, no, no, no. Having these in our house, having these, you know,
00:35:36
thenoisecast
Again, no Diddy. No Diddy.
00:35:45
Dre
Having these live amongst us, it's not a good idea.
00:35:49
thenoisecast
Oh yeah. Yeah, yeah.
00:35:50
Dre
you know And it's just, you know to me, you know there's an infatuation you know of having stuff done for you.
00:36:02
thenoisecast
Mm-hmm.
00:36:03
Dre
You know, oh look, this robot can make my drink, this robot can do the simplest task, oh, robot, you know, pull in my chair. Like, it's just... it's one of those things that might've been cool maybe if you were discovering it in the 1960s, you know, but now it's like, come on, we've seen animatronics, we've seen, you know, so much advancement. We don't need flying...
00:36:29
thenoisecast
I mean, we've been to Disney, right? Like, most people at this point have been to Disney, right?
00:36:34
Dre
Yeah.
00:36:34
thenoisecast
And they've seen the whole thing. Like, we've seen them do the whole thing. Like, what you call it?
00:36:40
Dre
Yeah.
00:36:41
thenoisecast
Like, even some cruise lines have, like, quote-unquote robot bartenders, where it's just an arm. Like, I don't care if it's just an arm. But, like, don't come in my face, don't try to talk to me.
00:36:52
Dre
You know, and these robots... they were Tesla robots, right?
00:36:58
thenoisecast
Yeah.
00:36:59
Dre
Yeah. All right. So if we're going by the record of, you know, the individual running Tesla, you know, this is probably going to be the worst idea and invention since the Cybertruck.
00:37:12
Dre
I have no doubt. I don't know what the plan is as of right now, but I'm sure it's gearing up to, oh, have your own personal robot as your companion.
00:37:22
thenoisecast
Nah, I'm good. Nah, I'm good. Listen, I'll be fine. I'll take my chances being all decrepit and alone.
00:37:34
Dre
The idea of it, you know, is I'm going to try to get a little Tony Montana. It's not... okay, not at all. All right.
00:37:46
Dre
People think they, what was that? What was the robot from the Jetsons?
00:37:50
thenoisecast
Rosie?
00:37:51
Dre
Ah, yeah. See? All right. Yeah. People thinking they're going to get one of those. No, you're going to get something sinister.
00:37:56
thenoisecast
Now you're not getting Rosie, you're getting, what you call it... like, HAL.
00:37:56
Dre
Yeah.
00:38:01
thenoisecast
You remember, from 2001: A Space Odyssey, the little red dot that killed everybody?
00:38:05
Dre
I'm not gonna lie... no, I don't remember. I know of it, yeah. And, uh...
00:38:09
thenoisecast
Yeah, so anyone who's watched 2001: A Space Odyssey knows, like, the HAL 9000. And it's just like... it was meant to keep everyone alive and make life real easy for everybody on the spaceship. And then it started killing everyone.
00:38:27
thenoisecast
Like, that's what you're gonna get. And you know what? It won't be wrong. It's not gonna be murder. It's a hallucination.
00:38:34
Dre
A hallucination, no charges, no, it's a robot. What are you gonna do? It's just, you know, but we'll make it better.
00:38:38
thenoisecast
Can't charge a robot.
00:38:40
Dre
Plus, you signed page 86 of the user agreement that says you can't sue, or you gotta read all through the other pages, and, you know, we'll see.
00:38:43
thenoisecast
Right.
00:38:52
Dre
It'll be interesting on many levels, but, you know, let's look at the Cybertruck as an example of how horrible things could go.
00:38:55
thenoisecast
Yeah.
00:39:01
thenoisecast
Yeah, absolutely. All right.
00:39:04
Dre
That's the future.
00:39:05
thenoisecast
Yo, to the future. Dre, it was a good time chatting with you, man. I appreciate it.
00:39:11
Dre
Appreciate it.
00:39:12
thenoisecast
Yeah, no, of course.
00:39:15
Dre
Looking forward to doing more things and looking forward to podcasting on a regular.
00:39:15
thenoisecast
So.
00:39:21
Dre
Thank you everyone for tuning in. Take it easy.
00:39:26
thenoisecast
Peace everyone.
00:39:27
Dre
All right, peace.