Introductions & Episode Theme
00:00:00
Speaker
Go ahead for it. You can do it. Ooh, my turn? Yeah. Good morning, good afternoon, and good evening, ladies and gentlemen, boys and girls, and everybody inside and outside the gender binary.
00:00:13
Speaker
My name is Danny Guarantee, and I'm trying very hard not to rush through the intro so that I have time to think and don't stumble over my words and use filler words. I know, I'm doing a great job. Anyway, here's my co-host, Adam. Say hi, Adam.
00:00:29
Speaker
What's up, dawg? Sup, bruh? Have I got a very great episode for you today, sir. Okay, tell us. Yes, today we are discussing...
00:00:44
Speaker
A.I. bu bum, bum. What does A.I. stand for? Artificially intelligent. No, sorry, that's people that use it. Artificial intelligence.
00:00:56
Speaker
That's our government. It's artificially intelligent. And it's not even that. It's not doing a good job of that either. OK.
AI's Emotional Impact: A Reddit Case Study
00:01:03
Speaker
Yeah, we kind of touched on A.I. in one of our previous episodes. True.
00:01:08
Speaker
But I just thought, I'm using filler words, fuck it. I thought it would be fun to actually devote an episode to it. You know, it's kind of the elephant in the room with everything going on right now. And I feel like we could talk about this without getting too politically charged.
00:01:26
Speaker
I hope so. Depends on what the question is. Depends what the question is. Well, all right. So the question, and this is more of just a vibe-check question, not a real "what should I do" kind of question, is: how do you feel about this, and what do you think could be done about it?
00:01:43
Speaker
Okay. So this is on the What Do I Do subreddit, by CompletePath8036. Eighty thirty-six. I don't know what happened to the other 8,035 paths, but, you know, at least those are the incomplete paths.
00:02:00
Speaker
At least this one's complete. That's nice. Hmm. And they say, "My husband is using AI to text me," and it's followed by a couple of screenshots. I'm not going to read the whole conversation, but to break it down, the OP, the original poster, is talking about how somebody at their work is either no-showing or not doing their job.
00:02:21
Speaker
And they're saying, like, I swear to God, this person really needs to get fired. They're messing everything up. Now, I have to skip my break to deal with it. And the husband is responding with what is very obviously AI.
00:02:35
Speaker
It's got em dashes in it. It's covered in em dashes, which I had a discussion with Mbeluga about, and we'll get into that later.
00:02:46
Speaker
But it is just... you could tell the OP is upset and really needs to vent. And the fact that this is being responded to with AI just looks so heartless, like, next to the read receipt.
00:03:08
Speaker
Okay. It's just, you know, they're complaining, and here's a snippet: Yeah, I get why you're pissed, em dash. That's a rough spot to be put in.
00:03:20
Speaker
Being left alone on one of the busiest days is stressful enough, but it's worse when it's something that could've been avoided if she'd just communicated. And now it's not just today, em dash. You're dealing with the ripple effect of her not finishing prep work, too.
00:03:34
Speaker
That's a lot to carry on
Punctuation Preferences of AI
00:03:35
Speaker
your own. Like you could tell, right? Like it's just kind of that soulless AI. Yeah. And like, how would you feel if you were having like a really rough day or like,
00:03:50
Speaker
Somebody really hurt you today. And you were messaging me being like, Dan, oh my God, I need to talk about today. I just need to vent. This was this awful thing happened to me today. I swear I had to pull like a double at work because somebody didn't show up.
00:04:06
Speaker
And I answered you with, like, AI slop. Hey man, I heard that's really hard, em dash, I realize how hard this must be for you. And I just want you to know that, you know, sometimes it's OK to be upset at other people, and blah, blah, blah.
00:04:23
Speaker
How would that make you feel? Yeah. OK, before I get into that, I have to admit, I had a slight ADHD moment during what you were just saying, and I did.
00:04:40
Speaker
I did a little search, because I like to think that in my curiosity in writing and recording, and in just trying to work on my own speaking and reading, I recognize that I use punctuation in a way that at least a lot of people in my circle don't.
00:04:58
Speaker
Not to say I'm in the minority, but at least among people that I know, I use it in a much more formal way than most people do. Correct. Yeah. So I realized, as we keep talking about em dashes, because that's the thing that usually identifies it, unless somebody forgets to also delete the prompt in their AI conversation, so you just see what it says at the end or the beginning, like, write this email to this person I hate, but make it sound good.
00:05:25
Speaker
You know, I was like, I don't know what an em dash is for. Like, when do I use it? It's just not something... I feel like it's a dying thing, unless you see it in a newspaper, where they have to use it to break up certain things to make them fit in the columns or something. Right. So, as far as I understand it, they're mostly used for
00:05:47
Speaker
A quick side thought. Like saying,
00:05:54
Speaker
So-and-so bought a huge boat, em dash, they always want to get the biggest boats, em dash, it's green and red and has all this frill on it. It's just a quick side thought: they bought this one boat, side thought, they always have to get the biggest boats. Kind of like you would think to yourself, right? Oh, they got a new boat. They always got to get the biggest boat. Oh, it's green and blue and red.
00:06:23
Speaker
I always use parentheses for that, if I'm trying to indicate a secondary thought in the middle of a thought. So, yeah. But I learned that there's also en dashes, which is what you put in between.
00:06:37
Speaker
Apparently it's called an en dash, depending on where it's used. So there are em dashes, en dashes, and hyphens. A hyphen is where you say, like, oh, that's a five-year-old kid, right? "Five-year-old" just squishes all together, because you say it as one unit: five, hyphen, year, hyphen, old.
00:06:56
Speaker
Em dashes, or, no, en dashes are like when you say, oh, I work nine to five, and the en dash replaces the word "to." So just nine, en dash, five, right?
TLDR Culture & AI's Role in Human Interaction
00:07:08
Speaker
And then, yeah, em dashes are the ones where I'm like... Right. So I just learned today that, oh, I thought a hyphen was the nine-to-five thing and the connecting of all the words. But apparently it is not that.
00:07:22
Speaker
I guess a hyphen is small and a dash is slightly bigger. And yeah, I'd never heard of that. I've never heard of an en dash. And I was like, oh, I did not know that.
00:07:33
Speaker
Anyway, all that to say... this is a tangent to the tangent. Yeah, that's fine. That's what the show is. I almost called it that. I almost called it Tangent to Tangent. They were just going to be called TTT.
00:07:47
Speaker
But I didn't want the SEO to get mixed up with Trouble and Terror. Trouble and Terror is down. Yeah. So... shit, now I forgot what I was going to say. Oh, yeah. So this kind of answers the question a little bit, as far as how to detect an AI answer: em dashes are almost always the thing.
00:08:11
Speaker
Like, you see those and go, no fucking human uses an em dash in a text message. Nobody. Right. But lately people have also been using the Oxford comma to decide that a text message or an email is an AI one, and that is where I draw the line. It sucks that that is what people look for, because I love the Oxford comma. I use the Oxford comma. Oh my God. All right. Now this shit is creeping in on something that I use. I'm right there with you. I use that too.
00:08:47
Speaker
So annoying. I was going to bring this up later, but since we're on the topic, I'll bring it up now: I was talking to Mbeluga about this just last night. About this question, or about AI in general? Not this question, but AI in general.
00:09:01
Speaker
Okay. And the topic of AI's use of em dashes came up as an alert, right? Like a dog whistle that you're using AI. And she brought up a very good point: I hate that em dashes have become synonymous with AI, because they are a real tool that people use. Certainly.
00:09:22
Speaker
Right. And I get that argument, and I was telling her, yeah, I understand that, but it's a context problem. Like, if I'm reading a book written by an established professional author who understands grammar and I see em dashes...
00:09:37
Speaker
Okay, you don't question it; you know what they're doing. But in this question, like you were saying, if I'm texting Mbeluga and I start seeing em dashes in our text messages, I'm like, something's not right. Like we said, you're more formal with your punctuation, but you're not this formal. This is, like, English-major, actual grammar-Nazi levels, right? If you're putting em dashes in your text messages and stuff correctly, those are for, again, like you said, a book, or, what I was saying earlier, a column in a newspaper, to explain something with extra thought. Yeah. There's a time for it, and it is not in your emails and your text messages. No, sorry. And that's where I get the idea that you can use em dashes as kind of an alert to that.
00:10:34
Speaker
Depending on where it is, right? You know, if I'm reading a text from you and there's an em dash in it, I'm going to be like, bro, come on now. I would like to hear your thoughts. So here's... I have a question about the, well, the question, the problem that we are answering. Am I understanding correctly:
00:10:58
Speaker
Is there an AI that reads the text for you and responds for you? Or do you plug this into your AI of choice and then say, hey, can you come up with a good answer for me? Because that sounds like a lot more work.
00:11:11
Speaker
Yeah, my guess would be, because, well, I've never used an AI. Proud of it.
AI in Empathy & Healthcare
00:11:17
Speaker
Sorry, mom. But the way it reads on here, they must have...
00:11:25
Speaker
Gone into, like, ChatGPT and put in something like, how do I console my wife, who's upset at work? Right. So you read the messages at least enough to get an idea of what they're talking about.
00:11:39
Speaker
Just fucking answer. Right. I think that's the part I'm confused about, because it's funny that you mention this question: I saw a few posts on Instagram recently that were targeting this in different conversations, saying, holy shit, people are using AI to answer emotional questions now. And what I'm getting from this is, you're having a whole other conversation with a computer to have a better conversation with someone in your life, as opposed to learning how to be empathetic or to ask more questions. And
00:12:19
Speaker
conversation and connection are becoming less human. And even if you take all that philosophical shit out, the amount of effort that people go through to not put in any effort is just outstanding to me.
00:12:34
Speaker
I remember getting annoyed, I don't remember when this first started, but even clicking a hyperlink. A lot of people are like, oh, if I'm in an app and I have to click something and it takes me to a website to buy something, I'm not interested, that's way too much work. I'm like, you're literally holding a phone in your hand and you have to move your thumb one more time.
00:12:56
Speaker
What the fuck are you talking about? Yeah. Yeah. I don't mind that at all. Now, I don't like it if it's like, oh, sign up on this other website, and then this other one, and then you can access the app. Like, OK, shit.
00:13:07
Speaker
But if it's just like, oh, if you want to buy from our store, our store is on an external app, go there? I've never cared about that. So, because people are just that lazy, if you send them three sentences, I have seen, I swear to God, people say, I'm not reading that essay. I'm like, it's two and a half to three sentences. What the fuck are you talking about? Half a paragraph? What are you talking about, three words at a time? What is wrong with you? I saw somebody put down on a Reddit thread saying that
00:13:35
Speaker
one of the things that's kind of, like, undercover ruining everything in the world is the advent and proliferation of TLDR.
00:13:46
Speaker
Yes. And for those who don't know, TLDR means too long, didn't read. And what you would do is if you had back in the day, if you had a really long post online, you would put TLDR and just a brief summary of what the rest is about for people that are like, Oh, that's too long. I don't want to read it.
00:14:03
Speaker
They could just read the TLDR at the bottom. And it, like, flipped a switch in my mind. I was like, holy crap, they're absolutely right. It just lends itself to this whole hyperactive mindset of, this is three sentences, I can't read that. What's the TLDR? That's only one sentence. Like, dude. Right.
00:14:23
Speaker
Yeah, like, that's exactly it. People want a TLDR for something that at some point was the TLDR. Yeah. Oh, it's awful. The couple of sentences that you're reading was the TLDR.
00:14:39
Speaker
It's like getting the SparkNotes version of a SparkNotes version of a book. Yeah. So I'm so grateful that the people I have in my life, the majority of whom are either around my age or maybe six, seven years younger, I have a few people like that, are all educated, you know, they like to read and they're curious, and
00:15:00
Speaker
they care about what you say. And I've worked really hard to foster that. So sometimes I forget what's outside of that bubble I have, and sometimes it just blasts me in the face, like some of these screenshots I see on the Internet. I'm just like, this can't be real life. But it happens so frequently that it is what it is. And I think what really upsets me about it is
00:15:26
Speaker
What I think TLDR should be, first of all, if we go all the way back to the early internet, when that was the thing. There's a slight tangent here, but it's related. When I was learning to give presentations in school, I didn't quite understand what it was for then, but I really appreciate it now.
00:15:49
Speaker
If you pay attention to a lot of really well-made videos, they give you a teaser of what's to come. The introduction of a topic is supposed to tell you what you're going to experience. So I always thought that the TLDR should be at the top. Not exactly what AI will do now, like when you get an email and there's a Google AI that says, oh, here's what's been discussed in the email so far. That's not what I'm talking about. I'm talking like,
00:16:21
Speaker
okay, in the following rant, I'll be talking about why AI is bad, how it targets blah, blah, blah, and the studied effects of blah, right? And then that way, as you read it, you can say, oh, I read these three bullet points. I know what I'm about to be in for, and whether I care to read it.
00:16:38
Speaker
Right. As opposed to a short summary that says, oh, I wasted all my time writing this, but you don't give a fuck, so here's the TLDR at the bottom. It should just be at the top: here's what's discussed below. If you care to read it and learn more about what a bullet point means, it's down here.
00:16:55
Speaker
That's what that should be. But what people want now is not even a summary of what to expect. You know, people go to the YouTube comment section before they even watch the fucking video, just to see what happens, because somebody will say, like, this is just what they did.
00:17:13
Speaker
Right. But like I was saying, a well-made video, at the beginning, if it's something like a discussion or a piece of education or whatever, should say, oh, here's what we're discussing today.
00:17:26
Speaker
Like when you watch a late-night show, and they say, tonight on our show, we have so-and-so, who is going to be here later on this evening. We're going to be discussing the effects of bombing Iran and, you know, why too much AI makes you shit sideways. And then you wait for those segments, or you try to find it somewhere in the video. Like, oh, I want to know why people are shitting sideways, and you find it. Right. So, like,
00:17:49
Speaker
That's what it should be: it tees you up to let you know, hey, if you give a fuck, here's what we're talking about today. So I think we need to find a way to get back to that, just as a side AI discussion.
00:18:01
Speaker
But that's why I needed to go on that tangent, because that's how I feel the flow should be, and was, and should come back to be. All that to say, Danny, I know we didn't answer this person's question yet, but I have thoughts. Ah, fuck him.
00:18:19
Speaker
In general. The AI says, ah, like, yeah, fuck him.
00:18:31
Speaker
Yeah. But yeah, anyway, all that
AI's Impact on Relationships & Loneliness
00:18:33
Speaker
to say, though, to get back to that particular thing: it feels like that's more work, because I know somewhat how these large language models work.
00:18:42
Speaker
As you feed more information into it, it learns your conversation style. And so it'll just keep saying, oh, my girlfriend had that same problem with work again, and the AI will remember what the problem was and say, OK, say this this time.
00:18:57
Speaker
But still, you have to go to a third space, I guess technically a second space, to get your answers. It feels like more work to do less work. Do you know what I'm saying?
00:19:08
Speaker
Even if we take out all of the emotional aspect of it, which we can get into in a minute, you do it because you think it saves you time, but you're taking more time to do it.
00:19:20
Speaker
Because it's not like you're studying for a test and you're like, hey, write me an essay about this subject. You're trying to respond in real time to someone who's having a crisis, and you're like, hang on, let me plug some data into a computer to get a human response. It's fucking insane. The amount of work you're doing just to sound like a human being.
00:19:47
Speaker
Oh God, it's so weird. Anyway, I've been talking for a hot minute, so let me know. It's all good. So, the emotional aspect of it all: I would be so fucking pissed. First of all,
00:20:00
Speaker
It's so incredibly insensitive and insulting. If I were talking to you about something that was bothering me and I just got ChatGPT vomit in response, we wouldn't be friends anymore.
00:20:18
Speaker
I would probably just ghost you, be like, fuck this guy. Yeah. Yeah. It's so incredibly insulting. It's almost as bad as just getting a "K" after you pour your heart out to somebody and they go, "K." Yeah, mom.
00:20:35
Speaker
K period. OK, period. Yeah, which is worse, apparently to you.
00:20:41
Speaker
You'll have to go way back in the archives to hear that that train wreck.
00:20:48
Speaker
It still tickles me. It's so funny. Oh, I'm sure. See, it must be a New Jersey thing, because Mbeluga agreed with me immediately. I'm sure a lot of people do. I don't. You know what? We don't have to discuss punctuation in text messages right now. All I'll say with the punctuation thing, because we talked about the Oxford comma before. Yeah. I remember when I was younger, I used the Oxford comma on something and my parents got in a huge fight.
00:21:17
Speaker
Because my mom, if I remember right, my mom wanted it there. My dad didn't. And they got into this huge fight. And I remember looking it up and being like, yeah, it says you could or couldn't. Like, neither is wrong.
00:21:31
Speaker
I just remember it was one of the biggest fights I saw my parents have in a while. I don't know if they were just tired. It was the end of the day. You know, I was doing homework at night; they were helping me after working.
00:21:43
Speaker
And so I don't know if there were short fuses or whatever, but holy crap, all of this drama over a freaking comma. Yeah, that's crazy. Yeah, that's very passionate. Oh, my God.
00:21:56
Speaker
So, OK. Yeah. So let's get into it. I do have other things here that we can talk about. What do you mean? All AI-related.
00:22:08
Speaker
Well, specifically, I want to make sure we address this question a little bit further, because I think...
00:22:17
Speaker
I don't know, it sounds like this is something that was really bothering this person. I'm trying to see if I can find empathy for the person who answered.
00:22:28
Speaker
It does say at the bottom: I've already yelled at him for it since it happened a few days ago, but he said I was overreacting. Well, see, then it's hard for me to have empathy for this person. Yeah. That makes it 100 times worse. Now I'm trying to think, what is the utility here? OK, can we think of a situation, because I'm trying to be better about that.
00:22:49
Speaker
Is there a situation where this is just what you would do, right? So, I understand that there are people who have a hard time emotionally connecting.
00:23:04
Speaker
They don't know how to respond in a crisis situation. They may not know how to respond to someone crying, whether it's because they're neurodivergent, or maybe because they are not emotionally grown enough themselves to really respond in a way that feels helpful, or maybe it's just an emotional paralysis where they're like, I want to help, but I don't even know where to start.
00:23:27
Speaker
I understand that that is a situation a lot of people find themselves in. They're like, I don't know how to help you. Right. But did it say in the post if this was a partner? I can't remember.
00:23:38
Speaker
Yeah. It's her husband. Oh, Jesus Christ. Husband is a whole other ball game, where it's like, you married this person, you married each other. What did you say, through sickness and in health, or whatever it is they make you say nowadays? So,
00:23:53
Speaker
this is one of those times where you need to support your partner. And if you fell in love with them, I hope that's why you got married, because you love each other. I married you. I didn't marry ChatGPT.
00:24:05
Speaker
Right. So I'm trying to think, there could be a series of things that could maybe lead to this person being like, oh my God, my wife has so many problems at work lately and nothing I say seems to work. I'm really trying to empathize with this person, just to see where they're coming from.
00:24:20
Speaker
I know there are some people that are 100% "AI is bad, it needs to go away." I actually don't feel that way. I think AI could absolutely find a place in society as a helping tool, not a replacement tool.
00:24:36
Speaker
So in this instance, I would be just fine if the husband had said, hey, ChatGPT, what can I say to my wife, who's had, you know, here's the problem.
00:24:49
Speaker
What can I say? And it gives you a template to work from, and then you use that as a starting point to put your own thoughts into. Sure.
00:25:00
Speaker
You know, it might say, tell her you're sorry she's having a rough day. And then you say, hey, baby, I'm really sorry you're having a rough time. I really wish there was something I could do to help. Maybe we can go out for dinner tonight to one of your favorite places.
00:25:15
Speaker
Like, yeah, ChatGPT gave you a springboard to start from, but the rest of it is all you. Right. When these people say, just write it for me, and put it on the thing, that's the problem.
00:25:27
Speaker
That's where, I mean, with anything, whether it comes to art or whether it comes to writing, whatever challenge you're facing, you're trying to remove your own human element from it. And that's the problem. I have used AI at work to generate a better way to say something, where I've plugged in, here's kind of what I'm trying to say, but it doesn't quite feel right; I need to make it a little bit more succinct.
00:25:51
Speaker
And it'll suggest something. And then I'll rewrite it, like, oh, I see the order you're using; I understand that's a nicer way to put it, but it doesn't quite feel like my words. And then I replace it with my words that kind of hit.
00:26:02
Speaker
So I've already supplied the fodder by saying, hey, here's what I've got so far, but I'm missing something. And then I'll pitch it to other humans and say, hey, okay, here's what I've got. I've done that a couple of times, just to see if I could shrink a thought.
00:26:18
Speaker
Hot take, but I could even see it being used in art as just a way to get ideas. Yeah. Like, you know, if I wanted to draw a picture of a dragon turning into a banana halfway through,
00:26:31
Speaker
it's like, I can't even really imagine that. Well, let me put it in ChatGPT or whatever, Sora, whatever it is. A banagin. Which we will talk about later. Just to get a picture of a dragon turning into a banana, so I could be like, OK, maybe I want it to change a little bit higher up, and I'll draw it that way. It at least gives my eyes something to work off of.
00:26:54
Speaker
Right. That's exactly right. It's still your art. It's just helping; it's a launching point. I've used it for writing things for TTRPGs where I'm just stuck. I've hit a fucking wall. I just need something, and I can't ask the usual people, because they're part of the story and I don't want them to know. So I said, like, can you just give me two bullet points of, hey, what is this?
00:27:17
Speaker
Is there something I can squeeze in here? And a lot of times I don't even want to use what the AI said, but it's more of a, oh, okay, well, I know I don't like that, so I don't want to go in that direction.
00:27:28
Speaker
But if you find yourself just copying and pasting exactly what it said, it's just plagiarism at this point. There's nothing about it that came from you. But let's get back to this question. And, you know, maybe the wife... we don't know. One more thing real quick before we start.
00:27:47
Speaker
Yeah. Just a big, you know, disclaimer, clarifying for everybody: just because we are not 100% against AI does not mean we are not 100% against the way AI has been trained. Fuck all of you for stealing everybody's written works, art, and everything else and using it to train your AI without permission.
00:28:07
Speaker
You're an asshole. Oh, yeah. I will say, as a tangent to that, when you said earlier it should be used to help: there are scientists and doctors who are currently using AI models, through certain types of injected medication, that hunt down and destroy cancer cells. And the AI helps them. The AI is trained inside the blood of a human to detect cancer cells and remove them, just like that.
00:28:35
Speaker
It's fucking incredible. And that is exactly what AI should be used for. That's a great application. That's something humans could never do. Yeah. Hunt down cancer autonomously inside the bloodstream of a human and eliminate it before it starts. What a fucking magnificent way to train a robot.
00:28:53
Speaker
There's actually been, even with the large language models we have right now, some applications showing up that are OK. I saw...
00:29:06
Speaker
Somebody, big grain of salt because I haven't actually looked this one up yet, but somebody was saying it's had good effects with therapy. Not giving advice in therapy, but just acting as somebody to listen. Because that's literally all it does: it listens to what you say and spits something back.
00:29:25
Speaker
So as long as you're not using it to give you advice on what to do or how to cure yourself by taking ivermectin or some shit, but you're just using it as like, oh man, you know, I'm having like a panic attack. I just need to talk to somebody.
00:29:39
Speaker
It works as just that blank springboard to help you calm down. Yeah. If it was trained in a way that isn't, like you said in a previous episode, a yes-man, right? Not just saying, oh, everything you say is definitely right. Like, hey, I was thinking that maybe because there's a spider in my house, I should set a fire in the room. Oh yeah, that's a great idea, because that'll really smoke the spider out. You know what I'm saying? Because that's what it does. It tells you what it thinks you want to hear, because it's
00:30:09
Speaker
trained to keep you on it. Right. At the end of the day, the model is trained to do that, just like YouTube is trained to keep you on the site. It's the same way every social media works: they want you to stay on it as much as possible.
00:30:23
Speaker
Right. And so that's where it falls into a trap. However, in this particular situation, maybe a better way you could have used the AI would be, you know, if...
00:30:37
Speaker
Say this has been an ongoing challenge for your wife at work, and you feel like you've said everything you know how to say, and you're at a block, and the AI knows this because you've maybe started saying, this is an ongoing problem, she's said this a few times now.
00:30:52
Speaker
Here's what I've said so far. I'm really struggling to say the right thing or to ask the right questions. Can you help me come up with other ways to approach this? And maybe instead of it saying, say all of these things, it can say, have you tried asking about...
00:31:06
Speaker
You know, asking her: is this a thing where you just want me to listen right now, or are you looking for solutions? Because sometimes someone, just like you said, just wants you to shut up and hear them and know that they're being heard.
00:31:18
Speaker
Other times they're saying, I don't know what to do, I need help. And instead of pitching out ideas, you say, well, what have you tried so far? An AI can also maybe learn: okay, because we've been talking this whole time, AI,
00:31:33
Speaker
you've heard this information, here's what we've tried, so don't propose those things, because I've already done them. Do you have other ideas? Again, I think in this situation, really trying to lean into: can I empathize with the person who just said, here's a copy-paste?
00:31:49
Speaker
Not the way to do it, homie. Definitely don't do it that way. But if you are in a situation where maybe you're not good at emotional responses, or you feel like you've tried everything, or you're just at a loss for how to get started, it's okay to generate ideas. But if you're just letting it do the work for you... Like, if it said, well, has your wife tried talking to her boss before? And you can say, okay, let me ask. And you say, babe, what did that conversation look like with your boss? Can you tell me more about what that was like?
00:32:24
Speaker
Okay, now you're having a conversation with your wife. But if you're just like, here's a copy-paste blank sheet of proposed ideas, okay, none of that is your thought. None of that is your heart.
00:32:35
Speaker
There's no care that went into it except for, well, let me let the robot do it. Especially, and this is the point I'm trying to get to with this, if your partner or your friend or whomever it is is in this situation where you're just responding with AI responses, you know,
00:32:53
Speaker
If they find out that you were able to give more emotional responses to other people, but not them. Especially if they're your wife or your partner, that is such a kick in the gut. And I remember, as people on the show probably know, because I've posted an episode about it,
00:33:18
Speaker
you know, I had a really bad falling out with a longtime friend of mine, and a lot of it came from, I'm just emotionally burnt out. I'm not good at this stuff. If I say, hey, man, I just had a really rough day today, can you just talk to me for a little bit? Oh, Adam, you know, I'm not good at that stuff. And I actually was thinking that, but I didn't want to bring it up. And then I see scores of text messages from, like, a woman you're seeing, and you can, all day: oh, man, I had such a hard day at work. Well, tell me about it, babe.
00:33:49
Speaker
You couldn't say tell me about it, babe, to me. You know, it's just, if somebody finds out that you're AIing them but not everyone else... Not that you should AI everyone else either. But if they find out that you're putting forth more effort to communicate with others and not them, especially someone you've married, someone you pledged to do that for, that's heartbreaking, dude. That's really hard.
00:34:13
Speaker
And it's already heartbreaking to find out that, even though you said in sickness and in health, and we got married, and you said you'd care for me, you can't even bother to muster that up. That's hard. And I'm really trying to feel for the person who thought, oh, I'm doing this to help you.
00:34:27
Speaker
But I don't know, man, that's your partner. You married that person. And you just copied and pasted. Like, I don't know y'all's relationship, but either way, you got to bring the human element. You can't just take the human element out of it. That's why everybody's so goddamn lonely, because there's just no...
00:34:48
Speaker
Yeah, don't you love that we're in the middle of a loneliness epidemic and the cure is more loneliness? Yeah, exactly. That's a great... Now we're not even talking to real people anymore. Fucking fantastic.
00:35:01
Speaker
Right. Can you say that again? What, now we're not even talking to real people anymore? No. What did you say? You said the cure for the loneliness epidemic is more loneliness. Yeah, that's exactly it. Hit it on the head, dude. Yeah.
00:35:16
Speaker
The cure for loneliness is more loneliness. It's kind of like, um, underflow in computer terms, right? Like, if you go under the lowest number, it wraps back around to the highest number, and if you go higher than the highest, it wraps back around to the lowest, in, like, computer speak.
00:35:36
Speaker
It's that if you get so lonely, it wraps back around to, I have all the friends I could ever need. I'm just kidding. That's fucking psychosis. Yeah, that's what it is, you're gradually drifting towards something more insane. Yeah, like, at some point the house catches on fire and I'm in it because I was cold. You know, the problem is, we haven't figured out a way to monetize having real friends yet.
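The underflow analogy here is real fixed-width integer behavior, and it can be sketched in a few lines. This is just an illustration, assuming an unsigned 8-bit value for the example; it's not anything from the episode itself:

```python
# Fixed-width unsigned integers wrap modulo 2**BITS: going below the
# minimum lands on the maximum, and going past the maximum lands on 0.
BITS = 8
MOD = 1 << BITS  # 256 representable values: 0..255

def wrap(n: int) -> int:
    """Reduce n into the 0..255 range, the way 8-bit hardware would."""
    return n % MOD

print(wrap(0 - 1))    # "underflow": one below the minimum wraps to 255
print(wrap(255 + 1))  # "overflow": one past the maximum wraps to 0
```

That wraparound is what the joke is riffing on: push far enough past one end of the range and you come out the other side.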
00:36:04
Speaker
So instead of you having real friends, we're going to solve the loneliness epidemic by giving you fake friends that we can charge you to use. Yeah, I remember... I don't remember what episode it was. It was either an episode of Game Changer or...
00:36:19
Speaker
Make Some Noise on Dropout. And I remember Brennan Lee Mulligan mentions, he said, if compassion could be gamified, our world would be a much better place. Yes, that was Game Changer. We just watched that one the other day.
00:36:32
Speaker
Which one was that? I don't remember exactly. I think it was the Olympics one. Yeah. Oh, we just watched that one, and I remember that line. Yeah, he said it in the Olympics one.
00:36:45
Speaker
Yeah. If compassion could be gamified, the world would be a much better place. And man, I think about that a lot, because it's so true. It really feels like, if there was a way to gamify it and monetize it, all of a sudden it's like, oh, hey, you make five bucks if you go and hug 10 people today, with their consent, and they sign off that they did it. If there was just some way... Don't do that. But if there was some way to
00:37:07
Speaker
Just incentivize people. Like, hey, you get more benefits in this game, you get more skins if you... Yeah, okay. That's why I'm not overly critical of the people that do what Mr. Beast pretended to do.
00:37:24
Speaker
Sorry, allegedly. Where it's like, hey, I gave a thousand dollars or ten thousand dollars to a random homeless person, and they make millions doing it.
00:37:36
Speaker
I don't really care. You know what? At least you're doing something good. I don't really agree with the reason why, but at least at the end of the day, something good came out of it. Right, it's better than nothing. Right. It's better that you get rewarded for giving a homeless guy $10,000 than that you get rewarded for walking down the street, bumping into people, and having your henchmen
00:38:02
Speaker
Intimidate them so that they can't be like, yo, why did you hit me? Right. Yeah, there's definitely a model to be made there.
00:38:14
Speaker
I don't know, it just feels like, for the same reason that on social media you get more engagement from rage baiting people than you do with compassion videos and kindness videos. And people say, oh, it's so wholesome, I wish there was more stuff like this. And it's like, well, there is, but the app pushes more of the other stuff. It's not going to show you that, because you don't watch that for as long as you do
00:38:36
Speaker
other shit, right? Because it makes you feel something, and so maybe you don't even finish it, because, aw, this one always makes me cry. You know? So, yeah, it is very shocking to me, especially as a people person, as somebody who craves that connection.
00:38:54
Speaker
And I was just explaining that to... who was I just talking to? Oh, I was out with a new friend I made, and she was just like, so, extrovert, what the fuck's that like? What's that about? You know, and a very wonderful personality, this person, but they are just so vexed by... not, I shouldn't say vexed, I keep using that word wrong. They're just so spellstruck by this idea of, so you like... people? But I had to explain, because we were in a crowded restaurant bar at the time, and I was like, this isn't my ideal setting. It's not that there's a lot of people around.
Social Media, AI, and Post-Pandemic Society
00:39:31
Speaker
It's that I'm energized when I'm connecting with other people, when I'm hearing about how they came to be where they are, or who they are, or what they did this morning that was exciting, or a challenge that they overcame, whatever that may be. Just hearing about something that they love.
00:39:47
Speaker
That, like, fucking turns so many lights on in my brain. I was just at a housewarming party last night with my roommate and their friend and probably like 15 other people that we hadn't met before.
00:40:02
Speaker
And I came home really late. We drove separately because I knew I was going to want to stay late. I always do. And I was the last one to leave. Oh, that's why we didn't record yesterday.
00:40:13
Speaker
Yes. Wow. Wow.
00:40:18
Speaker
I thought I'd be back sooner. But yeah, we couldn't record yesterday because you were having too much fun with your other friends. I was just excited to learn. I mean, there were definitely moments where I wasn't happy to be around for certain parts, because, you know, they were small rooms, and a couple of people had pre-gamed before the party, and of course they don't recognize how loud they are. But I got to engage with a lot of really interesting people. And again, new people. I'm like, oh my God, new people who want to talk to me. Holy shit.
00:40:44
Speaker
And that's exactly it. It's not that there's a lot of people around. It's just that there's opportunity for connection. And that's what I think more people probably crave, but don't want to admit.
00:40:55
Speaker
Or they admit it, but they don't want to make the effort to do it. The only difference between me and everybody else is that it energizes me in a bigger way. As opposed to, it would still energize you as an introverted person, it's just that it might tire you out after doing it for an extended period of time. You're like, I made a new friend. I'm excited that they like this thing. Oh, what's that? That guy likes Super Mario Frustration. I should talk to that guy.
00:41:18
Speaker
Right. Oh, God, who would meet over such a stupid thing? We crave similarities. We crave knowing other people think how we think. But the difference is, I also crave knowing how other people think, even if it's not how I think. And I think that's a missing element. And I believe this, and I will go to my grave thinking this:
00:41:38
Speaker
I wish the world were, like, 5% more like me. I genuinely believe this from the bottom of my heart. I wish the world were 5% more like me in how I crave connection and how I engage, because I think we would have a much happier society. Five percent of you is still a lot. Maybe three.
00:41:59
Speaker
Yeah, I'm fine with three. Maybe three. Yeah, five is a lot. See, five percent is not a lot in itself, but five percent of you? It's a lot. Well, maybe not five percent of all of me, but five percent of that...
00:42:13
Speaker
That willpower to approach, or to ask questions. And that kind of leads into a thought I had before, because we were talking about how
00:42:26
Speaker
a lot of these algorithms will push rage content, because you interact more with things that annoy you, right? And it occurred to me the other day that this is just weaponizing what little bit of humanity is left in the world. Because the reason that you want to talk about something so much more when it's bad, or leave a review when it's bad, I think, is that deep down you want to warn other people.
00:43:00
Speaker
Hey, I had a bad experience, don't go to this place. Because it makes you feel good to help. It makes you feel good to be like, this berry was bad, don't eat the poison berry, eat the good berry, way back in the day. And nowadays it's, this service scammed me, don't use them.
00:43:19
Speaker
And sometimes it's vindictive, but I feel like it honestly comes mostly from a place of wanting to help. But it's been so twisted and monetized and just
00:43:36
Speaker
Disgustingly ripped open and everything, it's just unrecognizable at this point. And man, when I realized that, the wave of dread was just like, man...
00:43:48
Speaker
Like, the one speck of humanity left in people is getting behind something you hate, and they learned how to monetize that as well. I would like to make a slight edit to something you keep saying, though.
00:44:01
Speaker
Respectfully, I think. I don't like the idea of saying what's left. You know, I think there's more to that thought, which is,
00:44:14
Speaker
what people are willing to show, or are quicker to show. Because there's a lot of humanity left in people, but they seek to feel it more than show it, I think.
00:44:30
Speaker
And what they do show, it's usually easier when it's negative or when it's inflamed in some way. I could get behind that. You know what I'm saying? Like, I think there's plenty of people where, if there was a group of us just on the street and we saw somebody pinned under a car,
00:44:49
Speaker
that we would all try to go and move that car. Right. But a lot of times, they need someone else to lead by example, to show that it's okay. Or,
00:45:01
Speaker
you know, like, a lot of people will step over a person sleeping on the street or unconscious on the street, but the moment someone starts to help that person up, others will feel like, oh, it's okay to engage. And some of that comes from that fear of interaction, or of overstepping, or someone else will take care of it. Bystander syndrome, what have you.
00:45:20
Speaker
But I mean, look, we've got record turnouts at No Kings protests. A lot of that comes from a place of anger, like, I'm tired of where we are. But a lot of it also comes from, hey, we want to do some good. Like you were saying, we want to help. We want people to know that we're tired of it, and the more they see this, the more we're using our voice. This comes from a place of empowerment. Yeah, we're fed up, but we're also trying to send a message, because we want our world to be better. We're tired of living in a place, or in a society, that punishes you
00:45:50
Speaker
For caring. Right. And so you actively look down on charity. Oh, I don't want to, you know, I don't like socialism. That's giving free money to people. So the fuck what?
00:46:01
Speaker
Yeah. Heaven fucking forbid people can live a happy life. Yeah. Jesus Christ, that's exactly it. So I would just want to... Sorry. That got political.
00:46:12
Speaker
No, it's not. I mean, it's political, but that's what the protests are for. I think it's a movement to say, hey, there are a lot of us who are going to come out here in freezing temperatures or otherwise to say we want a better life. This is what we're doing. This is how...
00:46:28
Speaker
You know, we know that the people in charge don't like that we are doing this, and they don't want us to be heard, and that's why we are here. And it is to let other people know that you can have this. You can be a part of this. There's a safe space here. And so, all of that to say, to protect your heart too, Danny, I don't want you to fall into a space where you look at it as, this is what's left.
00:46:55
Speaker
Because no, I didn't really mean it that way. I think that was more a lack of better phrasing on my part. I know there is more humanity left than that.
00:47:07
Speaker
I can get rather cynical, but I do know deep down that there is more than that. Sure. But I do agree, it was kind of the wrong way to put it. But I guess, yeah, like you said, it's one of the last remnants of ways the normal person will show
00:47:23
Speaker
some kind of humanity, in that they want to help others. Because there's a lot of people in this world that have the fuck-you-got-mine mentality at this point.
00:47:35
Speaker
Or it can get pretty dark. I think what kind of happens, and bringing it not exactly back to AI, but maybe technology in general, is,
00:47:47
Speaker
I think something broke during the pandemic, more than I think we realize. And I think loneliness really prevailed. And I think there was a short period where it was, I miss my family, I miss my friends, I want to have Thanksgiving, all those things.
00:48:06
Speaker
My flow has been disrupted, which is where a lot of that anger came from. But I think it then turned into, my privileges are being taken. What do you mean I can't go to my favorite bar?
00:48:19
Speaker
And a lot of people who didn't even go out were angry. They still want the option. They just don't want to do it. And so, I thought for sure when things opened back up there would be a resurgence of humanity wanting to connect, in a way that maybe people didn't realize they'd been taking for granted. But I think a lot of people, because it lasted for quite some time, adapted. I mean, I adapted a little bit to an online sphere that I never cared for. I just was like, that's not who I am.
00:48:50
Speaker
And I found value in it, but it's not all that I crave. It doesn't fulfill me the way it does for other people. Right. Well, you know why I think that is? Why is that? I think what happened with the pandemic was, people were forced to stay isolated, right? And we went through all the stages of grief over that. People got angry about it. They got sad about it. They started bargaining over it. Right.
00:49:19
Speaker
And then we hit acceptance where everybody was like, hey, wait a minute. I have time to make bread, start a podcast, play D&D virtually. I have like time to do stuff.
00:49:33
Speaker
And then stuff started opening back up, which I think is fine. And I think people would have taken to it much easier the way you wanted it to, if it was left at that.
00:49:46
Speaker
But we started getting pushed back out into the world. The biggest offender of this for me was return to office, right? I had to return to the office in my last job before I felt comfortable, and they were just like, oh, just wear a mask, you'll be fine. I'm like, fuck that.
00:50:05
Speaker
It's not up to you to tell me when I feel comfortable with something. And that really ruined the whole idea of going back out, right? Like, I think if people had listened when this first happened and we spent two weeks locked up,
00:50:20
Speaker
the disease came and went and died out, because nobody else got it, since everybody isolated for two weeks, it would have been a lot more like you're saying, where everything opens back up, and we're like, okay, we all had, like, a two-week vacation. We feel better. We
Cognitive Impacts of AI Reliance
00:50:36
Speaker
realize that we actually want to see our friends and everything.
00:50:39
Speaker
But this became a long, drawn-out piece of bullshit where people weren't listening, so the disease kept coming back around. And now you're being pushed to be around other people while... You know, it was only just starting to become flu-like and not deadly when they started telling people to return to work.
00:50:57
Speaker
So it's like, well, great. So you're just putting my life on the line for your profits. Great. Thanks, guys. Yeah, and it soured the whole thing. Like, why would I want to go out with that?
00:51:09
Speaker
Right. You know, my privileges, my privileges. Why should I care about other people? I think that really cemented, for a lot of people, the as-long-as-it-doesn't-affect-me mentality. And I think that really started to...
00:51:20
Speaker
Well, that's the American way. Yeah, brother, but it's really... The past 40 years have been... And it really got hammered into a lot of people when, again, not to get too into the political stream of it all, the government was just, like, either,
00:51:36
Speaker
masking is oppressive, and vaccines will kill you, and things that nobody would have ever thought before. Like, you know, I don't want my dentist breathing in my mouth without a mask on while they're working on my teeth, right? But it's like, masks don't do shit, except for when I think they might. But otherwise, it just turned into this, like, oh, I'm being oppressed because I have to put something on my face to protect others. And I think a lot of people just really held on to that resentment of, if it doesn't affect me, then I don't care. If it affects me, then I care.
00:52:06
Speaker
Right. And so I think with the AI aspect of it all, it's like, oh, well, it's affecting me. I have to have this conversation, I have to engage, when I have a tool that can do all that for me. Like, why would I do that when I can just let the computer do it? And all of a sudden we have a WALL-E situation where we sit around and do nothing and let the technology do it for us. I don't know exactly where the flow chart is, and I know we've been like that for a long time, just as a nation. But I think that was really a stumbling point for us that a lot of people just never... Which actually leads me to my next bit.
00:52:51
Speaker
And that is the effects of AI on the brain. I have an MIT study here. Oh yeah. Called Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task. And this study goes into what they call cognitive offloading, which is literally just saying, computer, I don't feel like thinking about it.
00:53:16
Speaker
You think about it. Yeah. And they found, already, with people using AI as much as they do, there's been reduced mental engagement, neglect of cognitive skills such as doing calculations, and
00:53:41
Speaker
mental health challenges such as reduced self-confidence. And like I was saying the other time, it's giving everybody CEO syndrome, because they have their own personal yes man to kiss their ass and tell them everything they do is right.
00:53:55
Speaker
And we're already seeing these effects. And it makes perfect sense. You know, think about when you were in school. The teacher always said, hey, you don't get a calculator for this math test. You're not going to have one in your pocket. We did, but that's beside the point.
00:54:13
Speaker
And you need to know how to actually do this stuff. You're not just here to memorize facts. You're here to learn how to learn. And that is what we're giving up. We're just saying, you do it, ChatGPT.
00:54:25
Speaker
And everybody's becoming actually dumber for it. Mm-hmm. Yeah, I agree. It's a spicy fucking episode, man.
00:54:35
Speaker
Here's the thing, though. We're not only becoming less intelligent, but less engaged with society as a whole. Like, less emotionally in touch. Because that's exactly it:
00:54:49
Speaker
When you have something that can tell you everything that you want to know, you believe whatever it says because it reinforces your own biases and thus nothing else makes sense.
00:55:01
Speaker
Right. It's why people don't even make it past... It used to be, oh, don't read anything past, like, the first three links on Google. And now people don't even make it past the fucking AI-generated shit.
00:55:12
Speaker
Which is right most of the time. In terms of yes-manning and pushing beliefs, think of it this way: AI is a goddamn echo chamber on steroids.
00:55:24
Speaker
Oh, yeah. You think there's a problem with the Internet and, like, pedophiles all gathering together and sharing pictures of little kids and stuff. Now you have a whole echo chamber right there telling you, oh, it's fine, this is okay. You know, oh, is that a picture of a little girl you just sent me? Let me show you what she'd look like without her pants. Like, what the
00:55:46
Speaker
fuck is going on in this world. Yeah, it's pretty nuts. And so, maybe just to try to uplift after all of this: I do think it is concerning, but like we said earlier, there are ways to use these language models for a lot of good things. Like if you're at a doctor's office and they're recording everything you say as a way to test for cognitive issues, or to look for patterns, or like I said earlier with the AI that can literally fucking track cancer cells, which is an incredible thing. Or just generating ideas for your writing, or to get out of a writer's block, or to
00:56:38
Speaker
generate ideas for a conversation starter or an icebreaker or something. There's ways to do it. But the moment you take your own thinking out of it, your brain starts to get to a point where... You know, I've been using the word smooth-brain a lot lately, because your brain just doesn't create neural pathways.
00:56:59
Speaker
Yeah. Your brain creates fewer neural pathways when you aren't making it do stuff, whether it's learning a skill, responding to a challenge, improvising in general, problem solving. People forget that the brain is a muscle like everything else, and if you don't use it, you lose it. You are literally atrophying your brain.
00:57:23
Speaker
All of those little wrinkles in your brain are neural pathways that are being constructed, and your brain will literally have fewer of those, and it will get smoother in places, if in fact you do not use it in this world that we are in now.
00:57:40
Speaker
I thank God every day that I grew up playing, and still to this day play, more video games than I probably should, because it means I'm constantly solving puzzles. I'm constantly
00:57:54
Speaker
you know, working on hand-eye coordination. I love puzzles, and I think a lot of that comes from video games. And I'm not just here to blow smoke up video games' ass; they have their own problems. But it's so
00:58:13
Speaker
good, in a way. It's kind of refreshing, in a way, to see that in a world of mental decline, the hobby I've always been told is bad for me is the one thing saving my ass.
Life Before Pervasive Technology
00:58:27
Speaker
I do. Also, you know, it's hard for me to empathize... I shouldn't say it's hard for me to empathize. But I think, maybe in a similar way, I am grateful that I didn't grow up with a cell phone as a kid.
00:58:42
Speaker
I'm grateful that tablets weren't a thing yet. Like, I remember when the internet at my school was, like, the OPAC, because it was, oh, if you press this X, it shuts down the whole thing. And I'm like, oh, I can't touch that X. Oh no. You know, still learning how computers work. I think we were born at the perfect time for this technology, because when we were born, cell phones weren't really a thing.
00:59:05
Speaker
I didn't get my first cell phone until I was almost in high school, or just after. I got mine after. And it was literally just because I would go to Boy Scout meetings, and my parents were like, we don't want to sit around, so we gave you a cell phone so you can call us when it's over.
00:59:18
Speaker
But it was a Nokia brick. The best you got was Snake on there. So even though I now had a cell phone and technology, it didn't have TikTok. It didn't have Twitter, Facebook, YouTube, all this bullshit. It had a little line that you could move around. That's only fun for so long.
00:59:36
Speaker
So you didn't get dependent on it. And then, as I got older and more used to the technology and my brain became more developed without it, texting became a thing. But that's okay, because I was so used to not having texting that I wasn't glued to it. Then Facebook got on the phone, but I'd had MySpace and Facebook forever, and I didn't really care about that stuff because it wasn't new to me.
00:59:58
Speaker
Kids these days get so overloaded. They have the world in their hands, and they just get... It's like that, um,
01:00:08
Speaker
that trial that... Oh God, what's it called? That the Amish go through, where they get to spend, like, a day in the modern world. And some of them just get lost in it and never come back.
01:00:22
Speaker
Right. Rumspringa. And they'll just get lost in the overwhelming modernity and all of the comforts we have, and just never go back.
01:00:37
Speaker
And I think that it's kind of the same thing with kids these days. You get all of this stuff when you don't know how to control yourself with it because you've never seen it before and you just get lost in it.
01:00:49
Speaker
Yeah. Yeah. And that's not to say that everybody turns out that way, because some people are more plugged in than others. Or my sister, for instance. She has a lot of kids in her house, and after seven o'clock, no phones. They have a program through their Wi-Fi that turns off everything except, you know, texting the parents. They're like, once seven hits, that's it. And they have an app where they can adjust who gets how much screen time. Like, if somebody earned some extra screen time through chores or whatever, they can add an extra hour before that kill switch hits.
01:01:22
Speaker
And so she makes sure that they all read a book. She's like, you have to read a book by the end of the month, and let me know what the book is about. She's very strict about that. She's like, you need to be able to read. That's good.
01:01:34
Speaker
You have to force that love in there. Yeah. And I just learned last night that my friend, whose housewarming party I attended, got this app and this little magnet thing on her fridge called the Brick.
01:01:46
Speaker
And you pair the app with the Brick, and what it does is, you pick what apps you want it to not let you access, and then you tap your phone to it.
01:01:58
Speaker
And it stays downstairs. So she taps it on the Brick and then goes upstairs, and everything that she doesn't want to see, usually social media and emails and stuff, her phone will not show her or let her access without running all the way back downstairs and tapping it against this thing on the fridge to turn it off. And she does the same thing during the workday. She'll tap it to the Brick and then go back upstairs to her office. That way she stays more focused when she's working from home and isn't checking her phone all the time.
01:02:29
Speaker
And I find that to be such an interesting idea, that you just brick your phone for a certain amount of time so that you're not tempted to fuck around. I get in trouble all the time, I've said it before, for not answering a text until hours later. But it's because I put my phone on silent and just put it face down next to me.
01:02:47
Speaker
You know, especially if I start getting a bunch of notifications, I'm like, I'm overwhelmed. I'm not a social media kind of guy. So I don't care, for lack of a better term, right? Like, I care about the people, but I don't care about the apps. I don't care about the emails telling me to use the apps. I don't care what you had for dinner.
01:03:13
Speaker
So I'll get in trouble, because I don't want my phone to be attached to my hip. I'll just put it on silent, put it face down, and forget about it for hours.
01:03:24
Speaker
Yeah. I went on a walk the other day with Hitch and left my phone in the house, and it feels so weird to do the pocket check and be like, oh, I'm missing something. And then you go, no, no, no, I did that on purpose.
01:03:36
Speaker
And then you just walk around the neighborhood for like 25 minutes. It is something, huh? You feel liberated. Yeah, it feels weird and then familiar after a few minutes, where once you get used to it, you're like, OK, that's not here. And I've been trying to go running a little bit more to practice for this marathon, and so you just can't be on your phone.
01:03:56
Speaker
And so it's just nice to, like, disconnect for a minute. And I tried recently leaving my phone in my glove box after I get to where I need to go. 'Cause I take it with me because I use it for GPS, but I just leave it in my glove box and get out of the car and then go do whatever it is that I'm doing.
01:04:12
Speaker
It takes a little bit to get used to. I haven't done it a ton yet. It's a very recent experiment, but I find that I'm the most productive and active and engaged in everything when I'm like, oh, it's all the way back at the car, or it's in the parking garage, or it's back at the house.
01:04:29
Speaker
Like, well, I can't do anything about it. You know, like, I'm not going to take somebody else's phone. The only problem with that, like, that I could see happening: is that something everybody would have to do? Because, like, if I'm going out to dinner with Beluga.
01:04:47
Speaker
She's very much on her phone until the food gets here. And then that's when we eat and talk. But while we're waiting for the food, if I just left my phone in the glove box, I would just be sitting there twiddling my thumbs while she's on her phone.
01:04:58
Speaker
Oh, why can't you just talk to her? She's on her phone. She doesn't want to get off her goddamn phone. Okay. Well, look, get off your phone. Kimmy's more important than I am, I guess. Well, get off your goddamn phone. I was going to say, why do you wait until the food's there to talk? Then you have stuff in your mouth. You talk before the food gets there.
01:05:15
Speaker
But like, it's fine. It's not a problem. I'm just saying, like, no, if I were to leave my phone, I would need other people to do it too. Yeah, it's a two-way street.
01:05:29
Speaker
No, then you just say, what you doing? What you looking at over there? Can I see? Can I read it? Yeah. Can I read it? I don't have my phone. You guys want to talk, like, you know. I just imagine if I go on a first date with somebody like you.
01:05:42
Speaker
God, I'd be so livid if somebody was on their phone the whole time we're waiting for our food. I'm like, but we came here to talk to each other. So, like, you talk. It's OK if it's quiet during mealtime, because you're eating and things are going in the talking hole.
01:05:56
Speaker
Oh, I hate that. I hate the way that sounds. You know, the things are going. Yeah. I mean, well, your talking hole's the same thing as your eating hole. You're right. But I don't like the idea of things going in my talking hole. It doesn't sound good. But that's it exactly. Which is why you don't talk.
01:06:10
Speaker
That's right. But it just doesn't sound good because it's full of food. You talk before the food gets there. Like, you know what I'm saying? Like, why wait until the food has arrived to start talking? Because that's when we put the phones down so we can put food in our talkie holes. Right. So your hole has to do two different things.
01:06:28
Speaker
So, like, oh, I have a very multifunctional hole, don't you worry about it. I know it's multifunctional, but, like, you shouldn't have to... Why not capitalize on the moment where there's no food to be had and there's conversation to be had? Like, it doesn't make sense to me. That's the prime opportunity to have a conversation. I guess this is just an introvert-extrovert thing.
01:06:49
Speaker
No, no, no, no. It's a practical thing. It has nothing to do with introverting and extroverting. I'm saying you can reverse the order. Like you don't have to be talking the entire time. You just talk when the food isn't there and you're silent when the food is there. You can still have your quiet time.
01:07:07
Speaker
But when do I text Kimmy? You text Kimmy, like, when you're fucking... when you're eating, waiting for dessert. Oh, when I'm fucking, got it. You text when the check is on the way or something. Like, you're done.
01:07:19
Speaker
Like, you can't text and eat at the same time. You need both hands. True. You're cutting up a steak. You're going to get food all over your phone, and nobody wants that. Yeah. Like, I don't understand that at all. You talk before the food gets there, and maybe in a break, like, while you're eating, they're talking, and while they're talking, you're eating. You find a flow. But, like, why would you be on your phone the whole time while you're waiting for your... Like, it doesn't help either. And this is neither
01:07:47
Speaker
of our faults. Like, I don't blame Beluga for any of this. I do. But, well, it's also that, like, especially now she works from home and I'm at home all the time. Yeah, you told me. So, like, there's nothing for us to talk about. We literally do nothing but talk all day, every day. Uh-huh. Okay, so I get the phone thing. Because, like, you know, what was she going to tell me about her day? I was there for it. And am I going to tell her about my day? She was there for it. Take a book, then. Read a book while you're on your phone.
01:08:24
Speaker
We're just talking about it, for crying out loud. There's got to be something. Or at least, you know, if you actually end up talking D&D stuff. If you're at the Olive Garden or whatever, they have those little kiosk things that you can play games on together. What do you take me for, some kind of Neanderthal?
01:08:39
Speaker
Or, like, get a game that you can play together. Get Wordscapes. That's a great game to play together. Ever play Wordscapes? No, I have not. Well, I'm going to make a pitch because it's free. OK, it's a game where you get a little wheel of letters on the bottom of the screen. Sometimes there'll be three, sometimes there'll be five.
01:08:58
Speaker
And it's a crossword, right? You like puzzles. They show you a crossword puzzle, and they're all themed, and it's really beautiful. Each series of levels has a nice similar background. Like, maybe it's a lush forest, or maybe it's a brook, or maybe it's a mountain or something. And it plays really relaxing, very soothing music.
01:09:19
Speaker
And the crossword puzzles get a little bit more complex as you get through the game. And you use this wheel of five, four, three letters, and you try to spell out what each word is. And so sometimes you'll just... You know, if you have a, I don't know, like...
01:09:37
Speaker
I don't know, I can't think of one off the top of my head right now, but you just spin that wheel around and try to make words with just those letters. And it's just a nice way to challenge your brain. And I used to sit with a former flame of mine. We would just sit together on one phone. It's a one-player game, but you just go, oh, try beehive, or ooh, try shovel. You know, and there's usually a theme of sorts that comes with it.
01:09:59
Speaker
And you're just trying to fill out this crossword puzzle while nice, relaxing music is playing and you have nice visuals to look at. And it's free, and it's just a really great way to bond together and do something while on a phone that's still engaging and works your brain in a way that is developing and not rotting. Kind of sounds like Wordle.
01:10:21
Speaker
I mean, isn't Wordle the one where you get a few tries to get the word? Yeah. Yeah, whereas this is like you get different letters each level, and you try to figure out all the different crosswords that are on there. Sometimes it's like four or five words, sometimes it's 10 or 11 words, and they get a little bit more complicated.
01:10:41
Speaker
And, like, you know, you figure out other words to try based on where the letters fall on the crossword. What's it called? It's called Wordscapes. It's awesome because it's words and beautiful landscapes. Okay.
01:10:55
Speaker
I'm telling you, it is very relaxing, if you must do something in bed. Again, I don't do it much. Like the other day, a quick story: I went to the hospital with Hitch 'cause Hitch was having some asthma
01:11:10
Speaker
challenges, and I knew they were going to be in the emergency room for a long time. They started having a lot of challenges, so I drove up there and met them, and we were there for like six hours. And at some point they were starting to have a little bit of anxiety. They were waiting a while. They started off at maybe a two or three level of pain. It started to get worse, and they just couldn't get us in any sooner.
01:11:28
Speaker
So I redownloaded this game and I said, hey, you know, let's do this. And Hitch would pitch in, because Hitch loves these fucking little puzzles as well. And before we knew it, an hour had passed, and I was just tilting the phone in their direction, like, all right, well, what is this? What is this? And so it kind of helped keep their brain occupied, and the music was really relaxing, and the challenge gave them something else to think about instead of what was going on. And so while they were still in pain, some of the anxiety kind of came down because their brain was focused on something else. And so, you know, there's different applications for something like this. But, like, you know, my ex-girlfriend and I used to run out the clock at the dance studio. When there was like 15 minutes left and everything was clean, we would just sit in the back, and she would sit in front of me, and I would just look over her shoulder, and we would just play this little Wordscapes game. And it's really nice and relaxing and cool.
01:12:15
Speaker
And I only bust it out when I need it. Like, I don't do it all the time. But anyway, that's something to try. It works your brain. It still lets you have your phone in your hand to get that serotonin kick or dopamine kick.
01:12:27
Speaker
And it helps give you wrinkles in your brain instead of on your face, you know. That's my pitch: wrinkles on your brain. And so, you know what? If we could convince people that learning new things moves the wrinkles from your face to your brain, I bet you we'd have such a smart society.
01:12:47
Speaker
I mean, there's one way to look younger. Let me learn new things. Move the wrinkles from my face to my brain. Walk a few more steps. Be outside for five more minutes when you can, and use your brain in this capacity.
AI Video Generation: Criticism & Shutdown
01:12:59
Speaker
And those wrinkles from your face will go to your brain instead.
01:13:02
Speaker
Guarantee it. That's the Adam guarantee. That's not the Danny guarantee. That's the Adam guarantee. Oh, that's like the Danny guarantee we have at home. Gross. Yeah. All right. What do you say? We wrap this puppy up, huh? Yes. I got one quick thing, and then we can wrap it up. The thing I wanted to end on.
01:13:22
Speaker
Say goodbye to Sora, the AI video generator. Oh, I've heard about it, but I don't even remember what it is. That's the one that makes videos. It makes crappy AI slop videos.
01:13:37
Speaker
Oh, okay. Well, good. They're closing it because nobody wants it. And, you know, the only people who want it are the ones trying to make crap with it. And everybody else is like, this is stupid, don't do this. And they're closing it down. So, yeah, I'm glad. This is a sign that the AI bubble is not all-consuming.
01:13:56
Speaker
There are chinks in the armor and stuff. I think a big reason for that is because there has been an environmental awareness push because of how many resources they use.
01:14:11
Speaker
Especially with all these data centers and shit. Yeah. And people are seeing that devastation in their communities very quickly when they're developed and when they start running. They use a lot of power, a lot of resources. They pollute a lot of water and plant life.
01:14:26
Speaker
And it creates a lot of smog and uncomfortable situations for people who live near them. And I think people are like, oh my God, the more I use this AI, the more it fucking trashes my own neighborhood. And so I think the environmental effects have quickly caught up to these AI data centers. There's that. And, at least to remain intellectually consistent with myself, this is also the one that I feel is the worst. When I say AI could be used as a springboard, just don't have it do things for you, I cannot imagine a world
01:15:01
Speaker
in which having it make an entire video is just the springboard. Right. I could see, you know, the art AIs being used as a springboard just to give you an idea of what to draw, but then you draw it yourself. I could see ChatGPT giving you an idea for a story, but then you write the story yourself.
01:15:21
Speaker
I cannot see a world in which an AI video is anything other than pure laziness and AI slop.
Podcast Dynamics & Content Creation
01:15:32
Speaker
I cannot see a world where this is just a springboard for you to do something else. It's already done. It has done the whole process for you. And it's used for very nefarious means, to try to fool people into thinking something actually happened, or that words were spoken, you know, or that events had happened. And so, you know, good riddance, I say. Good riddance. Good riddance to bad rubbish. I am fine with this one going the way of the dodo.
01:16:01
Speaker
Yeah. Anyway, why don't we... It's been over an hour. It sure has. I'm a sweaty mess because this was a hot topic, and I got steamy about it, which, you know what? I think it's good, because I've been listening back on some of our episodes, and I might be a little too chill for my own good. You know, sometimes I sound bored almost.
01:16:24
Speaker
With this one, absolutely not. I am here and I'm fired the fuck up. Can I tell you, Danny, it's funny that you said that, because I too have... You know, I have to go back and listen to some of these when I'm editing them.
01:16:35
Speaker
And there have been a few times where I was like, boy, I feel like I am always just so overly energized on shit. But, well, I should say, too, I appreciate it, because we have a balance for sure, which we need.
01:16:52
Speaker
But there's been times I'm like, man, I'm really fired up all the time, and Danny's so nonchalant. And I started to think, like, do I need to bring it down more to his level? No, I think I need to bring it up sometimes. I'm listening to these and I'm like... So we recently started moving everything to YouTube.
01:17:10
Speaker
And part of that is making YouTube Shorts. So I've been listening to a lot of our more recent episodes in order to make Shorts. And I'm like, boy, I'm sure finding a lot more material of Adam to make Shorts out of than me. Because a lot of me is just like, yeah, man, I agree. Like, that's crazy. You know, how do you feel about this? And, like, nothing insane is happening. But not this time. This time I am giving myself content.
01:17:40
Speaker
Yeah, well, to be fair, I talk way more than you talk, and that's not your fault. Yeah, no, and that's fine. Like you said, it's a good balance. Like, you do need, you know, the high guy and the low guy. That's fine. I don't mind that. I'd say you're a medium guy. Oh, thank you.
01:17:58
Speaker
Yeah, but it is good to buck the trend every once in a while. Yeah. You know what? This one was just the all-high-intensity one. Yeah, I definitely think there are
01:18:11
Speaker
certain levels where we just have to... I don't know. Like, I don't believe in being neutral on much, but I do believe in at least trying to understand what other people are trying to do, even if I disagree with it.
01:18:28
Speaker
And other times it's very apparent, and I don't need to dive further into it. But I just think part of that might be our backgrounds too, man. Well, no, I'm the same way. Yeah.
01:18:40
Speaker
Because I feel like if you have a hard stance on something, but you don't even know what the other side is about, it's disingenuous, in my opinion.
01:18:52
Speaker
That's just following the trend and hating on something because everybody else does, or picking sides. And it's disingenuous to me. When I hate something, it's because I looked at both sides and said, I agree with this side, I do not agree with this side.
01:19:09
Speaker
And now, maybe I'm not an expert, but I at least understand what it's about on a surface level. Right. Yeah. And I don't think you have to be an expert to have an opinion, but I think a lot of people who have strong opinions appear like they're experts, and they're not. So, like, get louder and people will think you're smart.
01:19:27
Speaker
Yeah, which is why I'd rather listen to Jon Stewart than Joe Rogan. But, like, yeah, you know, just because you're loud doesn't mean you know what the fuck you're talking about. But no, I don't think...
01:19:40
Speaker
I don't think there's anything wrong with the balance that we have, but I always appreciate when there's something that you're like, oh, fuck. I'm like, here we go. Now we're getting this. And the other episode where I just made fun of my mom the whole time.
01:19:54
Speaker
Oh, this is some sweet content. Y'all better subscribe, because there's some bangers coming out. Yeah. So, YouTube dot com slash... 2Q? Or is it? That's our Q. That's our cue. That's our queue.
01:20:07
Speaker
um Go subscribe to that if you already
Listener Engagement & Sign-Off
01:20:10
Speaker
haven't. We are working on doing actual video content soon. I just need to find some other resources and people to help with other facets that would free up more time for me to do some video edits. Hey, do you want to edit for us for free?
01:20:24
Speaker
Yeah. Does anybody want to be an intern? We probably want to pay... We want to start a not-for-profit where we help creative people meet other creative people to make projects and fill out their portfolios. But do you want to work for us for free, for exposure?
01:20:41
Speaker
Well, it's not even for exposure. You're right. We're not going to tell anybody about it. I'm looking to trade resources. You know, like, if you think that you like to edit and you can do it quickly,
01:20:52
Speaker
I can offer other things that can help, you know, like your body... like my body, and I can teach you how to dance. Oh, yes, that's true. And Danny is an excellent voice actor. I can also use my body and teach you how to yell in funny voices. Yeah. And we have connections to audiobook writers and narrators, and, you know, connections to authors who are looking to get maybe more voices into a book, or things like that. We even have a guy who tried to make a game once. That's true. So there are other things that we can do, but I'm also looking to increase some revenue so we can actually pay people, which goes to our store at funandsellersnetwork.biz. You can go there and check out the cool shirts we have,
01:21:43
Speaker
like our days-of-the-week shirts and our bucket shirt, which I'm very proud of. And all those other good things. We'd love to see those things out in the world. Please let us know if you do that. And of course, if you have questions, you can go to our YouTube channel and ask questions on the particular episodes. If you want to go back to those, just leave it in the comments.
01:22:04
Speaker
Easiest way to ask a question now: just put it in the comments. Or if you still want to send us emails, we'd love to see them at funandsellersnetwork.biz. Please do that. And thank you, Terry, for sending us a question recently. That was very lovely. Thank you for doing that. Please, like, I know that you did it all on your own without any coercion.
01:22:22
Speaker
Yeah, none at all. Yeah. Well, you definitely were not blackmailed. We agreed on that. Yeah, it's not blackmail. It's something else. Yeah, it was extortion. It was extortion. Yeah. Please, it's not blackmail, it's extortion. Come on. At least call it the right thing.
01:22:38
Speaker
That's right. And so, hey, you know what? And this doesn't come from AI: tell people that you love them, just like we love you guys for listening. We appreciate it from the bottom of our hearts. You know, it's a dream come true for Danny to be able to do this.
01:22:49
Speaker
It really is. And we genuinely enjoy entertaining and talking to each other. So thank you to those who continue to listen, to give us more incentive to make sure we get together at least once a week for an hour and talk about things in life and stuff. Yeah. Take care of yourself.
01:23:07
Speaker
Take care of each other. But most importantly, take care of us. Take care of that thing growing in the bathroom. You really should bleach it. Okay, have a good one. Yeah, take care of that. Bye.