
Toxic Parents, Digital Addiction & The Truth About AI

E280 · Unsolicited Perspectives
13 Plays · 13 hours ago

Are you glued to your phone, overwhelmed by social media burnout, stressed about AI taking jobs, or stuck in toxic family power struggles? This episode dives into digital detox culture, AI basics, and real-life parent–child boundary issues, all in one.

Bruce and his sister J. Aundrea (Master’s in Data Science & Analytics) tackle a serial squatter scam, the rise of digital detox culture, and a viral Reddit post where controlling parents use an emotional support dog as leverage.

Jay breaks down AI in plain language—no Terminator, just real tools like spam filters, Netflix recommendations, and agentic AI that can plan multi-step tasks. They cover machine learning, data analytics, and which jobs AI may replace versus where humans are still essential.

Along the way:

•  Why people mute notifications, delete apps, and book digital-free vacations

•  How to set boundaries with tech, work, and toxic parents

•  Practical AI uses for emails, homework, planning, and parenting

If you’ve felt burned out by constant connectivity, anxious about AI, or stuck in family drama, this episode will make you laugh, think, and rethink how you use your time, tech, and energy.

🔔 Hit that subscribe and notification button for weekly content that bridges the past to the future with passion and perspective. Thumbs up if we’re hitting the right notes! Let’s get the conversation rolling—drop a comment and let’s chat about today’s topics.

🚨 Get access to the Uncensored conversations — raw, unfiltered, and unapologetically bold.

💥 Tap in for exclusive episodes, spicy extras, and behind-the-scenes chaos you won’t find anywhere else:

🔓 Unlock it on YouTube Memberships: https://www.youtube.com/channel/UCL4HuzYPchKvoajwR9MLxSQ/join

💸 Back us on Patreon: patreon.com/unsolicitedperspectives

This isn’t just content. It’s a movement. Don’t just watch — be part of it.

Thank you for tuning into Unsolicited Perspectives with Bruce Anthony. Let's continue the conversation in the comments and remember, stay engaged, stay informed, and always keep an open mind. See you in the next episode! #podcast #mentalhealth #relationships #currentevents #popculture #fyp #trending #SocialCommentary 

Chapters:

00:00 Digital Detox, AI & Who Really Owns Your Pet? 🤖🐕💥

00:20 Welcome to Unsolicited Perspectives 🎙️🔥

00:49 Sibling Happy Hour: Sips, Laughs & Sibling Shenanigans 🍹😂

03:11 Investigative Journalism Exposes Serial Squatter 📰🔍⚖️

07:10 Ban Her From Everything—Even Blockbuster 🚫🎬😤

09:15 COVID Excuses Don't Cover 2018 Evictions 🦠📅❌

10:24 The Digital Detox Movement Is Here 📱🧘‍♀️✨

13:33 Recalibration Over Revolution: Tech With Purpose 🔄💡🎯

15:15 Digital-Free Vacations Are Trending Now 🏖️📵🌴

19:34 Interviewing My Sister: The AI Expert 🎓🤖💬

24:03 AI Learns From Billions of Pages of Text 📚🧠⚡

29:43 Data Is Fuel, Machine Learning Is the Engine ⛽🚗💾

32:28 Will AI Replace Your Job? The Truth Revealed 💼🤔🔮

36:30 How to Start Using AI Tools Today 🚀💻🎯

40:57 The Future of AI: Agentic Systems Explained 🤯🔮🛠️

47:22 Parents Threatening to Steal Her Emotional Support Dog 🐕💔😢

51:14 When Kids Go No Contact & Parents Claim No Idea Why 🚪👨‍👩‍👧

56:35 Dogs Are Expensive—This Girl Pays for Everything 💰🐶📊

58:26 Final Thoughts: Don't Fear AI, Embrace It 🌟🤖✅

Follow the Audio Podcast:

Apple Podcast: https://podcasts.apple.com/us/podcast/unsolicited-perspectives/id1653664166?mt=2&ls=1

Spotify: https://open.spotify.com/show/32BCYx7YltZYsW9gTe9dtd

Beat Provided By https://freebeats.io · Produced By White Hot



Transcript

Introduction & Episode Preview

00:00:00
Speaker
Digital detox, the world of AI, and who really owns the pet. We gonna get into it. Let's get it.
00:00:19
Speaker
Welcome. First of all, welcome. This is Unsolicited Perspectives. I'm your host, Bruce Anthony, here to lead the conversation with important events and topics that are shaping today's society. Join the conversation and follow us wherever you get your audio podcasts.
00:00:33
Speaker
Subscribe to our YouTube channel for our video podcast, YouTube exclusive content, and our YouTube membership. Rate, review, like, comment, share. Share with your friends, share with your family, hell, even share with your enemies.

Episode Schedule & Holiday Break

00:00:47
Speaker
On today's episode, it's the Sibling Happy Hour. I'm here with my sis, J. Aundrea. We're going to be dilly-dallying a little bit. Then we're going to be talking about digital detox and AI. And then we're going to be talking about a Reddit post about a family fighting over a dog.
00:01:01
Speaker
But that's enough of the intro. Let's get to the show.
00:01:11
Speaker
What up, sis? What up, brother? I can't call it. I can't call it. I just want to let everybody know that this is an episode that will be airing on the 12th.
00:01:23
Speaker
We will be airing episodes on the 16th and the 19th. And then we're going to take the rest of the year off for the holidays.
00:01:37
Speaker
We're going to enjoy ourselves and then come back January 6th for an episode. So we just want to let everybody know that we're going to be taking a little break. Don't worry, you're going to get this episode, the normal episode for Tuesday, and then another sibling happy hour.
00:01:55
Speaker
Then we're going to take a little break. It's a break for us, but we are fiercely recording content. So if you go on our YouTube page, even though we'll be on vacation, there'll still be original content on the YouTube page. So if you miss us a little bit, you can check us out there.

Squatter Scams Discussion

00:02:15
Speaker
But that's what's going on because we need a break. And we're going to get into why Jay needs a break in the second segment. But Jay, there's something that we needed to talk about.
00:02:26
Speaker
Oh Lord. That we talked about last episode. Okay. And there was a squatter situation. And there's more to the story. The more you send me on this, the angrier I get about this situation.
00:02:42
Speaker
If anybody remembers last episode, I have a personal grievance with squatters. Not that I've ever had a squatter situation myself. It's just the violation.
00:02:58
Speaker
It's such a violation of people's personal space. Like... Oh, let me get into how this lady is truly the worst. All right. So at first I had just read the story, but DC News did some investigative journalism.
00:03:17
Speaker
All right. This is what journalism is. yeah Yeah. You investigate and you tell both sides of the story. Yes. So they actually got an interview with the squatter.
00:03:27
Speaker
And I had seen a clip of the interview and the squatter had been saying, Hey, look, I had made some payments through Zelle, some through Apple


00:03:39
Speaker
Pay, some were late, but she kept sending me an invoice.
00:03:44
Speaker
I'm a tenant. And I was like, oh, okay, no. I may have jumped too quickly on a landlord's side and jumped down the squatter's throat when clearly the squatter is saying,
00:03:58
Speaker
hey, I've been paying, they've been giving me an invoice, and I'm the one who filed paperwork with the courts, not them. Till investigative journalism did some more investigative journalism and found out that this is not the first time that she has done this. This is the third or fourth time that she has done something like this. She owes one company $50,000 in back rent. Not only that, she bought a car, paid a down payment, and didn't make any more payments. This person is a literal leech.
00:04:36
Speaker
Yeah, a serial criminal. Yes, a literal leech. A career criminal. I am positive that everything this person has, has been obtained through some sort of scam or deceit.
00:04:50
Speaker
Like, please, if you see an application from this woman, immediately throw it in the garbage. Well, she went by a different name
00:05:04
Speaker
back in the day, of course. And plus she booked it on Airbnb. Why did she book it on Airbnb? Because she's got so many evictions on her record. She not only has evictions on her record, she couldn't possibly have good credit.
00:05:19
Speaker
No. Couldn't possibly have good credit. So she can't get an apartment even if she wanted to. She had to go the Airbnb route to get in. And just to repeat it: she's repeatedly, repeatedly done this and takes no responsibility. I saw an interview. No responsibility. None.
00:05:39
Speaker
And I was like, oh, she's Trumpian. Yeah, that's just the lack of accountability. And when confronted with these things by the reporter, oh, no, that's not accurate. That's not. No, no, no.
00:05:55
Speaker
What you don't understand is these are public record. These court records, they're public record. Anyone can get these records. Anyone can simply search your name and find these documents and find these records.
00:06:10
Speaker
And they're not incorrect. They're not inaccurate. They're court filings. There are liens and judgments against you. So, yeah, it's not just the evictions. There are also several liens against her. Like, this is a person who is just a career scammer and knows exactly what she's doing, knows the law, and is purposefully preying on people. So Airbnb needs to...
00:06:41
Speaker
completely drop her, like ban her from the site. Any site. I don't care what it is. Ban her from everything. Just ban her from everything.
00:06:53
Speaker
Block her old Blockbuster membership. Just ban her from everything. I don't care. Pinterest. Etsy. I don't care what it is. Take away her Boston Market frequent flyer, frequent user cards. Take away everything. Everything got to go.
00:07:08
Speaker
Everything. She needs to lose everything. There needs to be no place that this person can hide. You know what the kicker for me was during this video? I don't know how much of it you watched, but the kicker for me is that the news crew, after she wouldn't answer any more questions because the reporter was calling her out on her lies, saw through the crack of the door that she was running an oil and candle business. Oil? Yeah. Essential oils. Essential oils. Yeah.
00:07:42
Speaker
Yeah. Please. Mm-hmm. Please. Get an actual job. I mean, she's an entrepreneur. No. An entrepreneur. No. No.
00:07:54
Speaker
No, you're not. You're a scammer and you're a criminal. And you need to be called out, Shadija Romero. You're a scammer and you're a criminal.
00:08:04
Speaker
First of all, I knew that she wasn't right because I ain't never heard Shadija at all. Oh, the thing that I hate, though, because we got family members. She's a Bowie State graduate, though.
00:08:17
Speaker
And running a nonprofit. Yeah. Trying to save the world and scam it at the same damn time. You sent me the website for that nonprofit. I wanted to spam the contact us so badly.
00:08:31
Speaker
Like, you don't have it. The namesake for this nonprofit, who is one of the founders, I hope that they can see now what kind of person they've gotten into business with and that this is...
00:08:47
Speaker
I mean, it's just, it's, and it was her, it was her like narcissism and like. Lack of accountability. Lack of accountability. Yeah. But it was more of like this entitlement.
00:09:02
Speaker
No, I'm a tenant. No, you're not. And you know what you're doing because you've done it several times before. Mm-hmm. You know you don't pay your bills.
00:09:14
Speaker
Don't pay the bills. You know you have no intention of paying your bills. She was like, well, one time, the eviction, it was during COVID.

COVID Excuses Don't Cover 2018 Evictions

00:09:23
Speaker
Right. And money wasn't coming in. Everybody was doing that. And also, I know for a fact in this area, because I live here, that a lot of buildings, rental buildings, were working with people that were out of work or not getting money through COVID.
00:09:44
Speaker
Yeah. Explain, explain doing it back in 2018. You can't, you can't, because you knew what you was doing. But ladies and gentlemen, the reason why I kept sending my sister this is because I knew she was going to get agitated. It was literally... Well, not literally. I need to stop saying literally. Figuratively, it was me going back to when we were kids, putting my finger in her face saying, I'm not touching you. I'm not touching you. Because I knew she was going to get agitated. But I specifically did it to get her mind off of different stuff that she had going on because she needed a break. I was being a jerk for a reason. Jay,
00:10:22
Speaker
you are in the process of doing this right now.

The Digital Detox Movement Is Here

00:10:28
Speaker
But there's this thing called a new wave digital detox and it's gaining traction. Now, this is a little bit different than people going cold turkey. We're not going to do social media anymore.
00:10:43
Speaker
This is a digital detox or intentional disconnection. So I got this from the Wall Street Journal, an article. Basically, the core argument of the article is that there's a growing share of people saying that they overuse their phones and are experimenting with limits, but they're not abandoning their devices. They're trying to use them more deliberately. Right. So the Wall Street Journal described these new digital detox behaviors: people muting their notifications, setting app limits, and taking short social media breaks or even going on no-phone vacations. This is all driven by anxiety and distraction and burnout from constant connectivity. Now, I know for me personally, I have disconnected the notifications to the Instagram.
00:11:34
Speaker
Because I don't know what's going on, but it's a good thing. But we are getting more views and likes on our Reels, and I'm getting the notifications for those.
00:11:44
Speaker
And I don't like my phone buzzing all day long. So some people have been a little pissed off because I haven't been answering DMs like I have been in the past. But I had to disconnect the notifications for not only that, for other things. I know that you just told me before we hopped up on here, you ain't been on social media in 30 days.
00:12:02
Speaker
Yeah, yeah. And it was intentional. You know, I just finished my degree, but I was in the thick of, like, finals, and it was a lot of, you know, a lot of things going on and it was too much of a distraction. So the app is still there. I'm still getting the notifications. I see it.
00:12:21
Speaker
I see the DM. I'm intentionally not logging in and checking because it's too much of a distraction. And I just needed to disconnect in order to focus on things that are more important right now.
00:12:38
Speaker
See, you do it differently, because I can't see the notifications. I have to answer them. It's something. I don't know if that's OCD or whatever, but that's something. If I see a notification... I can't have notifications on my phone, so I just turn them off.
00:12:54
Speaker
Gotcha. Yeah. No, I still see it. I see that I have DMs. I just am not checking them. So, like, yeah, I see that I have text messages.
00:13:05
Speaker
I see them. But yeah, I can't. I'm not getting to them. I'm getting to them, but just not as quickly. Right. Like I'm not immediately available in a way that I have been in the past because I have.
00:13:24
Speaker
different priorities right now. And so I just need to focus on that.

Recalibration Over Revolution: Tech With Purpose

00:13:28
Speaker
Well, it feels like this article, the Wall Street Journal article, is really touching on a lot of both your and my actions recently, because it emphasizes that this is less an anti-tech revolt and more of a recalibration. Users want technology to feel more purposeful and less compulsive rather than disappear from their lives. So people still want to get on social media and still do these things, right, but just not at the rate they were doing them, and not to the point that it was controlling their lives. I know for a fact I was sitting there watching the Diddy documentary, and when it got to parts where, like, okay, this is a rehash, I know what's going to go on here, I know this whole story, I was on my phone.
00:14:13
Speaker
It was in the background. And when something new came on, I would change my attention. But I was on my phone and I was like, you know, let me be present and be in the moment. And so I've been intentionally, like when I'm hanging out with friends and things of that nature, putting my phone away. It goes in my pocket and does not come out again. Yeah.
00:14:33
Speaker
Yeah, it just shouldn't take... It shouldn't be so compulsive, right? Like, I have to check my phone. I have to check these messages. I have to check these DMs. No, they can wait.
00:14:46
Speaker
It can wait. You don't have to be immediately available, even if you are available. Just because I'm not doing anything doesn't mean I'm available. Doesn't mean I have the bandwidth to be available. So it's just being more purposeful about...
00:15:03
Speaker
when you engage online or digitally. Yeah. But this is a new thing, what I'm about to talk about now. So there are digital-free vacations and detox retreats.
00:15:16
Speaker
And what these are, because they've been spiking recently, are travel brands actively marketing trips, promising help with unplugging from work, email, and social media. And corporate trends reports cited alongside this coverage find that a measurable minority of travelers now intentionally avoid news, limit device time, or log off social platforms while on holiday compared to previous years. So I know this is something that you and I have talked about on this podcast a lot, how people don't take vacations. Even when they go away, they're still working. And this is a concentrated effort to
00:15:57
Speaker
I'm going to unplug. Yeah. And I love it. Yeah, absolutely. I mean, it's part of a capitalist society, a corporate culture, right? It's like we always have this feeling that we always have to be connected and we have to always be available. But what that does is just lead to more anxiety, more stress, more burnout. It affects your physical health, your mental health, your emotional health. You have to take time to just be by yourself or with your family or with your friends and just be yourself. Like, outside of who you are digitally, who you are professionally, just be yourself and just be present in the moment and just experience, I don't know, joy. Yeah.
00:16:50
Speaker
Well, yeah, so to summarize the article, what it's basically saying is that this is not total abstinence, like I said, but it's more of an experiment with boundaries, right? No-phone dinners, app-free weekends, or social media Sabbath days.
00:17:10
Speaker
All of this is to reduce stress and regain focus. And for tech and media companies, the Wall Street Journal article suggested that this is a strategic opening, right? Products and platforms that help people control attention, avoid endless scrolling, or build in, you know, things that eliminate mindless use may gain favor, especially if this detox mindset
00:17:41
Speaker
spreads. And I know that it's spreading because I know some people who haven't been on social media in more than a month. I know somebody close to me who hasn't been on social media since September. And they were just like, I'm taking a break.
00:17:56
Speaker
And I was like, you know what? I get it. I understand it. I know a lot of people. I know a lot of people who are just like, oh, I don't watch the news. Now, ladies and gentlemen, I get it.
00:18:08
Speaker
Yeah. The news is horrible. It's really terrible. But you've got to pay attention to what's going on. Because if you don't, that's how they sneak things in underneath and get you. I'm just letting you know.
00:18:20
Speaker
That's how they get you. That's how they get you. You've got to pay a little attention. But I understand, you know, you want to take a little detox every now and then. Yes. That's fair. That is fair. That's fair.
00:18:32
Speaker
But you specifically have been on a detox. Mm-hmm. For about a year and a half. And the reason why, if anybody has been following this podcast at all, is because my sister has been going to grad school.
00:18:51
Speaker
Yes. Well, she just finished. I'm done. She's about to graduate next week, and I'm going to be there. Mm-hmm. And, Jay, tell everybody what it is that you went to school for again. I have my master's in data science and analytics. That's right.
00:19:06
Speaker
Yes. We're going to be talking about AI. I'm going to be asking my sister questions. I'm going to be interviewing my sister. We're going to get into that next.
00:19:22
Speaker
Ladies and gentlemen, I'm so proud to be interviewing my sister, a graduate, master's degree.

Interviewing My Sister: The AI Expert

00:19:36
Speaker
Now, I am the least educated out of all the siblings because both of my siblings now have master's degrees. I'm cool with that. My sister got her master's degree and I'm going to be down there in Atlanta celebrating with her.
00:19:51
Speaker
But I don't know shit about AI. I mean, I do. I'm more techie than most. Yeah. But I don't. yeah And so I'm going to be asking my sister because this is part of what she did when she was in school.
00:20:06
Speaker
And because she's given me such great advice on using these AI tools to help better the show, I'm going to be asking her some questions about AI. So the first thing I want to do is say, you know, Jay, thank you for coming on the show. Thank you for answering these questions for us.
00:20:21
Speaker
Well, I'm happy to be here, happy to be here. All right. So let's start with the first one. We're going to keep this a base-level conversation, because my sister can be a complete science nerd.
00:20:33
Speaker
Yeah. When people hear AI, they always think Terminator 2. They think robots. They think taking over the world. In the real world, what is AI actually doing today?
00:20:48
Speaker
Yeah, so that's exactly right. Like, a lot of people still think of AI as robots from science fiction movies. They're these machines that become sentient somehow through our own misadventures and they take over the world and all of that. Okay, the reality is much, much less dramatic, y'all. And it's actually a lot more practical. So...
00:21:15
Speaker
Most of the AI that we interact with today lives inside software, right? It's not a robot. It doesn't have a body. It lives inside software. And all it does is help us search faster, write faster, decide faster, and automate small tasks that would normally take us a lot of time. So for example,
00:21:43
Speaker
Think about the last time you searched for something on Google, right? And got the exact answer you needed, like, immediately. Or the last time, say for example, your phone auto-corrected a word. Now sometimes, listen, Apple, I meant hell,
00:22:04
Speaker
not hail or heck. They'll be autocorrecting my cuss words. But just like the last time your phone autocorrected a word, or it'll suggest a reply to a text, or how spam gets filtered out of your email.
00:22:22
Speaker
These are all things that are powered by AI. So it's not thinking in a human way. So you don't have to worry about AI taking over the world.
00:22:34
Speaker
All it's doing is recognizing patterns and making predictions based on what it's seen before. So when you have tools like ChatGPT or Copilot or Gemini. These are tools.
00:22:51
Speaker
They're not artificial intelligence. Not really. What they are, are large language models. That just means that they are trained on an enormous amount of written text so that they can recognize how language works.
00:23:08
Speaker
And so that they can learn about a bunch of different topics. So it doesn't understand ideas or meanings the way people do. It just predicts the next word.
00:23:25
Speaker
The way your phone predicts your next text. It's just at a much more advanced level. That's all. Yeah. It's not thinking. It's predicting. So not the robots. All right. It's not a robot. Because I thought it was T-100 and T-1000. No. I thought SkyNet was right around the corner.
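The "it just predicts the next word, the way your phone predicts your next text" idea can be sketched with a toy bigram model in Python. This is purely an illustration of pattern-based prediction; the corpus and function names are invented for the example, and real large language models use neural networks trained on billions of pages, not frequency tables like this.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words follow it and how often."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(following, word):
    """Return the most frequently observed next word, like phone autocomplete."""
    candidates = following.get(word.lower())
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

# Tiny made-up "training data"; real models see billions of pages.
corpus = "the dog chased the ball and the dog ate and the dog slept"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # "dog": it follows "the" most often
```

The model isn't "thinking" about dogs; it has only counted that "dog" follows "the" three times and "ball" once. That is the predicting-not-thinking distinction above, at a vastly smaller scale.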
00:23:43
Speaker
It's really... No. It's really not. Okay. Yes. In your specific program, what did you learn that surprised you about how AI really works behind the scenes?
00:23:56
Speaker
I think... I think the thing that surprised me the most is just how much the learning is based on just exposure to massive amounts of information.
00:24:10
Speaker
Like it's so much. I mean, terabytes and petabytes of information. I don't even know what a petabyte is. Millions and millions of gigabytes, like huge amounts of information. You can think of it like 500 billion pages of written text, like huge amounts of information. So these systems don't know facts the way people do.
00:24:38
Speaker
Like I said, they're learning patterns from reading huge amounts of material: books, articles, websites, all of that,


00:24:47
Speaker
right? And so...
00:24:49
Speaker
And I mean huge. Like when I say huge, it's on a scale that it's really hard for the human brain to even picture how much information is being used to train these models to help you with daily tasks. So, like, a simple way to think about it is people learn through repetition, right?
00:25:13
Speaker
So if you hear... If you only ever hear one song once, you probably won't remember it. But if you hear it a dozen times, you start to recognize every beat. AI learns the same way. So instead of hearing something a dozen times, it's learning patterns from millions or billions of data points.
00:25:35
Speaker
So, like, it just surprised me that it's not... The AI is not just sitting on a giant list of stored answers. It's not pulling facts off a shelf like a librarian.
00:25:49
Speaker
It's generating responses in the moment based on patterns it learned during training. And that's why it can explain things in so many different styles and in so many different ways.
00:26:04
Speaker
And also why it can sometimes be wrong in a way that sounds super confident. Right. Confidently wrong. Yes. All right. So you've kind of given a few examples, like helping us write better, helping us do things better.
00:26:22
Speaker
But can you give more of, like, an everyday example of AI that most people use without even realizing it? Yeah, like the crazy thing is that AI is so deeply embedded in our everyday life that you are using it without even realizing it.
00:26:44
Speaker
So for example, if you have Face ID on your phone, that's an AI recognizing your face. If your email has a spam filter, like I said earlier, that's AI spotting patterns that look like junk mail.
00:26:58
Speaker
When Spotify or Netflix recommends a song or a show or a movie that ends up being exactly what you wanted, that's because AI is learning your habits.
00:27:11
Speaker
Like even things like Google Maps, Waze, Uber, like these things rerouting you around traffic, that's AI analyzing real-time data from thousands of other drivers. When you order an Uber and the pricing updates and the arrival time prediction tells you where your car's at, that's all AI.
00:27:38
Speaker
When you go onto a website and there's a chatbot to answer questions about whatever the product or the service is, that's AI. It's already become part of just how things work now.
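Of the everyday examples listed here, the spam filter is the simplest to sketch. The toy Python below scores a message by how often its words appeared in previously labeled spam versus legitimate mail. The example messages are invented, and real filters use proper statistical models (naive Bayes and beyond) with smoothing and enormous datasets; this only shows the pattern-spotting idea.

```python
# Hypothetical labeled examples; a real filter learns from millions of messages.
spam_examples = ["win free money now", "free prize claim now"]
ham_examples = ["lunch meeting tomorrow", "see you at the meeting"]

def word_counts(messages):
    """Tally how often each word appears across a list of messages."""
    counts = {}
    for msg in messages:
        for word in msg.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

spam_words = word_counts(spam_examples)
ham_words = word_counts(ham_examples)

def looks_like_spam(message):
    """Flag a message whose words match known spam more than legitimate mail."""
    words = message.lower().split()
    spam_score = sum(spam_words.get(w, 0) for w in words)
    ham_score = sum(ham_words.get(w, 0) for w in words)
    return spam_score > ham_score

print(looks_like_spam("claim your free money"))  # True
print(looks_like_spam("meeting at lunch"))       # False
```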
00:27:55
Speaker
Like, it's not futuristic. It's already deeply embedded in the infrastructure of our daily life. You touched on this a little bit already by saying that, you know, AI is learning basically from all of this massive amount of information.
00:28:16
Speaker
yeah When people hear terms like machine learning or data analytics, how do those actually connect? What role does data play in making AI smarter or smart?
00:28:30
Speaker
Right. So this is an analogy that I use. Data is the fuel. Machine learning is the engine. Data analytics is the dashboard.
00:28:42
Speaker
So if you put bad fuel into a car, it doesn't matter how powerful the engine is, you're not going go very far. In the same way, AI is only as good as the data it learns from.
00:28:58
Speaker
So the data is the foundation, right? Machine learning is what allows AI to detect those patterns in the data. It's the actual algorithm that it's using to detect those patterns in the data. So, like, for example, it might learn that people who...
00:29:21
Speaker
buy diapers, also often buy baby wipes. It doesn't understand why a baby needs wipes. It just sees that those two things keep showing up together in the data.
00:29:33
Speaker
And then data analytics is what we use as people to look at that same data and ask deeper questions. Like, why is this happening? What does this mean? What should we do about it? That's where the human judgment comes in, right? So, like, you only know your speed when you look at that dashboard
00:29:53
Speaker
and see the speedometer. You only know you need to check your engine when that check engine light comes on. It's the dashboard that gives us all the insights, right?
00:30:05
Speaker
So a really good example is in healthcare, right? So doctors collect data from patient records. Machine learning can help spot early warning signs for diseases.
00:30:18
Speaker
But the doctors use data analytics and experience to decide the treatment plan. So all three of these things are working together. yeah Yeah.
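The diapers-and-wipes example from the fuel/engine/dashboard analogy can be made concrete with plain co-occurrence counting. The shopping baskets below are made up, and real recommender systems use association-rule mining or learned models, but the sketch shows what "seeing that two things keep showing up together" means in code.

```python
from collections import Counter
from itertools import combinations

# Invented transaction data (the "fuel").
baskets = [
    {"diapers", "baby wipes", "milk"},
    {"diapers", "baby wipes"},
    {"bread", "milk"},
    {"diapers", "baby wipes", "bread"},
]

# The "engine": count how often each pair of items lands in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The "dashboard": a human reads off the strongest pattern and asks why.
top_pair, count = pair_counts.most_common(1)[0]
print(top_pair, count)  # ('baby wipes', 'diapers') 3
```

The counter never learns why a baby needs wipes; it only records that the two items co-occur, which is exactly the distinction drawn above between pattern detection and human judgment.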
00:30:29
Speaker
So... Going back to that original question, people shouldn't fear AI like the robots in Terminator. No. You should fear the people that are programming the AI, because if they're not programming the AI correctly, the AI can give out bad data.
00:30:53
Speaker
It can give bad information, right? Bad information, yeah. Yeah, so... data. That's what data is: information. I said it right. Yeah, so it's just like learning from a bad teacher leads to bad habits, right? If AI is trained on biased or incomplete data, it's going to make biased or inaccurate decisions. So yes, data quality matters just as much as the technology. And there are a lot of, you know, checks and balances in place in terms of, like, how do we use AI ethically? How do we use AI safely? There are people whose whole job is to make sure that they audit these systems to ensure that they're safe and they're being used ethically. Yeah.

AI Automation vs Human Skills

00:31:44
Speaker
All right. One thing that's happening out here in the world with the rise of AI. I've been on AI since ChatGPT first came out, like, three years ago.
00:31:58
Speaker
But what people are experiencing is fear. And one of the main fears is that a lot of people think AI is going to replace jobs.
00:32:09
Speaker
From what you've learned, what jobs is AI actually good at, and where do humans still do better? Yeah, so AI is really good at things that are repetitive, structured, and rule-based.
00:32:26
Speaker
So that includes things like scheduling appointments, answering common customer service or user questions, scanning documents, things like that. It's fast. It doesn't get tired. It can work nonstop.
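A minimal sketch of the repetitive, rule-based automation described here: a keyword-matched auto-responder for common customer questions. The keywords and canned replies are hypothetical, and a modern system would use an intent classifier or an LLM instead of substring matching, but it shows why structured, repetitive questions are the easiest thing to hand to a machine.

```python
# Hypothetical canned answers for frequently asked questions.
CANNED_REPLIES = {
    "hours": "We're open 9am-5pm, Monday through Friday.",
    "refund": "Refunds are processed within 5 business days.",
    "password": "Use the 'Forgot password' link on the sign-in page.",
}

def auto_reply(message):
    """Answer known, repetitive questions; escalate everything else."""
    text = message.lower()
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in text:
            return reply
    # Anything outside the rules needs human judgment.
    return "Forwarding you to a human agent."

print(auto_reply("What are your hours?"))
print(auto_reply("My order arrived broken and I'm upset!"))
```

The second message falls through to a human, which is the boundary the conversation draws next: the machine handles the structured cases, people handle the ones requiring judgment.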
00:32:43
Speaker
That's where you can put it to work automating some repetitive task, and it's perfect for that. But... Humans are still needed, okay? Because humans still dominate, right, in areas that require emotional intelligence, creativity, ethical reasoning, leadership, complex judgment and decision-making. Like, you wouldn't want an AI delivering difficult medical news to a family.
00:33:16
Speaker
You wouldn't want AI deciding prison sentences or handling crisis negotiation without human oversight. Well, I might want AI handling prison sentences, because some of these judges out here is a little crazy. And they're hammering. They're throwing the hammer down when the hammer don't need to be thrown down.
00:33:36
Speaker
But then you run into, okay, AI is being trained on massive amounts of data, right?

AI Automation vs Human Skills

00:33:43
Speaker
If the data is biased because our actual system is biased, right? Like certain people are criminalized more often than others, then AI starts to learn that these people are criminals.
00:34:00
Speaker
Okay. All right. So you don't want that. Yeah. You're right. You're right. I mean, so, yeah, people being worried about AI taking over jobs. You don't need to be. We've seen this before, like, historically, right? It's just the technology changing. Some of the things you're describing...
00:34:21
Speaker
There are people that do those jobs that are absolutely going to be replaced by AI. You don't need a scheduler anymore. Right. You may not need an administrative assistant anymore, because you can have AI answer phones, answer emails, do the scheduling for you. And if the files are digital and not physical, it can
00:34:46
Speaker
file and organize your files for you. Yeah. We also don't have video rental clerks anymore, right? Because everybody's streaming everything. But the only thing that technology is doing is reshaping work.
00:35:05
Speaker
It's not eliminating human relevance. As soon as one job is rendered obsolete by technology, new jobs that that technology forces into existence are being created.
00:35:19
Speaker
I like what you said there. It isn't eliminating human relevance. Yeah. It is eliminating jobs, but not human relevance. It's just moving the human from one spot.
00:35:31
Speaker
Yeah. We got something else for you over here. If you're willing to be trained and learn something new and come out of your old way of thinking, we got this new job over here for you, and you're probably going to end up making more money.
00:35:44
Speaker
Yeah, the challenge is just helping people adapt through education and upskilling. But just because this job can now be automated through AI, it ends up creating three more jobs. One of them is oversight over that AI. Another is, you know, like...
00:36:06
Speaker
It just ends up creating other jobs. So you're still very, very necessary, people. It's just reshaping the landscape.

Exploring AI Tools & Ethical Use

00:36:16
Speaker
That's all.
00:36:17
Speaker
Human beings are necessary, ladies and gentlemen. Besides, the robots get their fuel off of humans. We're needed to be in existence. No, I'm just kidding. All right.
00:36:27
Speaker
Exactly. Like the Matrix. Just like the Matrix. From a beginner standpoint, if somebody wanted to start exploring AI tools, you've already kind of explained that we are already using them, but intentionally exploring AI tools, what are some of the simplest and most practical ways for them to get started?
00:36:46
Speaker
Yeah, there are so many really, really easy ways to do it. I mean, the first is to literally just start. Just start using it in everyday life. Ask ChatGPT or Copilot or Gemini to help you write an email.
00:37:03
Speaker
or to brainstorm, I don't know, it's Christmas season, brainstorm gift ideas, or plan a trip, or help you study for a test. You don't need to understand how it's built in order to benefit from it, right? Just like most people don't know how the engine works in their car, but they still drive it.
00:37:24
Speaker
And I think it's something with pistons, I think. I don't know. I know they're from Detroit, because I know Isiah was a Piston, Joe Dumars, Dennis Rodman. They were definitely Detroit Pistons. I remember them in the jerseys. Yeah, I definitely remember that. I remember that. But if you do want to go a little deeper, YouTube has tons of beginner-friendly videos that explain AI
00:37:50
Speaker
in plain language. You can go on course sites like Coursera or DataCamp. They offer hands-on courses where you can learn in a really, really accessible way. You don't need to come from a technical background. A lot of it is about connecting people to AI in an accessible way, because it is going to continue to be integrated more, not just in our personal lives, but certainly our professional lives. And so you need to learn how to use it and how to make it work for you. But yeah, there are so many resources out there. But really, honestly, the first way is to just try it. Like, if you have an email and you are not 100 percent sure how to respond to it, drop that email into the chat and then just say, I'm not sure how to respond to this. I want to say something like this.
00:38:46
Speaker
Can you draft a response? And see what comes out. Just start. Yeah. And also, I'll give you two other practical examples that can work. I used Copilot earlier today on my phone and realized that there's a button where I can actually talk to it and have a conversation to help walk me through a situation that I had. And the situation was, I had a lineup of tasks that I needed to complete
00:39:14
Speaker
By next Tuesday. Right. And I was like, this is what my schedule looks like. And I need to level out these tasks. What are the best times that I could slot to take care of these tasks? And chat did that for me.
00:39:30
Speaker
Yeah. Another thing that you can use it for, for my parents out there: these kids bringing home that difficult homework. Go to that bad boy ChatGPT and say, explain this to me so that I can explain it to my child. It will give you a step-by-step process on how to do that. It will actually teach you, so that you have the information and can help explain it to your child. So those are ways AI is here to help us.
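For readers who want to see what that kind of prompt actually looks like, here is a tiny sketch that wraps an email and a rough intent into a single piece of text. The function name and wording are invented for illustration; the string it builds is simply what you would paste into any chat assistant (ChatGPT, Copilot, Gemini).

```python
# Toy sketch: the "drop the email into the chat" workflow as a reusable
# prompt builder. The model call itself is deliberately left out; the
# string this returns is what you would give to a chat assistant.

def build_reply_prompt(email_text: str, intent: str) -> str:
    """Wrap an incoming email and the user's rough intent into one prompt."""
    return (
        "I received this email and I'm not sure how to respond:\n\n"
        f"{email_text}\n\n"
        f"I want to say something like this: {intent}\n"
        "Can you draft a polite, clear response?"
    )

prompt = build_reply_prompt(
    email_text="Can you move our meeting to Friday?",
    intent="Friday works, but only after 2 pm.",
)
print(prompt)
```

The same shape works for the homework example: swap the email for the assignment and the intent for "explain this so I can explain it to my child."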
00:39:59
Speaker
Yes, absolutely. It's here to help us. And it's not like an alien that just crash-landed on the planet. It's here to help us. We created it. Yeah. To make life easier and more efficient.
00:40:14
Speaker
To help you with these routine tasks so you can focus on bigger things, so you can focus your creativity and your curiosity on bigger things. That's right.
00:40:25
Speaker
Well, looking ahead, this is the final question of this interview segment. Ladies and gentlemen, y'all see me interview people all the time. This is the first time my sister's been on the other side of it. But looking ahead, what excites you most about the future of AI?
00:40:37
Speaker
And what concerns should people realistically keep in mind? Yeah. So the thing that excites me the most is something called agentic AI. It's a really fancy way of saying AI systems that go beyond answering simple questions and actually plan and carry out multi-step tasks autonomously. See, now we get into the robots.
00:41:06
Speaker
That sounds like the robots. That sounds like Jarvis. It really is. Yeah, it's really, really cool. I mean, obviously, I feel like Jarvis had a little level of sentience. But again, we're not there, so you don't have to worry about that. Ladies and gentlemen, Jarvis is Iron Man's AI in his suits. Yes, yes. Because for some people, that's going to go over their head. But go ahead. Yes. So instead of asking the AI 10 separate questions, you may eventually be able to give it a goal and let it figure out the steps.
00:41:41
Speaker
So instead of saying, like, write me an email, or organize my calendar, or summarize my budget, you might be able to say, help me get my small business ready for tax season.
00:41:58
Speaker
And it can assist you across multiple tools. So not only is it able to plan and carry out tasks autonomously, it can use tools in order to do that. Like, for instance, if you're trying to get ready for tax season, it can pull down the new tax code information and use a calculator in order to get... I don't know how taxes work.
00:42:25
Speaker
I use TurboTax. But you know, it can do it for you, right? I'm sure TurboTax uses AI. It absolutely does. So, yeah. That is the coolest thing that is here. There are a lot of, you know, AI platforms that are introducing agents and a lot of people that are building agentic platforms. I had a chance to build a multi-agent chatbot, and it was really cool. So the biggest concern, though, is overtrust.
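To make "plan and carry out multi-step tasks" concrete, here is a toy plan-and-execute loop in Python. Everything in it, the step names, the tools, and the hard-coded plan, is invented for illustration; a real agentic system would have a language model generate the plan and would call real tools rather than stubs.

```python
# Toy plan-and-execute loop in the spirit of "agentic AI": given one goal,
# the agent breaks it into ordered steps and runs a tool for each step.
# The tools here are stubs that just report what they (would) do.

TOOLS = {
    "gather_receipts": lambda: "receipts collected",
    "total_expenses": lambda: "expenses totaled",
    "draft_summary": lambda: "tax summary drafted",
}

def plan(goal: str) -> list[str]:
    """Stand-in for the model's planner: map a goal to ordered steps."""
    if "tax season" in goal:
        return ["gather_receipts", "total_expenses", "draft_summary"]
    return []

def run_agent(goal: str) -> list[str]:
    """Execute each planned step with its tool and collect the results."""
    return [TOOLS[step]() for step in plan(goal)]

results = run_agent("help me get my small business ready for tax season")
print(results)
```

The point is the shape: one goal in, an ordered plan, and a tool call per step, rather than ten separate questions.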
00:42:59
Speaker
Okay? You've got to use some discernment. Literally, if you go on any AI site, down at the bottom it'll say, like on ChatGPT, it says, ChatGPT can make mistakes.
00:43:17
Speaker
Check important info. AI can sound incredibly confident, and the answers can be all the way wrong. So, like I said, it can reflect social bias.
00:43:33
Speaker
You know, if it's looking at incarceration rates and it's seeing the high number of black males being incarcerated for certain things, then it's going to start developing a bias there, right?
00:43:48
Speaker
It can hallucinate details that sound very real. And by hallucinate, that means it just makes stuff up. I have used it, and especially if you're using it on the free tier, it will make stuff up, okay? It'll make up sources. There were those attorneys that got in trouble because it made up legal cases that didn't exist, okay? Like, it can make stuff up.
00:44:12
Speaker
So you have to have some discernment about what's being generated. When I interact with AI, it is an interaction.
00:44:23
Speaker
I will give it a prompt. It'll give me an output. And I'll say, I'm not so sure about this. I don't think that's right. Isn't this really the issue? And it'll come back: you're absolutely right, this is not going to work because of this, this, and this. Yeah. So you have to interact with it. Don't just take everything that you get from it at face value.
00:44:51
Speaker
The future isn't, you know, humans versus AI. It's about humans learning to work alongside AI responsibly, with judgment and ethics and critical thinking, right?
00:45:10
Speaker
Right. Humans are still firmly in the driver's seat. We just have to recognize that we are in the driver's seat.
00:45:17
Speaker
Yeah. All right. Well, ladies and gentlemen, that is my sister, a recent graduate with a master's degree in data analytics and... what was it again?
00:45:29
Speaker
Yes. It was a lot of stuff. The official title is a Master of Science in Analytics. She's a Master of Science in Analytics person. Sure.
00:45:43
Speaker
So thank her for coming on the show and explaining AI and helping us learn, because so many people, so many smart people out there, are like, here it come. And I get it.
00:45:56
Speaker
Conspiracy theorists, always. T2, we're about to get taken over. No. I mean, there's a whole film, television, and book genre, the sci-fi genre, about AI taking over and destroying humanity. So, like, I get it. When you've been force-fed that and then you start seeing it become so integrated in your life, everything down to even a smart refrigerator that will look in your fridge and tell you what you need and create a shopping list for you. I'm waiting for that.
00:46:30
Speaker
It's already here. Goddamn. They got everything. But you know what, though? AI is not going to be the downfall of humanity. I got to end this on a dark note.
00:46:41
Speaker
Okay. The downfall of humanity will always be humanity.

Reddit Post Advice: Controlling Parents

00:46:52
Speaker
Jay.
00:46:58
Speaker
I went on my favorite website again. Why did I go on my favorite website? Because I wanted something that could get you riled up. Yeah. That might get you a little agitated. That also would be kind of funny. And it was easy. And these Reddit posts are absolutely easy. Ladies and gentlemen, I'm going to tell you all right now: the next show, the show before we go on our break, is all Reddit posts. That's all it is. So that's what it's going to be. But this one is...
00:47:27
Speaker
Am I the asshole for not wanting my parents name on my dog's identification chip? And just when I read that title, what is your initial response?
00:47:39
Speaker
Huh? Why is this an issue? It's your dog. Well, I'm going to get into it. And it starts with: I'm a 17-year-old female. Okay.
00:47:51
Speaker
Okay. I'm a 17-year-old female. I begged for a dog for eight years. My parents, a 51-year-old male and a 49-year-old female, always said no. This year, my grandmother surprised me and got me a dog, specifically for me, not for my parents. Since I got her, I've paid for everything: vet visits, vaccines, food, toys, grooming, collar, leash, training, and all supplies. My parents do not pay for her care at all. They sometimes pet her, but that's it.
00:48:19
Speaker
About half the time I'm around my parents, they are emotionally abusive, and control is a big issue in our house. Now they're demanding that their name be put on the dog's microchip instead of my grandmother's.
00:48:33
Speaker
I want my grandmother's name on it because she bought the dog for me. I pay for everything. My parents didn't want a dog in the first place, and they aren't financially responsible for her. They told me that if I don't obey everything that they say without question, they will rehome my dog.
00:48:49
Speaker
I told them I will follow reasonable rules, but I won't agree to unreasonable control over a dog that they didn't pay for. We argued about legal ownership, and I said, from what I was told, the person who pays for the dog and is on the records is usually the legal owner, which would be me and my grandma.
00:49:13
Speaker
I also said they legally can't just take my dog and give it away, and that if they did, the law would allow me to take them to small claims court. I wasn't threatening them, just explaining the law.
00:49:25
Speaker
My dad said he don't give a damn about the law and that he could do whatever he wants because he's the head of the household. During the argument, he stormed towards me and I put my hands up.
00:49:38
Speaker
I backed up against the wall because I was scared. He screamed at me for being defensive. I said I put my hands up because I don't trust him, and I thought he was going to hit me. That made him even angrier. And then he said, I swear to God, I will make your life a living hell if you don't listen to and obey every word I say.
00:49:58
Speaker
After that, he said he would rehome my dog if I didn't obey everything. Now he's also trying to force me to put their name on the microchip. I don't want to, because if their name is on it, they could legally give my dog away and I'd have no rights, even though my grandma bought the dog for me and I pay for everything.
00:50:17
Speaker
For context, I have anxiety, panic attacks, and depression symptoms, but my parents won't take me to the doctor. My dog helps me feel safe. They have isolated me before by taking away my devices and not letting me leave or talk to anyone.
00:50:32
Speaker
They have threatened to do it again, including cutting off my internet, which I use for school. I feel like they're trying to take legal control of my dog to use her to control me.
00:50:43
Speaker
So am I the asshole for refusing to put my parents' name on my dog's identification chip and wanting my grandma's name on it instead?
00:50:53
Speaker
So this is a situation where you have parents that say things like, my kid went no contact and I have no idea why.
00:51:07
Speaker
You know exactly why. You know exactly why. I feel so much for this young girl. And I don't want to... I really don't know the answer for this, because the fact of the matter is she's underage. She lives in their household. There's not a whole lot that she can do.
00:51:34
Speaker
I honestly...
00:51:38
Speaker
Grandma put her in a... I mean, I get the sentiment. You wanted to give her something that she had been begging for, but you also got to understand the context in which she's living, and how much of a cluster F it would cause for her to have this dog.
00:52:11
Speaker
And it may be a situation where, instead of giving her parents the satisfaction of having that power over that dog, she rehomes the dog temporarily until she can
00:52:31
Speaker
get out, find her own place, or find another living situation when she becomes an adult, and can take her dog back. It might be a situation like that. But there's really no good
00:52:50
Speaker
answer, because she is a child, and they're legally responsible for her. They have the rights in this situation, and they will absolutely use that dog against her.
00:53:06
Speaker
It looks like it from what she's describing. Yeah. Yeah. They're absolutely going to. And I think it might just be a situation where you cannot allow them to have that leverage. And so if that means grandma has to hold that dog until you're able to get that dog back, you know, then that might be what it has to be.
00:53:28
Speaker
So you stole my idea. Grandma has to take the dog until I'm legally 18. Yeah. And then I got to find some way to get the hell out of the house. And I feel for her, because, man, we were fortunate.
00:53:44
Speaker
This is what I mean by that. Our parents always said you can always come home. Yeah. Yeah. You can always go home. Now, nobody wants to live with their parents. And my parents were always very hip and young and gave us freedom.
00:54:02
Speaker
Yeah. I am the one of the three of us that's really just like, I don't care what freedoms you give me, I can't be in this house. I don't know what's wrong with me, but I was just like, I need to be out there on my own. But our parents were always like,
00:54:19
Speaker
you have a home here. Yeah. And if we had a dog and we were taking care of it, and it wasn't costing them anything, and they didn't have to take care of the dog, and the dog wasn't messing up the house or anything, and the dog was out of their way, that made it even easier. Yeah. It was like, oh yeah, that's your dog.
00:54:45
Speaker
Congratulations. This is a situation where this young lady has this dog and is enjoying the dog, and the dog is not costing the parents money, right? She said, I'm taking care of everything. And when she said everything, she meant everything, because let me tell you something, ladies and gentlemen: a dog is not cheap. No. The training, the food. The food, good God, the food. Dogs constantly have to eat. The grooming, the vet visits. Look, let me tell you something. I was just fighting for my life yesterday, bathing Roni.
00:55:21
Speaker
Look, let me tell you something. These vet visits, always three figures. Always three figures. Sometimes it can go four. Standard. Sometimes it can go five. Standard. All the time. Dogs are not cheap. And they're always doing something to hurt themselves to make you have to take them to the vet, and you're like, what are you going to cost me now? I had a friend that was just like,
00:55:47
Speaker
I was good. I was chilling. Finances was a breeze. Then my dog tore an ACL. I was like, well, you got to get that fixed. Got to get it fixed. Yeah. How much is that going to run you? Almost 10,000.
00:55:58
Speaker
Good God. Yeah. Good God. Every time I look, I was like, that's a bill. That's a bill. That's a bill. Oh, you got bit by a snake? That's a bill.
00:56:14
Speaker
And so I feel bad for this young lady that she doesn't have the safety and comfort. Yeah. That parents are supposed to provide for their children. The only good thing that I could say to this young lady is you got less than a year.
00:56:28
Speaker
Yeah. Life ain't going to get no easier. But you'll be better for it, even though you're going to have to start behind the eight ball. It seems like you're a go-getter.
00:56:39
Speaker
Yeah. Because just to be able to afford to take care of that dog, odds are you're working right now. So that means you're a grinder. That means you take care of your business. So, yeah, no, you're not going to get the benefits that most people have where they get this catapult to start them into adult life. No.
00:56:56
Speaker
You're behind the eight ball, but in 10, 15, 20 years, you'll be better for it, because you will have gone through all the trials and tribulations. That means whatever hits you in your middle age, you're going to be like, been there, done that, I can get through it. But yeah, grandma did put you in a bad situation. Grandma was just giving you some love, but grandma was also trying to piss off one of her kids. Whether it was her son or her daughter, she was trying to piss off one of them kids and put you in the middle of it. So grandma can hold on to the dog until you turn 18, and then you get the hell up out that house and, you know, do what

Conclusion: Embracing AI

00:57:36
Speaker
you got to do. And then if you really want to go no contact, you just, hey, just because it's your blood don't mean that you got to have a relationship with them.
00:57:45
Speaker
That's facts. Anyway, Jay, what do you want to tell these people out here? Man, AI is here for you. It was created
00:57:57
Speaker
to help you. You shouldn't fear things that are here to help you. The same way we don't drive a horse and carriage anymore, unless you're Amish.
00:58:08
Speaker
And you can have a V8, a V12, or an electric car; you don't even need gas now. It's just technology moving forward. And the people who are afraid of it are the people who don't want to upskill.
00:58:25
Speaker
And you will be left behind. Like, that's just the reality. If you don't change the way technology, society, and culture are changing, and this is not just technology, it's society, it's politics, it's culture, if you're not forward-thinking, you will be left behind. And that's just the reality.
00:58:46
Speaker
Left behind with no behind. I know. Sometimes. Sometimes. But on that note, ladies and gentlemen, I want to thank you for listening.
00:58:59
Speaker
I want to thank you for watching. And until next time, as always, I'll holla.
00:59:08
Speaker
That was a hell of a show. Thank you for rocking with us here on Unsolicited Perspectives with Bruce Anthony. Now, before you go, don't forget to follow, subscribe, like, comment, and share our podcast wherever you're listening to or watching it. Pass it along to your friends. If you enjoy it, that means the people that you rock with will enjoy it also. So share the wealth, share the knowledge, share the noise.
00:59:31
Speaker
And for all those people that say, well, I don't have a YouTube. If you have a Gmail account, you have a YouTube. Subscribe to our YouTube channel where you can actually watch our video podcast and YouTube exclusive content.
00:59:43
Speaker
But the real party is on our Patreon page: After Hours Uncensored and Talking Straight-ish. After Hours Uncensored is another show with my sister, and once again, the key word there is uncensored. Those are exclusively on our Patreon page. Jump on our website at unsolicitedperspective.com for all things us. That's where you can get all of our audio, video, our blogs, and even buy our merch. And if you really feel generous and want to help us out, you can donate on our donations page.
01:00:11
Speaker
Donations go strictly to improving our software and hardware so we can keep giving you guys good content that you can clearly listen to and clearly see. So any donation would be appreciated. Most importantly, I want to say thank you.
01:00:25
Speaker
Thank you. Thank you for listening and watching and supporting us. And I'll catch you next time. Audi 5000. Peace.