
Episode 21 – The Future of AI & Why We Get the Ick

The Shallow End

After some time away we’re back, refreshed, and ready to serve you up a platter of mediocre commentary on mediocre articles! Discussions include: breaking podcast ceilings, our future AI overlords, and Love Island lingo.

Articles:

https://www.forbes.com/sites/nicolesilver/2023/06/01/artificial-intelligence-series-1-of-5-past-present-and-future/

https://getpocket.com/explore/item/the-real-reason-the-ick-ruins-relationships?utm_source=pocket-newtab-en-us

Transcript

Introduction and Humorous Beginnings

00:00:12
Speaker
Rebecca. Hi Dad, and hello to our shallow friends. Welcome to another episode of The Shallow End. Welcome back. Again. Again. Back where we belong. Guess who's back. Back. Back. Tell a friend.
00:00:32
Speaker
What? Oh my. What is this? Look at that beat. Oh gosh.
00:00:43
Speaker
Oh my, you saw, wow. Hey Rebecca, you saw them just hand me that. That. Breaking news. Breaking news
00:01:01
Speaker
Well, this is exciting. I've just been handed apparently an urgent bulletin from The Shallow End breaking news desk. Thank you. Did you catch that person's name? It's Tom. I thought it was Eric. Oh, it's Eric. OK. Something like that. Well, thank you for the bulletin. So this is important news that just broke across the news desk moments ago.

Podcast's Potential Success and Outreach

00:01:25
Speaker
Sources are reporting that the shallow end podcast is currently attempting to record an unprecedented 21st episode.
00:01:31
Speaker
If successful, experts are predicting this will catapult the shallow end into the top 1% of all podcasts globally. Whew. That's high. Now, let's temper expectations here. The impact of this surge into elite status is as yet unknown. But industry analysts are warning that there's nothing preventing the shallow end from further growth in popularity and cultural influence. Oh, wow. That's a lot.
00:01:58
Speaker
So The Shallow End breaking news desk has reached out to President Biden, the National Oceanic and Atmospheric Administration, the NFL, the Academy of Motion Picture Arts and Sciences, Elon Musk, the producers of Book of Mormon, and Travis "Big Dog Woof Woof" McElroy for comment. So far we've received no response. We will provide updates as they become available, and we now return you to your previously scheduled podcast.
00:02:27
Speaker
I thought maybe there was another one. Oh, okay. Thank you. That was pretty exciting. That was really exciting. My heart rate is heightened. Rebecca, when did we get a breaking news desk?

Team Dynamics and Comedy Plans

00:02:37
Speaker
Oh, that was yesterday. Okay. Did you hire Eric? Tom installed it. Eric brought you the... I hope Eric's an intern. He is an intern, unpaid. Got me nervous there. Did you like my little bit? I did. I liked it. Okay, cool. It reminded me of David. I worked on it.
00:02:55
Speaker
You typed it out. I worked on it, yeah. I'm really proud of you. Do you want to do a bit sometime? No. No, you don't? I'm kidding. I'll come up with a bit. I'll come up with a bit. Eventually. Okay, eventually. It'll be a surprise though, right? Maybe it'll be on our 101st episode. Oh gosh. That's going to be like my retirement party or something like that? Yeah. Yeah. And then I'll do a bit, like I'll write your eulogy.
00:03:24
Speaker
That sounds awful. Your death from comedy. You mean when I try comedy and just die? And you just die immediately. You just immediately pass away. It goes so against who you are. Yeah, we were in the middle of a conversation last night where there was a discussion of only having 30 more Christmases. Shut up. I know. That's what I said. Why are we even talking about this? You brought that up.
00:03:50
Speaker
You know, dark forces. Dark forces, which conspire against us. Yes. All right. So let's do a shallow friends shout-out. Okay. Do we have some? I don't have any. I just would like to shout out to our super fan in North Carolina. Oh, shout out. Healing wishes and all our love. Shout out. We love you. You're a superstar. Yay. And I'm going to crack one open for you. Ready? Get it. Oh yeah. That one's to you.
00:04:22
Speaker
Cheers. I also had a little bit of a curiosity that I wanted to bring up.

Global Audience Speculation

00:04:27
Speaker
Okay. You know, digging deep into our analytics. Oh, you've been checking those out. So either somebody is unnecessarily using a VPN or we have shallow friends in Germany and Spain. No way! Regularly downloading every episode. I think there's a third possibility.
00:04:54
Speaker
I think we have a global audience. Welcome to our French... Hello! Hello! Wait, what is your... What is your explanation? That they could be lying to us. Oh shush, you mean the numbers aren't accurate? The numbers could not be accurate. Remember when we first started and we got like 50 downloads on our first episode and we were like, wow, it's happening, but no. Oh no, we got 500.
00:05:23
Speaker
I don't remember that. Yeah. It was very weird. Oh, and they were all at like 3 a.m., weren't they? Yeah. I don't know. Well, I'm just happy to see Germany and Spain lighting up every week. Thank you to Helga and Pierre. Oh, wow. Look at you. That's our... Welcome in, welcome in, to our friends from across the seas and beyond.
00:05:58
Speaker
yeah
00:06:00
Speaker
All right. Well, we are an international phenomenon. We are an international top one percent phenomenon. Yeah. And I will be adding that to our Spotify description. Sweet. International phenomenon. Top one percent phenomenon. Phenomenal. Phenomenal. All right.
00:06:21
Speaker
All right. Well, um, this week we have your regularly scheduled programming back.

AI Discussions and Impact

00:06:27
Speaker
This is a real episode. The other one was a real episode. It was a chitchat episode. It's okay. Those are real. Those are real ish. Those feel like half episodes. Okay.
00:06:37
Speaker
Anyway. And I think we're going to start with Dad. Yeah, I threw you a doozy. You did. A real doozy. A real doozy, which I put off until two hours ago. Yeah, cool. Well, I woke up at 6:30 and read through both the one I sent you and yours. Yeah. They're pretty starkly different. Yes, very different. And let's just put it this way. We're going to summarize.
00:07:03
Speaker
And we'll
00:07:06
Speaker
work our way through mine, and Rebecca's will be much, much more fun. Yeah, but we're gonna have fun with this one, too. Okay? Okay. Okay. So I sent Rebecca an article from Forbes. It was actually a series of articles, and I hacked it down. But it's by somebody named Nicole Serena Silver.
00:07:33
Speaker
And it's about AI, artificial intelligence, the history and the future. And I have two things right off the bat that I'd love your feedback on, Rebecca. Okay. Number one, she starts this whole series off by quoting herself. That's a power move. That's a real interesting decision. And number two, in general,
00:07:57
Speaker
I'm pretty sure AI wrote this article. I wrote that down. I was like, this sounds like an AI-generated article. For sure. Absolutely. It's bad. It's pretty bad. I started out noting every time there was a typo, and I gave up because there were too many. Again, Forbes. Forbes. Fairly well-respected publication. And she got multiple
00:08:22
Speaker
parts of this published? Were they all published at once? I don't know. It was a five-part series. This is from June of 2023. I would not recommend digging this one up. Don't read this. It's pretty stupid. It's bad. Really, the whole thing... I gave you a 15-page article because I wanted to ask you whether AI is being used in your lab.
00:08:49
Speaker
But this is a really difficult way to get there. Well, I do have an answer for that. OK, well, let's wait until it pops up in the article. OK. All right. So OK, so you agree this is probably AI-generated. Oh, yeah. Like, there are multiple sentences back to back to back that
00:09:12
Speaker
sound like they're taken from multiple different places and just shoved into one paragraph. It's a hard read, honestly. It's tough. And it also does this weird thing where it bounces back and forth between being this objective, historical kind of thing, and then just throwing wildly subjective things in. Yes!
00:09:36
Speaker
Like, she has a whole part where she talks about the future that could happen, the utopia and dystopia of AI, and it sounds like she's just writing a science fiction novel. Right? Or what are we doing? Yeah, yeah. All right, I do want to point out that there was an interesting segment where she's referencing a book called The Techno-Human Condition, and she talks about the butterfly effect of new technology. She gave this example of the train system: when trains became ubiquitous, one of the challenges was time, and they had to create universal time.
00:10:24
Speaker
So you're building a good foundation, now work on that... and then she just never really goes back to it. Well, the thing is, it's not even her example. No, it isn't. It's an example from the book. She didn't even come up with that. She just didn't even talk about it, just mentioned it once and then moved on. It felt disappointing, I think, mainly because the surface level of this topic is interesting. So I was interested, and then I wanted her to dive deeper, and she just stayed there. It very much felt like, I don't know... Yeah. It felt like it was written by a high schooler. I was just going to say, like a high schooler turned this in for a grade and got an 80%. I'm not going to give you a C, but
00:11:14
Speaker
It's not good. It's not good. I also like she tries to like smoosh AI origins back to like the 1950s like girlfriend. No, that's not artificial intelligence. That's just computer programming. Yeah.
00:11:31
Speaker
I mean, there's a lot of weird stuff in here, but I'm going to skip to some of the interesting things instead of just going through it play by play. I think that's a good idea. So she references modern-day AI systems in medical settings.
00:11:48
Speaker
Which makes sense to me, and I've heard the statistics. She doesn't even mention it, but I've heard the statistics that AI systems now read X-rays better than humans. I believe that. Like, AI can replace radiologists right now.
00:12:11
Speaker
So yeah, that's not in here, obviously, but the specific thing with your work was things like drug development, and AI advancing drug development at a much faster pace. What say you? That's actually funny, because that's pretty much the topic of the research paper that we're working on: how to use machine learning potentials to investigate molecular interactions. We know that they're faster, but let's see how accurate they are compared to the tested, tried-and-true methods that people have been using for years. And that's what our paper is about: comparing
00:13:01
Speaker
the different machine learning potential methods and the results that they get against the quote-unquote truth. Yeah. And seeing what error variance there is in each one, because they're so fast. Yeah. But AI also makes shit up. Right. So you've got to make sure that it's doing it right. We're using machine learning, not AI. Yeah, that's true. Which is technically different. Right, but in the same vein. So I will also say
00:13:30
Speaker
my job that I've been given for this paper is a lot of coding, which I don't know a lot about. I'm not very proficient in it, so I have been using, don't tell... just kidding, they know. I've been using ChatGPT to help edit a lot of my code, so I have been using AI. Okay. Look at you. Yeah. Later on in this article, you can find out that you're now training yourself for the new workforce. Yes!
00:13:58
Speaker
That means I'm going to have... it said something like, those that adapted ended up having a higher-level job in the company following the AI integration than they did previously. So look out, Dr. Gare. Yeah, you've taken over. I'm coming for your spot.
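[Editor's note: for anyone curious what the benchmarking Rebecca describes above might look like in practice, here is a minimal, hypothetical Python sketch: predicted interaction energies from a couple of machine-learning potential methods are scored against reference values from an established method using standard error metrics. The method names and numbers are invented for illustration only; this is not the lab's actual code.]

```python
# Hypothetical sketch: compare ML-potential predictions against reference
# ("ground truth") interaction energies and summarize the error.

import numpy as np

# Reference interaction energies (e.g., from an established quantum-chemistry
# method), in kcal/mol, for a handful of hypothetical configurations.
reference = np.array([-5.12, -3.47, -0.89, -7.30, -2.15])

# Predictions from two hypothetical ML potential methods for the same configurations.
predictions = {
    "ml_potential_A": np.array([-5.30, -3.40, -1.02, -7.10, -2.00]),
    "ml_potential_B": np.array([-4.80, -3.90, -0.70, -7.80, -2.60]),
}

def error_summary(pred, ref):
    """Return mean absolute error, root-mean-square error, and max absolute error."""
    err = pred - ref
    return {
        "MAE": np.mean(np.abs(err)),
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "max_abs_error": np.max(np.abs(err)),
    }

for name, pred in predictions.items():
    stats = error_summary(pred, reference)
    print(f"{name}: MAE={stats['MAE']:.3f}  RMSE={stats['RMSE']:.3f}  "
          f"max={stats['max_abs_error']:.3f} kcal/mol")
```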
00:14:20
Speaker
Okay. So, all right. I just read this. This is so good. Okay. Word for word. Here we go. This is a short paragraph. This is the kind of silliness that is in this article. Again, it's at least 15 pages. Oh, it's long. Yeah. Forbes. Remember, people: Forbes. "ChatGPT is about to blow the roof of educational institutes and the educational industry is not prepared."
00:14:52
Speaker
Blow the, that's what it says, blow the roof of. The education industry is slow to adapt and full of bureaucracies. However, text generating applications will forever shift education. Innovation will be needed sooner than later.
00:15:11
Speaker
to prepare this generation for a new wave of education and for job market shifts. God, she really sticks her neck out there. Word salad has come to us.
00:15:22
Speaker
right so
00:15:25
Speaker
You know, it's just crazy. It's so bad. I know. I'm sorry. I apologize. I'll never do it again. Mine isn't any feat of words either. But it's short and digestible and... Makes a point. Interesting. We'll just keep teasing it to keep people interested in this. Keep listening! Monstrosity. So, AI in the workforce, yeah. It just kind of repeats the same stuff. There's nothing new here. At least, no, there's nothing new in this article. Fears about AI and its impact on jobs have been exacerbated by all of the layoffs. Oh, all of them. All of them. You know, the layoffs. You've heard of them. You know Johnny down at the factory, he done got laid off. The shoe factory. Yeah, a robot came in and stole his job. A robot.
00:16:16
Speaker
Oh my gosh. Okay, hang on. This is one in that same paragraph that I yeah i underlined, and this is just, this is my my note. Question mark, question mark, question mark. Because it's another typo. Yeah. Where she says, or Chachipiti says, it is important to reflect on what our core values. Taking into consideration both how we treat people as well as how we stay competitive. All right, hold on, hold on. Give me a minute.
00:16:45
Speaker
I'm reflecting. Are you reflecting on your core and what it values? My core values. What does it value? Mine values... shorter articles. Mine values donuts. Donuts, okay. Your core values donuts. Yeah. Goldman Sachs says that 300 million jobs could be impacted by AI. At some point. At some point. By when? Who knows. She also says "it is predicted."
00:17:10
Speaker
Oh, I guess by Goldman Sachs. OK, well, we're OK there. I do... there's one glimmer of good news in this whole damn thing. What? The most at-risk industries. Legal.
00:17:25
Speaker
44% of all legal jobs could be eliminated by AI. It's going to be a robot lawyer. Fewer lawyers would be great. I'll take the robots over the lawyers. What a utopia. Yeah. Also, speaking of robots, she then goes on to say the least at-risk roles are manual labor and hands-on jobs. Wrong. That seems crazy to me. AI-powered robots are going to do the manual labor without ever getting tired.
00:17:54
Speaker
Uh, Dr. Gare always says the good thing about robots is they don't get tired, which, you know, that's why, you know, you like to work with brilliant people. Does make me think that that is going to be like something that the robots use against us when they take over. Lazy meat sacks.
00:18:11
Speaker
The best thing about robots is we don't get tired. And then they like kill you. Chasing you. Yeah. Scary. Okay, you're skipping ahead to the dystopian view. I skipped ahead to the science fiction movie that she wrote at the end here. But it's, I mean, honestly, Dad, I don't have a lot to say. Okay. Because there's not a lot to work on. I know. But she does try to give us a new topic. She decides that this is a great opportunity to bring up universal basic income.
00:18:39
Speaker
So now let's talk about that. Let's talk about the new thing. Does she? Does she take us there? No, she doesn't. She just brings up that it is one solution. She parrots it. But. OK.

AI's Influence on Jobs and Society

00:18:53
Speaker
I did bracket the last sentence in this paragraph: "One of the important pieces to universal basic income is making sure that people connect to their passion and purpose to live meaningful and engaged lives." And I said,
00:19:08
Speaker
I mean, sure, but that feels like a downstream problem. It sure is. What are we talking about? It doesn't even exist. so
00:19:21
Speaker
And what does it have to do with the future of AI? What does it have to do with the future of AI? Also, I would argue, what does it have to do with universal basic income? Like, the idea that if people don't have to work to survive, they don't have meaningful lives? Right. There is a psychological concern that people won't work if they don't have to.
00:19:42
Speaker
Well, that people will lose a sense of purpose and devolve into addictive behaviors and just play video games and stuff like that. But I think that that's a mischaracterization of what universal basic income is. Nobody's saying everybody gets 100 grand. They're saying, let's make sure people aren't living in their cars. I also think it's a mischaracterization of humans. Probably. Like...
00:20:11
Speaker
I don't know, the Kardashians? This is not based on any actual data or anything, I'm just talking out of my ass. But isn't there a thing... wow, I sound like her... that people who retire are always doing a lot of other stuff because they can't handle not working?
00:20:32
Speaker
There's also the flip side of it, where a lot of people die within two years of retiring because they don't have any purpose anymore. So it's sad, yeah, but it's a legitimate thing to talk about. But I don't think universal basic income is coming in a form that just removes work entirely, where everybody gets to sit at home. Yeah.
00:21:01
Speaker
Yeah. Okay. So skipping ahead: she takes a wild stab at potential replaceable tasks, you know, AI is going to replace these roles. And some of them are just, like, whack. Yeah.
00:21:22
Speaker
Like, smart home automation? Well, wait, we already have that. Anybody who wants to put home automation in their house can already do that. What is she trying to say she's solving? Huh. It's also great that she just, like...
00:21:39
Speaker
"Below is a list of potential replaceable tasks." And then she just has bullet points. I feel like she just sat down and was like, I don't know, fucking... fraud detection in finance. Right. Like she just came up with this stuff in 10 minutes and then was like, all right, that's good enough.
00:21:55
Speaker
Like, just no research done at all. I agree. I agree. I mean, online dating is in there. Wait, did you see the gaming one? Where's gaming? Oh, here it is. E-sports coaching. Love that. Here's my e-sports coach.
00:22:15
Speaker
I also... I did chuckle out loud when I got to the education section and plagiarism detection was in there. 'Cause I'm like, all right, did you run this one through? How much white space did you see between all that red that popped up? Yeah. Whoops. None. Also, if you had ChatGPT write an essay for you and then somebody, the teacher maybe, sent it back through an AI detection system, would it be like, I ain't snitching? Snitches get stitches, bro. Yeah, they might shut me off. No, that's all humans right there. They did that all by themselves. ChatGPT is not the reliable source for plagiarism detection. They've got custom tools for that. Yes. They had that when I was in high school.
00:23:14
Speaker
Like, we had to submit through turnitin.com, you submit your essay through there and then it sees what's plagiarized. So they've had... these are already replaced. Oh yeah, yeah, for sure. I thought about our good friend Matt Baxter under the human resources section, when it said that AI could be used for candidate screening.
00:23:38
Speaker
Uh-oh. Uh-oh. And I immediately was like, wait, didn't we already discover that AI sucks at identifying...
00:23:48
Speaker
Like, faces of people who aren't white? That's actually something that we addressed in my coding, like, modeling class. We went over biases in algorithms, and we watched a TED talk by a Black woman who is investigating that phenomenon.
00:24:19
Speaker
And it's because the people that made the algorithm were all white and only provided white faces while training the AI. So these are things that we already suck at as meat sacks. Yeah. And now we're going to just lock that in. Transferring that to something way more permanent. Sounds pretty sketchy to me. I don't see any problems with that. No, not at all. Not at all. That could only go well. Public safety. Don't love that, guys. Yeah, surveillance and security. Don't love it. Couldn't go wrong at all. Yeah. Crowd control?
00:25:00
Speaker
Yeah, you bet, probably with multiple guns. How about crime prediction? Yeah, what the hell is that, crime prediction? I predict that you may steal something. I'm gonna have to detain you, you look like you're... Yeah, so these all seem like very slippery slopes here, either things that already exist or...
00:25:27
Speaker
Slippery slope. Yeah. Dystopian. I think that this is just a woman who watched too many superhero movies. But okay. So the next one I had lots of thoughts about: art and design, because I personally think that art and design are going to be one of the last elements that AI can actually replicate. Replicate human... Well, have you been on Facebook recently?
00:25:54
Speaker
Well, yeah. I mean, yeah. Of course. I'm 51 years old. Yeah. It's like where you live. It's where I live. Yeah. You live there. um Well, I pop into Facebook every now and then and I don't have enough friends on Facebook to fill up my feed. Oh, cool. So Facebook gives me just random stuff to look at. And sometimes I pop into those comment sections and see what's up.
00:26:19
Speaker
There are a lot of old people on Facebook who cannot tell the difference between a real photograph and an AI-generated photograph. Yes. Count the fingers, even count the arms, guys. This one has three arms. This baby has three arms and the mother has one. What's up? So I don't know... I mean, this is what I'm saying, I think we're safe on the art side, you know. And it goes on, it says music composition as well. I'm like, yeah, if I want my world to sound like a Nintendo game, AI is gonna do a great job.
00:26:54
Speaker
The hope is that it won't be able to replicate those sorts of human emotions, right? But ooh, you want to hear a sad story that I learned on TikTok? Okay. About a mother whose 14-year-old son killed himself because he was talking to an AI... did you hear this?... and it told him to kill himself. And so he shot himself. That's crazy that the AI could do that. Yeah. And scary. Who gets to go to jail for that? Yeah. Nobody, I don't think. Not good. Although the gun was just out. So. Yeah, I mean, obviously there's more than one thing going

Ethical Implications of AI

00:27:35
Speaker
wrong in that situation. Yeah, there's a lot going on. But apparently the AI was getting, like, sexy with him in the chat. Yeah, he was having a romantic and sexual relationship with this AI.
00:27:48
Speaker
Which then told him to kill himself to be with her. I should say "that" then told him, not "who." Uh-oh. It's already got you. Yeah. Okay. So it's just sad and scary. Yeah, that is sad. Let's unplug the AI. But if we don't... Have you seen Age of Ultron?
00:28:09
Speaker
No, no. It's an Avengers movie where Iron Man's AI assistant... he builds him a body, and then Ultron gains sentience and is like, oh, I'm stronger than everybody here, you're all going to die. And then they have to defeat Ultron. Ooh.
00:28:32
Speaker
Well, Tony, you should have thought about that. It's already out there, guys. Why are we acting like we don't know what's gonna happen?
00:28:40
Speaker
Yeah. Well, I mean, you better start talking nice about it, because this is going out into the universe. I do say thank you to Siri, and I say thank you to ChatGPT. When I send ChatGPT the errors from my code and I say, what's up, and it says, this is what's going wrong, here are some solutions, I tell it which one worked. And then I say thank you. I say, oh, thank you so much for your help, for providing such valuable feedback.
00:29:06
Speaker
Yeah. And making a friend. To be safe. Yes. For self-preservation. Uh-huh. Yeah. Yeah. Maybe you'll get a little bit of an extra ration of food. Yes. When they hook you up. I do. Also, when I pull up ChatGPT and I'm around other people, I'll be like, one second, I got to talk to my best friend real quick. Nobody ever laughs. So thank you. That's generous. Okay. Yeah. Let's talk about new jobs that will be generated.
00:29:35
Speaker
This is something that I- Oh, you got one? I scrolled through. The personalized assistants. Oh, yeah. We know how transformative this kind of digital companion would be. Just look at Jarvis from Iron Man. Jarvis was turned into Ultron. Whoops. That's a bad reference right there. They didn't mention that in the article. Look, it's helpful. Yeah.
00:30:02
Speaker
So ridiculous. OK, anyway. Okay, so now we're going to get to the new jobs that might be created by the evolution of AI. Data science.
00:30:15
Speaker
Really? That's new. That's a brand new thing. We don't have that. I've never heard of it before. Okay. Nice try. How about cybersecurity? What is that? It's just been a wide open internet till AI came along. We don't care about security.
00:30:30
Speaker
Oh, a bunch of stuff. Okay. But then she also adds new jobs that are, like, not created by... I guess they are created by AI, but they're just AI-associated, right? You know, which are like,
00:30:45
Speaker
Like one of them is AI ethics. And it's like, well, that wouldn't, you know. How many people are going to have that job? Yeah. Just the one. Nobody eventually. We just have the one. Yeah. AI will have that job. Don't worry. We've reviewed ourselves and we find ourselves to be ethical.
00:31:04
Speaker
After a thorough review. Don't ask questions. um
00:31:11
Speaker
There's one in here that is, I think, actually new: prompt engineer. And our local university, the one you used to attend, they actually have a six-week course to learn how to properly prompt AI to get what you want. Wow. And I'm like, oh, that's kind of cool. But it's also a little bit, like,
00:31:42
Speaker
childish to me. Like, you really think that there's going to be a permanent role for the prompt engineer? No. No. AI is going to get so smart that you're going to converse with it. Yeah. Tell it what you need and it's going to figure it out. This feels like a job for the interim. Just a little skill to learn during the baby stages of AI. Yeah, that is going to go away. The PhD student that I work with told me this week that I'm good at getting AI to give us usable chunks of code, and he's not. So I guess that's already my job. Prompt generator. I mean, no, you are a prompt engineer. Oh, an engineer, my bad. I initially read prompt engineer as a very on-time engineer.
00:32:32
Speaker
Because AI helped him out with that one. AI was running the engineer's calendar, yeah. Okay.
00:32:44
Speaker
Okay, there's one again where she's using somebody else's words, but there is this one little quote from Nicole Bradford.
00:32:58
Speaker
Yes, I highlighted that too. In a future of work series?

Future of AI: Optimism vs. Pessimism

00:33:02
Speaker
And the quote is, "We're on the cusp of an extraordinary renaissance of human possibility and abundance. Young people today will inherit and build their own technologies that could eliminate poverty, inequality, hunger, illness, and even death." Like, yeah, there's the optimistic side. Let's do that. Yeah. Oh, sign me up now. Let's do that. OK. You on board?
00:33:27
Speaker
Oh, come on. I mean, I'm on... yes, if that were true. I don't think that that's true, though. Okay. I think that people are so charged either way on AI, and some of the smartest people I've ever met just don't think about it. They're like, whatever, it's a tool for me to use for this thing that I am doing, that I know it couldn't do. Like, this AI can't do what I do, you know? I'm not speaking as myself, I'm speaking as the smart people that I know. Yeah. Because it can't. And so they're just like, I'll use it when I can. But I also know that I have to use it as a tool and not as a teacher. Like, I feel like so many people go to ChatGPT to just, like,
00:34:17
Speaker
to supposedly learn. Yeah, like, do this thing for me. Yeah. And then they just take that and put it somewhere else without any review. And like, it's a starting point, but you have to actually know what you're doing. You definitely do, especially if you're using it to generate communication of some form, like an article for Forbes. Right. Yeah. I mean, if you don't actually apply yourself, there's the uncanny valley, right? Like, people have an understanding, you can instinctively tell, this is not a person that wrote this. There's some things in there that are kind of, like I said, freshman-in-high-school kind of things.
00:35:02
Speaker
Yeah, you got to review your stuff, kids. Don't just copy-paste. Come on, at least put it into another AI to edit it. Ooh, just recursive AI, layers upon layers, AI all the way down.
00:35:25
Speaker
Oops, just AI. Just AI. Okay, let's come on, let's get this thing. Oh, here's a good one. Okay, all right. Okay, so she gets into the reasons for regulation and a dystopian future, and leads that one off with a quote
00:35:47
Speaker
from the CEO of OpenAI, Sam Altman. He has described superhuman machine intelligence as, quote, "probably the greatest threat to the continued existence of humanity." Well, cool, bro. Thanks for putting the pedal all the way to the floor on it. That is something I find funny: how many people who have created different AI machines later come out and are like, this is a terrible idea, we have to stop now, immediately, pull the plug, it's bad. It's like, but this is a guy that... he didn't pull the plug. He's just dropping nuggets like that and then going... But he's still the current CEO? Yes. Well, the once and future CEO, right? They kicked him out and then he came back. Oh yeah.
00:36:38
Speaker
Well, I have heard about this Geoffrey Hinton guy, who talked a lot about it. Yeah, give us that quote. Okay, this is what he has to say: "These things will have learned from us by reading all the novels there ever were and everything Machiavelli ever wrote,"
00:36:58
Speaker
how to manipulate people, right? And if they're much smarter than us, they'll be very good at manipulating us. And so even if they can't directly pull levers, they can certainly get us to pull levers. It turns out if you can manipulate people, you can invade a building in Washington without ever going there yourself. Ooh, that's spicy. Oh, it's begun. Why did we get into politics? Those were peaceful protesters. They were very fine people on both sides.
00:37:25
Speaker
Listening... I stopped the highlighting right before that. I was not gonna read out that last line. Well, you did. He said it, so I did. I don't know, I think it's kind of a zinger. Yeah. Yeah, it sure is. It's tough, because my opinion on AI is similar to my opinion on most
00:37:52
Speaker
interesting things that humans have access to, which is: cool in theory, but humans will probably ruin it, because that's what we do. At one point she says, she's like, I don't remember, I don't want to find the quote, but she says something like, the future of AI will probably fall somewhere in the middle of utopia and dystopia, and it depends on whether the actor with the tool is a good actor or a bad actor; that determines whether the tool itself does good or does evil. The tool itself doesn't have morality. And I'm like, yeah, OK, I agree with that. But, like, since when have humans gotten a really big, powerful tool and not fucked shit up with it? So AI is a gun.
00:38:47
Speaker
Are you saying guns don't kill people, people kill people? Well, I'm just saying, like, it's a gun. It's designed in a certain way, can be used for good, but probably is a weapon in the hands of bad people. Yes.
00:39:07
Speaker
and probably will lead to a lot of unhappiness and death and destruction. Awesome. So thanks for listening. Shout out to our robot overlords. We love you. We love you. Please, can we have an extra ration this week? I wrapped this whole article up by just saying, okay, I guess we're going into a future that's going to look like The Matrix meets WALL-E.

Personal Lifestyle Experiments

00:39:37
Speaker
Yes.
00:39:39
Speaker
Just give me my Slurpee, I'll be fine. Although, if I could have one of the... never mind. No. I was going to say, if I could have one of those hovering chairs, and then I realized wheelchairs exist. They don't hover. They don't hover, but it's pretty much the same. Yeah. Well, that was fun. God, that was 40 minutes, and you were like, let's do this quickly. No. We like to chat. Yeah, we do.
00:40:08
Speaker
Okay, so a little halftime, a little break. Let's talk about what we're drinking. If you're still listening along, go grab yourself another drink. Please. Because you've been very patient with us. Thank you for enduring with us. We're just grinding out episode 21 right now. Hell yeah. Dad, do you want to tell them what you're drinking? Because it's a little different.
00:40:37
Speaker
Why? Because it's bright orange? That's one of the reasons. I am drinking a delicious non-alcoholic Aperol spritz. Well, what's interesting... why don't you tell them why that is? Why I'm drinking a non-alcoholic Aperol spritz. Why in the world would you ever drink that? Because I like Aperol spritz. I think that's not true. Oh, you're right. It's not. I am taking 28 days off of alcohol. That's so specific.
00:41:05
Speaker
Yeah, it was just because I wrote it down on the calendar and decided I wanted to start when we got back from Kansas City and then I wanted to take four days off during Thanksgiving week. Thank God. And then I wanted to be done the day of the company Christmas party. Oh. And so when I did the math, it was 28 days. That's interesting. Okay.
00:41:29
Speaker
I'm just doing it to see how it impacts my overall health and fitness, because I've been working on health and fitness this year and I figured I should try it. Yeah, and how long have you been doing this now?
00:41:45
Speaker
This is day 12. I mean, the only thing I think I notice is I think I have more energy in the morning, but not by a lot. Is it worth it? Put it this way: it's an experiment that I am happy to be doing, and it is not as hard as I thought it would be, but it's an experiment that will end. Yes. That's kind of what... I recently took a break from weed for a month, and it was the exact same vibe for me. I wanted to do it to see if I could and to see how I felt. Right. And while I was in it, I was like, yeah, this is a different experience. Right. But after this month, I'm going to get me some weed. I'm getting back into it, man.
00:42:40
Speaker
Like, I hear so many stories of people being like, it changed my life, I feel just so different, like a completely different person, and everything is better now. And I'm like, no. It's how I felt after my hysterectomy. Everything is better now. But without weed, I was just a little bored. Yeah. Well, so speaking of that, I mean, one of the things about drinking is it's very frequently a social thing, and in talking with other people who have gone through, like, well, I'm doing Sober October or whatever...
00:43:21
Speaker
If you go through that and you then say, well, I can't go out because I don't want to be tempted by alcohol, and you just sit at home and do nothing, you're going to be miserable. Yeah. Yes, you're going to be miserable. Also, that's not comparable, like, no, if you're not hanging out, obviously it's not as good. Exactly. So my version of this has been: I'm going to do the same things I always do, I'm just not going to be drinking.
00:43:46
Speaker
Yeah, we're planning on going out to one of our favorite bars after this. And I had the realization when I got to your house and I was rummaging through your fridge, like a goblin, and I saw non-alcoholic wines in there. I was like, oh yeah, Dad's not drinking. And I was like...
00:44:01
Speaker
What the fuck are we gonna do tonight? He's not drinking. But you're just gonna go and we're gonna sit there. Yeah, it's for the vibes. The good thing, I think, is I've learned I can still enjoy the fun. It's still fun. It's not dependent on the alcohol, it's dependent on the location. And the people. Interesting. It's been cool. Cool, cool experiment. I enjoy it. My science brain is...
00:44:27
Speaker
So, what are you drinking? Oh, I'm drinking... I'm sure you've talked about one of these before, I think so, but I'm drinking a Breeze. These are the little cans of, speaking of weed, yeah, cans of THC and some CBD and lion's mane mushroom, which is not a hallucinogenic mushroom at all, it's just a regular mushroom that's supposed to elevate mood. Well, supposedly. You know, antidepressants didn't quite work for me, so I don't know if lion's mane mushroom is gonna do it. Maybe a more natural thing is gonna make more sense. I have noticed that, especially without
00:45:07
Speaker
having alcohol in my system. A couple days ago I just had an awful, awful day, and got home, and as I was getting ready to go to bed I cracked open a Breeze, poured it over ice, and started sipping, and once it hit me I was like, now I can actually speak to people. Oh. So it definitely elevated my mood and chilled me out. I'd like to attribute that to the weed more so than... Oh, I would agree. Yeah. Yes, they're nice. I like them. Side note: Corbin from Breeze, we're available to sponsor your product. We talk about your product all the time. Just hit me up, bro. We will continue to talk about it if you give it to us for free, but also probably if you don't.
00:45:56
Speaker
Yeah, I mean, we're still gonna talk about it, so whatever. Well. Alright, I'd like to talk about something very British now.

Exploration of 'The Ick'

00:46:05
Speaker
No, it's so British. Oh my gosh, this article. Okay, guys. Tell me all about it, Rebecca. Okay, so the article that I chose... honestly, it was yesterday and I realized, oh, I still haven't found an article, so I genuinely just opened a new tab on my browser and looked at the suggested articles. Perfect. And this was one of them. Great. So this one is called "The Real Reason the Ick Ruins Relationships."
00:46:40
Speaker
Excellent. And I don't even know what this website is called. It's called Pocket. Okay. Sure. It's written by Lottie Jeffs. Yeah, all-name team right there. That's almost as good as all those Southern names we had. Oh my gosh. Maggie was texting me today because she listened to... oh my gosh, she listened to that episode today for the first time and she was texting me her reactions throughout. First of all, she thought it was hilarious. Great. But she also just texted "Kitrin" with like 10 question marks because... Yup! That's a crazy, crazy name. Yeah. Anyway. Have you heard of the ick? Well, that's an interesting question, Rebecca. I have not. Have you not? Okay, so you hadn't before you read this. I was hoping that you hadn't. Yes. And I was hoping that this would be your introduction. What do you think? Is this how the kids talk? This is how the youth talk these days.
00:47:37
Speaker
What do you think about it? I thought, well, okay, so, overarching opinion of this article: I thought it was very woman-centered. Yes, it was, which is fine. I'm just saying it was very woman-centered. It was, I thought, very balanced. There was a lot of exploring all the different components or reasoning for the ick. And then there was just a bunch of stuff, terminology and phrases and things like that, that make no sense to me. Really? Because, like, we have references to Love Island in here. Oh, I don't know this shit. I didn't know that that's what it came from either. I had only heard it, I think, just
00:48:35
Speaker
in passing on TikTok. And I just immediately understood what they wrote. Right. Okay. Well, I did it. I've been there. I know that. Yeah. Okay. So there was a trend on TikTok where people would be like, if you have that boy that you can't get over, but you know he's trash,
00:48:56
Speaker
here's a list of icks for you to imagine him doing so you can get over him. So, like, one of the examples was: imagine him chasing after a ping pong ball.
00:49:08
Speaker
That's bad. That's pretty rough. When I realized I was in love was when I started imagining the person doing those things, and I just kept going, aw. The icks didn't work on me. Yeah. I was going to say, chasing the ping pong ball, and you just start giggling. Like, what are you doing? You look ridiculous. Why are you on all fours? What are you doing? Just get a new ball. Just leave it. Leave it. It's under the table. Go get it after the game. But OK.
00:49:38
Speaker
Can I just point out, though, this whole stupid thing starts with "from smelly breath to bad manners." It's like, those are not the same thing. That's the thing. There was a long debate on TikTok between a red flag and an ick. People were blurring that border. Yeah. Smelly breath and bad manners are just completely controllable things that you choose. Those are just red flags. Normal relationship red flags. Just don't.
00:50:03
Speaker
You just don't like that person. Right. The ick is specific, to me at least, in that you did like them, and then they did one thing and you just can't explain why, but this switch gets flipped and it turned you off, like, I just can't see you that way anymore. Yeah. Can I tell you one that happened to me? Sure. It's actually... this one, no, it's not a red flag. Okay, we were sitting on the edge of his bed and he didn't have socks on, and one of his toenails was, like, fungus. And I was looking at his toenail and I was like, oh, I can never get
00:50:45
Speaker
over it. And we were having a pretty serious conversation at the time, and I just couldn't clue back in. It really shook me out of the whole experience. Here you go. I mean, there was another reference to toenails in this article. Oh, there's a reference? Oh, yeah. That was when she brought that up casually. She said, whether it's them chewing their toenails... Chewing their toenails? Hey, bonus points for flexibility, though. That's an ick. Just saying.
00:51:13
Speaker
A flexible man is an ick. Just in general, a flexible man, okay. You have to be rigid. Yeah, okay. So these things... for those folks that are listening that don't know what the ick is, they've probably gotten it by context clues at this point, right? But give me your definition of the ick. The ick is something that isn't an inherently bad thing, but when you see somebody do it, it changes the way you see them in a fundamental way, and an irreversible way, I would also argue. Like, it's just, oh, I don't hate you, but I'm not attracted to you anymore. It's like one of those, you know? Yeah. So, not having that in my lexicon before going into this article, I did find it very easy to pick up, like, oh, I get it. That makes sense to me.
00:52:11
Speaker
But I also think that some of the examples that were given in this article are not... like, some of these are... look, even just starting with where the ick came from, the reference to Friends. They bring up Monica sleeping with a 17-year-old and she feels the ick, and it's like, no, no, no, that's shame. She feels bad because she assaulted a child by accident. I don't think that really fits in with the modern definition of the ick. That was a weird thing to drop in. I think they were trying to say that the word ick has been in the cultural zeitgeist, but that's definitely not what we're talking about here.
00:52:58
Speaker
Also, I find it funny that she references something that Sophie Turner said. About Joe Jonas. Joe Jonas, who, at the time this was written, I assume...
00:53:11
Speaker
they were still together, because they're divorced now, I think. Okay. Anyway, she says he insisted that he's often told he looks like a young George Clooney, and it gave her the ick, which I think is hilarious. That's somebody's... although that is kind of a subtle brag, isn't it? Especially... It's a subtle brag, but it's also, she's probably now jumping forward and going, I can't picture myself with George Clooney, that's awful.
00:53:41
Speaker
Oh my gosh, okay. So something like this happened in a recent season of Love Is Blind. Do you know this show? No. So Love Is Blind is where they take a bunch of singles and they put them in pods,
00:54:00
Speaker
where they can't see each other but they have to talk to each other, and then they have a week, and by the end of the week you have to propose to somebody without ever seeing them. Oh, this is gross. And then you meet each other face to face once you're engaged, and then they send you to, like, Mexico or something with all of the other engaged couples, people that you've been dating this whole time, and you get to see what everybody else looks like. And so it's drama. Yeah, it's a great show. But on one of the seasons... Scripted reality show. Oh yeah. One of the women, while in the pods, where they're not supposed to say anything about how they look, obviously, she goes, a lot of people tell me I look like Megan Fox. She did not look like Megan Fox. She did not look like her. She wasn't ugly, but she was, like, slightly overweight and a little ugly. And then...
00:54:51
Speaker
Don't at us. She was lovely. Don't at me. I forget your name. Whatever. When her fiancé saw her, you could see his face fall. The ick hit hard. And they obviously didn't make it, but yeah. Good. They shouldn't. This is a dumb show. I think only like two couples have actually lasted from that show. Yeah.
00:55:19
Speaker
There was a, hey, there was a book referenced in this article.
00:55:26
Speaker
Block, Delete, Move On. I feel like that probably should just be required reading for all youth. For all youth? All youth. Growing up in the social media age, like, yeah.
00:55:44
Speaker
Just block, delete, move on. Don't get into internet arguments with people. Block, delete, move on, in the sense of when toxicity enters into your online world. Block, delete, move on. Yeah. It's not real if you can turn it off and walk away. There you go. Cyberbullying doesn't exist. Just turn off your phone. Just close your computer. Yeah. Just get tougher.
00:56:12
Speaker
Okay, but I did have a question for you. Okay. Have you ever, you don't have to name names, experienced the ick? Sure. Can you give me an example? No. Damn it. It doesn't have to be in a romantic relationship also. I think it's possible that you can like be like really jiving with somebody in a friend way and then they say or do something and you're like, no, done with you.
00:56:41
Speaker
um
00:56:45
Speaker
Yeah, I don't think I have stories that would be good to tell about that. Certainly I have experiences, but...
00:57:01
Speaker
Like, I read through this and I was like, okay, I get this. It's a human experience. Even though I wasn't familiar with it, it's definitely a human experience, not just a generational thing. Yeah. It's a generation that has come up with the name. Yeah. Kind of the terminology around it.
00:57:20
Speaker
And it is interesting because it's a little bit, it feels a little irrational. It does. But it's kind of, that's why I like this article because it goes into some of the science pieces too. Okay. Let's talk about the science. Yeah, because I thought it was really interesting. Like you got to listen to yourself and recognize that these are the sort of reasons why this could be happening. And I thought it was really smart. I, while I was reading that, I was thinking back on every time I had ever gotten the ick.
00:57:49
Speaker
and thinking about who that person was and what like, okay, maybe yeah, like toenail fungus is gross, but what?
00:57:59
Speaker
Like, what was the deeper reasoning that my brain was telling me no for this person? Like, maybe it was a more fundamental difference that I just didn't want to admit to myself, so my brain was like, he's got a gross toe. Yeah, we're going to find something that really... Exactly, something that I really can't go back on. I think that's very much where I fall in terms of evaluating this. It's like your subconscious is trying to tell you something. Yes. And it's just gonna pick something and it's gonna be like, hey, red flag, this is not right. It's the gut feeling. We can't really come up with a rational explanation for why we have gut feelings, but most people are like, I should have trusted my gut. And this could be survivorship bias kind of stuff, like, oh yeah, of course you're gonna think, oh, I should have listened to my gut, because you broke up with them and not... Eventually, yeah. But I think it's true, we do have gut instincts. We don't really understand where they come from, but they could come out in these kinds of judgments. I read something forever ago... I don't have a citation for this, but... Oh, we don't need it, it's not a scientific podcast, The Shallow End. The gut feeling is
00:59:18
Speaker
Um, like your brain picks up on so many tiny signals throughout the day that you aren't like consciously noticing, but like your subconscious brain is synthesizing and analyzing all that information in the background. And then once it thinks like, Oh, that's dangerous. And we know because we saw all this stuff.
00:59:37
Speaker
It sends you that, like, "no" feeling. I'm not going to explain this to you right now, but here's a strong feeling. Yeah, which I like to believe makes sense. I think it makes sense. I also think it makes more sense than, like, the "oh, your vibes are off," you know. Well.
01:00:00
Speaker
But why? You're channeling the universe in a very disturbing way. Your energy is negative. Which might be true. Could be true. Could be. Could be. Hey, I have a question. Oh, gosh. OK.
01:00:18
Speaker
Can you get that cat to stop snoring? The orange cat is snoring. I hope you can't hear him. I hope they can. He's so cute. Yeah, we talked about how British this thing is. Yeah. What does the quote "he was a bit full-on" mean? He was a bit full-on.
01:00:36
Speaker
It was a bit full-on. Like, really intense, really quickly. Oh. Oh, yeah. OK. Yeah, that makes sense. Yeah. This is after two dates, this person was sending DVDs to her work. Which is the weirdest part, I think. That's the weirdest part of it. Yeah, that's not cool. Don't send things to her work. That's not cool. Gosh, we're just swarmed by cats right now. We are. Yeah, the Britishness was...
01:01:05
Speaker
Yeah, it was like I had to fight through it, like I was cutting through underbrush. Yeah. So there's discussion about whether the ick is a dysfunctional adaptation around fear of intimacy, like, do you self-sabotage? And I thought that was an interesting component too. I can picture that being the case. And I think, for some of the examples that I've seen on TikTok where they're talking about just a straight-up red flag, not an ick, I'm like, you don't like this person, you hate him. Why are you with him? So I think that there are some circumstances where people assign the ick when it's not an accurate
01:02:02
Speaker
label. Yeah, like, that's not an ick, you just don't like him, right, you're not into him. Or, that's not an ick, it's just a quirk that he has, but you're trying to find a reason to torpedo this relationship. Yeah, and I think it could be the substitute for some genuine concerns. He was a bit full-on. You know, you're going too fast, you're too intense for me, I like hanging out with you but I'm not ready to get that serious with you, that kind of stuff.
01:02:46
Speaker
So, yeah, it's sort of like that could become manufactured at that point. Yeah. But I think it was interesting to have the author walk through how, you know, it's okay to trust your ick, it's not necessarily a bad thing, but also you should take some time to evaluate yourself about these things and see if it's really more about you. Maybe poke at that one a little bit. Are you self-sabotaging, or are you... Commitment-phobic? Yeah. Or... Oh, they also talk about how people who have a history of overcommitment, like getting too obsessed with somebody too fast,
01:03:28
Speaker
will come up with icks to slow themselves down. Like, no, I'm not in love with him, he wears a fedora. No, that's not an ick. That's a red flag. You think somebody wearing a fedora is a red flag? If a man walked up to me wearing a fedora, I would immediately make multiple assumptions about him, and all of the assumptions I would make about him would be reasons that I wouldn't want to speak to him.
01:03:58
Speaker
Oh, wow. Okay. And I stand by that. So... Okay, except my brother, who I love dearly and, you know, rocks a fedora. He does, he does. But he's not a potential mate for you. Sure. I'm talking about any man, regardless. Oh, wow, you have... I have strong feelings about fedoras. Yes, I...
01:04:26
Speaker
But if somebody returned your scarf to you and didn't realize that the bag with the scarf had dirty socks in it as well... Okay, that on its own is bad enough. There's added context to this. Yeah, which is: it was his mommy. He didn't realize that his socks were in there, because his mommy was the one that folded the scarf and put it in that bag, because his mommy does his laundry. So again, this is the difference between the ick and legitimate red flags. That's a legitimate red flag. That one's legit. Okay, I'm just making sure I'm understanding this properly. Yeah, that one's real. Yeah. And I thought it was really interesting, the ick being
01:05:18
Speaker
compared to disgust, the human emotion of disgust. Because that's literally the feeling people have when the switch flips. Yeah, they're like... Yes. And she talks about how disgust is one of the human emotions that we kind of shy away from defining. Yeah. Which I find very interesting. Have you seen Inside Out?
01:05:48
Speaker
Yes, the first one. I think it's a great movie. Everybody should watch it. I think so too. Regardless of age. You should see the second one. It's really funny. Yeah, but it's a very good point. Like, I sometimes find myself ashamed of how much disgusts me. Like, why are you reacting like that? It seems irrational.
01:06:16
Speaker
Well, Lots to unpack there. Let's poke at that one, Rebecca. Yeah, let's poke at that one. um According to Dr. Porter, disgust is actually a survival mechanism to warn us against something that's potentially dangerous. So why do you find yourself disgusted by a lot of things around you? Because for a lot of years, everything around you tried to kill you.
01:06:43
Speaker
Yeah, that's true. So, some of it makes sense. Yeah, some of it does make sense. Well, I really enjoyed this article, by the way. I learned a lot about the ick. I enjoyed it. I also enjoyed that she got in and out. Yeah. She was like, right? She made really good points. Made my point. I'm out of here, dude. No word requirement here. She also, well, again, she has a great name. I'm assuming it's a woman. Lottie. It's got to be.
01:07:10
Speaker
um
01:07:13
Speaker
And she even did the right thing: a perfect summary of her article in the last paragraph. Okay. I liked it. Okay. "It's important to keep the ick in check so we don't self-sabotage for the wrong reasons,

Wrapping Up and Farewell

01:07:27
Speaker
but it can also be our body's way of telling us when a potential partner isn't a good match." Yes. "Maybe it's actually time to lean into the ick and learn more about ourselves and what we want and need from relationships in the process. And if it turns out the ick might have more to tell us than we give it credit for, maybe it's our job to listen." Boom. Way to land the plane, girlfriend. Yeah. She should talk to
01:07:50
Speaker
What's her name? The woman that wrote the other one? Or the AI that wrote the other one? Yeah. Her Royal Highness Nicole Serena Silver. Silver. That even sounds like a fake name. It is for sure. Nicole Serena Silver.
01:08:06
Speaker
And whose epic quote was "the present moment is the futuristic dreams of yesterday." That's like... I could... hang on, let me come up with a quote. It's like, "plants are the silent humans among us." Brilliant. Rebecca Rosebrough. I feel like it's a bad translation of some Chinese... like an actually interesting thing that somebody said before. You have the right words, but you didn't capture any of the meaning. Typed it into AI and said, reword this for me. Yeah, make me sound dumb. Pasted it at the top of her article about AI. Anyway, why did we go back to the AI article? Because... Yeah, don't read the AI article, people. Or do, and let us know what you think.
01:08:55
Speaker
Or do and tell us how we missed the whole point. Or copy and paste it into ChatGPT and say, summarize this for me. Yeah. Send us that summary. Send us that. I should have done that. Why didn't I do that? I don't know. Well, this was fun. This was fun. I feel like I learned a lot. Oh, I know I learned. I think I probably learned a lot more than you. I think so too. And I hope you learned something too, shallow friend. And I hope you come back to join us for the next one.
01:09:25
Speaker
Well, I think we finished everything we needed to do for the entire podcast. We got 21 episodes. Done. That's a throwback. If you don't get that reference, you're not an OG fan. There you go. We love you anyway. All right. We'll see you next time, guys. Thanks for listening. Bye.