
HEY A(i)24 - KNOCK IT OFF

These Guys Got Juice

Why the industry and audiences alike need to flatly reject AI in filmmaking

It's Labor Day, and Tony joins Doug for a special episode to discuss the new announcement regarding A24's AI division and what can be done to stop it.

Transcript

Lighthearted Introduction

00:00:00
Speaker
This is a test of the emergency broadcast system. In the event of an actual emergency, you would have been killed.
00:00:17
Speaker
How are you doing? I'm doing well.

Exploring AI in Film: Initial Thoughts

00:00:19
Speaker
I was going to do some reading on this AI stuff, more than I already had, but honestly my day was far too busy in the end, so I just didn't get the chance. But that doesn't mean I'm not as fired up and willing to dig in.
00:00:36
Speaker
I'm not going to say we're just going to be passionate and misinformed, but we can even do some brainstorming.

Practical Approaches to AI in Film

00:00:44
Speaker
You know, this doesn't have to be a long episode, but...
00:00:48
Speaker
Maybe figure out some practical ways. Because I feel like a lot of the discussion on this has been, what are you going to do about it? Or that a simple boycott from the audience level won't suffice. And I agree that there needs to be more, but that doesn't mean there shouldn't also be a united front. It needs to come from other angles. The real blow will have to come from the talent, the filmmaker side.
00:01:25
Speaker
They kind of just need to put their foot down, because that's what A24 has built their whole branding on. Like, hey, look, we've assembled all these people. The talent pool, for sure.
00:01:37
Speaker
Yeah. Well, I guess we should do some table setting, just for the audience at

The A24 Phenomenon

00:01:41
Speaker
home. Right. We're talking about AI and its ramifications throughout the whole film industry, looking at it from the perspective of the recently announced A24...
00:01:52
Speaker
Like this studio subsection where they're going to be pushing into blockbuster filmmaking with a reliance on AI-animated films. So the announcement came in... was it a profile in... it wasn't Vanity Fair. The New Yorker?
00:02:09
Speaker
It was The New Yorker. Yeah, and the whole profile kind of laid out the history of A24, how they like to lean into controversy, into being disruptors. They gave the example of the Church of Satan endorsing The Witch and stuff like that.
00:02:26
Speaker
But I'm like, well, that's smart marketing. Even though they sometimes have questionable release strategies, I think they're good about leaning in, or encouraging their filmmakers to lean in, to something that might cause a conversation. They want to be part of the culture; that's what they value. And they were trying to frame this whole AI division as them pushing that agenda.

Skepticism and Ethics in AI Filmmaking

00:02:51
Speaker
But that's different. And that's why I felt like we needed to talk. Because...
00:03:00
Speaker
...if their main goal is to be culturally relevant, part of the conversation, then I feel like if all the conversation about this is bad, they'll have to think twice about it. Yes, we're just two schmucks. We can't, on our own, stop the big machine. Although...
00:03:23
Speaker
A24 is simultaneously too big, but also, hey, they're just a small birthday boy, at the same time. They have it both ways, and the people defending them are trying to have it both ways too. And hey, I've also seen the sentiment...
00:03:39
Speaker
...of, what, did you expect a big production company to be ethical? And that's the same thing that gets me angry when someone says, oh, did you expect a politician to be good? Like, no, I didn't, but I'm still allowed to be upset. Yeah.
00:03:59
Speaker
Isn't it their job to be a good person at their job, right? And isn't doing something like that technically a representation of them being bad at their job?
00:04:11
Speaker
And shouldn't we be approaching AI with that same kind of viewpoint? That's how I'm looking at it. Because, as part of the table setting, think about it from this perspective:
00:04:22
Speaker
Who went on strike when the strike happened a couple of years back? It was the writers. It was the actors. It was the cinematographers. Not the directors. There's a reason for that.
00:04:32
Speaker
There's a reason why storytellers, at least at the upper-echelon level, are staying away from that: they see, from their perspective, that if AI does become this kind of thing, there's still going to need to be a creative force behind an AI project. They're safe.
00:04:49
Speaker
They're like, you can't replace an auteur with AI. Right. Exactly. From their perspective. Well, that's how we're framing it. Because, again, if you're thinking about an AI film, an AI-animated film, where everything is computer generated, from a visual standpoint, from an actor's-likeness standpoint, even from a scripting standpoint if they use a language model...
00:05:12
Speaker
...then at that point, it renders all of those people who were on strike completely useless. But as we already established, the person who "came up with the idea," in quotations, is the one who's leading the project.
00:05:25
Speaker
So bring that back around. Right. You're right in the sense that these are "small," in quotations. They were a smaller studio that was able to gain prominence because they threw a bit of money at up-and-coming filmmakers, or established greats who were having trouble getting funding for their next projects.
00:05:46
Speaker
And they got big results. And the reality is, they kind of modeled themselves after Blumhouse. They did the same thing, where they only spend a couple million dollars on something, and the budget's only around that ballpark anyway.
00:05:58
Speaker
Or they pick up a film's distribution at film festivals, and it turns a profit because it was such an ultra-indie movie. And now we're at the point where AI itself will cost a lot of money through the energy production. But if it's made in-house like this, with the big names they've essentially farmed through a decade of building up goodwill at film festivals and such, they'll be able to attract a lot of big names who'd be open to this idea based on their past relationships.
00:06:29
Speaker
I would even compare it to something like David Fincher with Netflix. We can all agree that the streaming sphere has been bad for filmmaking in general. And while we appreciate David Fincher's contributions as a filmmaker,
00:06:43
Speaker
we cannot ignore the fact that his direct involvement with, and appraisal of, Netflix as a studio allows it to move forward unquestioned. It gives it legitimacy, right?
00:06:54
Speaker
And that's the big question and problem in all of this: how much legitimacy are we willing to afford the other side in this conversation, the tech billionaires trying to infiltrate artistic spaces?
00:07:07
Speaker
And that's where we come in. That's where the normal guy comes in, because we are the buffer for what is allowed and not allowed, and that comes from dollars and from speaking up. Yeah, I think we can vote with our wallets. It's not going to be the only measure that puts a stop to this, but they do need us to consume the product. So that will hurt them down the line if any kind of boycott gains traction. Because, honestly, where I'm at now, unless I...
00:07:42
Speaker
...hear some major assurances... and I fully don't want AI, full stop, but I am willing to consider... I don't even know what those concessions would be until I hear them. That's kind of the question.
00:08:02
Speaker
If they're guaranteeing... it basically comes down to a labor thing. If you can show me, 100%, that this is not going to be replacing anyone, then fine. But I don't believe that. I wouldn't believe it if they said it. So where I'm at, I'm...
00:08:27
Speaker
...just going to be pirating all their movies going forward. Like, do I need to see it in theaters? There's lots of other stuff for me to see, and I just don't want to support this. That's what it comes down to. And yeah, that's me alone, and whoever's listening, that's up to them. I know, I assume, a lot of the listener base are movie nerds, film buffs, people who go to the movies quite often. So you're like, oh, well, I go see everything. You want me to just not see this?
00:09:03
Speaker
Yeah, but it's not like there's nothing else for you to see while this is happening. There's plenty of other things. And also, this stuff is on demand in like a month. So, I don't know, just wait a month and then download it.
00:09:20
Speaker
Look, if you have to see it opening weekend... and I always say this to people, and some people freak out when I say it: just buy a ticket for another movie. If you truly need to see it that weekend for whatever reason, just buy a ticket for another movie. Give a better movie your money rather than funding something you may not agree with.
00:09:40
Speaker
Because the thing is, that's the way AI should be treated. We should be able to look at it like food and know if preservatives have been added. And it's different from food in the sense... well, it's actually the exact same as food, where there are chocolate bars labeled cruelty-free, meaning they didn't use slave labor to make them, right?
00:10:00
Speaker
Movies should have that same kind of respect given to them, especially considering this is not a creative tool. This is a political one. And I'm glad you brought up the thought that we don't trust them, right?
00:10:15
Speaker
And this goes into a bigger topic with AI in general, because there's this surrounding topic of how the bubble is going to burst very soon, which renders this entire conversation obsolete, if that's true. And I'll get into that later on.
00:10:28
Speaker
But let's just take them at their word. That's the way I'd phrase this. Let's take the AI people at their word. They say these AI slop video creations are going to replace traditional footage, right?
00:10:46
Speaker
When was the last time you wanted to sit down and watch some AI content? Some people do do that, right? But what's the context of the people who watch AI? It's TikTok reels. It's Facebook posts.
00:10:59
Speaker
The concept of trying to sell somebody on sitting in a theater and spending $10 to watch something that's entirely AI generated... I don't think it's there.
00:11:10
Speaker
It's the same thing we're watching with the MCU. You had all these projects flood the streaming services. It devalued their market. It devalued their property.
00:11:22
Speaker
And it trained the audience to think, okay, instead of going to the movie theater to watch a Marvel movie, I'll wait for it on Disney Plus a month later. And the same thing happened to Star Wars too. The thing is, these corporate people are getting in the way of their own ideas.
00:11:36
Speaker
Going back to how we can't trust them: we haven't seen a full feature animated in AI that's provably good in that sense. We've only seen small production design elements inserted here and there to test the waters.
00:11:51
Speaker
And every time it's happened, it's been major news. And the reason it's major news is that even the layman film fan detests AI. And the reason they do, and the reason we have a whole lot more power than people like to think, is that this whole problem goes away the more educated you are.
00:12:12
Speaker
Just the moment you start reading a bit more about it. And I even started this by saying I didn't do enough research going into this. But I'm saying, just on a basic level: understanding how the tech works, understanding where it gets implemented, understanding how many jobs it affects down the line.
00:12:30
Speaker
If you just understand these few things, then any conversation you get into, whether it's about Late Night with the Devil or The Brutalist, all of those conversations become meaningless, because the implications are far greater than whatever momentary thing you can excuse.
00:12:46
Speaker
It's not just that one movie. And besides those examples, I remember Alex Garland's Civil War had those posters, which, in hindsight, yeah, they were testing the waters of what is allowed. I mean, people clocked it.
00:13:07
Speaker
But maybe there just wasn't enough of a stink, by their measurement. Like, okay, well, you caught us there, but they still want to do it; the end goal is a full AI movie. And like we've alluded to before, this technology, besides its environmental cost, has an actual monetary cost. Are they even saving money by doing this? Because that's what it comes down to: they don't want to pay people, and this is just a way for them to circumvent that. So how much in labor are you even saving, besides the soullessness of the product that would come out?
00:13:54
Speaker
Tons. They're saving tons in labor. Because think about it from this perspective, Doug: the concept of reshoots is done if AI gets through. Now you don't have to go back to the set, you don't have to go back to the location, worry about weather and all that.
00:14:10
Speaker
You don't have to spend a bunch of money on lighting and catering to do it all again. If you "create" something, in quotations, in AI, then watch the final product back and say, this didn't really work in this part, you can just adjust it. That's what's alluring to them, because that promise sounds so perfect from their perspective.
00:14:31
Speaker
The problem is the format itself. I'm not even talking about it from the "animation being good enough" perspective. I'm talking about it from...
00:14:42
Speaker
...there is an integrity to different formats. There's a reason somebody doesn't usually pay to see a digital video movie on screen. There's a reason DV tape didn't take off the way it was supposed to with indie films.
00:14:57
Speaker
These formats create an association within the mind's eye of the consumer. And if it's devalued through the way it's initially presented, then there's a ceiling on the success it can reach.
00:15:09
Speaker
And when I look at AI's success and reach, I don't see it reaching a full-length feature film unless it's moneyed interests pushing it forward. So that's where this whole conversation and debate comes in.
00:15:22
Speaker
But it's really a game of hucksters tricking film executives into thinking this is the next big thing and they'll make a lot of money from it. And what they're going to end up doing instead is running a get-rich-quick scheme on these studios.
00:15:35
Speaker
And to some degree, these studios deserve it. It'll be funny to watch them buy into this tech and fail. But the reality is that general people, beyond those Facebook memes, I don't think they're going to pay for this. I don't think they're genuinely interested in seeing more AI content. I want to believe that's true, and to an extent I think it is. It's just that now there's more scrutiny every time something gets promoted, or there's some kind of image where I'm like,
00:16:08
Speaker
is this just shoddily rendered, or is this AI? It's going to be harder to tell the difference. And that kind of fuzziness will let them slip it past us, because right now there's no law that says you have to disclose it up front in your marketing for the movie, which there should be. And that's where the problem stems from. There needs to be regulation.
00:16:30
Speaker
That's where all of this comes from: the governments, all of them, are essentially just going, go wild. They're saying, you go and radically work in the AI centers, you make this as highly capable as possible, and then they'll come back and cut it down later, right?
00:16:49
Speaker
The problem is, they've given them this massive leash where they can walk wherever they want, but they haven't really accomplished any new goals. They haven't innovated in the way they thought they could. And it's the same as other tech, right?
00:17:07
Speaker
Like AR. They always say VR and AR are the next big technological things, in terms of replacing your cell phone, and they've been saying it for like 30 years now.
00:17:17
Speaker
And the tech just isn't there. And when I look at current AI, I just don't... We all have Google Glass, I don't know what you're talking about. I'm wearing it right now. The viewers, the listeners, can't see, but I have my Dragon Ball Z scouter on and I'm scanning people's power levels.
00:17:35
Speaker
That's in one lens. In the other lens, I'm watching porn. I'm watching porn and then, you know, scanning everyone. See, you can have two level 9000s: one a threat level on the street, the other just a level-9000, grade-A porn star.
00:17:51
Speaker
Yeah. No, but a lot of the new tech they've tried hasn't taken off. Though some of this stuff, I do think, has potential. VR, for example, I feel has huge untapped potential for gaming, even interactive storytelling experiences, a lot of different things that could be done. Then again, it's an expensive buy-in, so I've only experienced it through other people's equipment. So maybe it's already gone to the next level. Maybe VR has already had its Avatar and I just don't know about it. I don't think it has. Maybe Half-Life: Alyx, if you're really twisting arms. It's supposed to be good, at least. I haven't played it.
00:18:43
Speaker
You're referring to killer apps. You're also referring to the consumers themselves taking ownership over the tech. And that's something the VR and AR industry hasn't really experienced yet, because the proliferation isn't there. The tech isn't cheap enough.
00:18:59
Speaker
AI has almost the inverse problem, where all of this stuff is available to the public. You could make your own language model right now if you wanted to. The code is out there for you to experiment with, and there are ways for you to work with pre-established tools to test their limits.
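As a concrete illustration of that point, here is a minimal sketch of experimenting with one of those pre-established tools. The choice of the open-weights GPT-2 model and Hugging Face's transformers library is an assumption for illustration; nothing specific is named in the episode.

```python
# Minimal sketch (assumed setup): generate text with a small open-weights
# language model via Hugging Face's transformers library.
# Requires: pip install transformers torch
from transformers import pipeline

# Download and load a small, publicly available model (GPT-2 is illustrative).
generator = pipeline("text-generation", model="gpt2")

# Prompt the model and print one short continuation.
result = generator(
    "The future of independent film is",
    max_new_tokens=40,       # cap the length of the generated continuation
    num_return_sequences=1,  # return a single sample
)
print(result[0]["generated_text"])
```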
00:19:19
Speaker
The problem is that even though it's given you these firm design philosophies for what AI tech should be used for, it's not as efficient in those fields as people once thought.
00:19:33
Speaker
It may help with some minor research, but even in those fields it fails. So the reality is, there's so much promise about what AI could be, in so many different fields:
00:19:45
Speaker
medical fields; lawyers, even, have talked about it. But you're not seeing that transition happen on those levels, because the tech just isn't there. And the hobbyists who are supposed to be pushing the tech forward aren't able to find those exact use cases where it can replace people.
00:20:03
Speaker
It's like with any invention, right? An invention is made for a purpose, and then the invention is co-opted by the people. The people find new use cases for it.
00:20:13
Speaker
And sometimes those inventions end up predominantly used for those other use cases rather than what they were originally invented for. The problem with AI is that we're constantly being told what it's being used for.
00:20:25
Speaker
But it's not actually being organically used on the user end to that same degree. So you're just getting all of this funding in film and television, because they see it as an easy way to integrate the tech.
00:20:36
Speaker
And every time they do so, they fall flat on their face, because, again, there's an abrasion there. There's a lack of understanding from the people. And in order for there to be understanding from the people, there needs to be trust.
00:20:48
Speaker
And they're doing nothing to instill trust within the audience. Right, because like we've said before, they don't need to tell us; it's just a matter of us figuring it out, because it isn't indistinguishable from actual art yet. So we clock it, and then they're like, oops.
00:21:10
Speaker
And so there is a desperation there, too. You can feel how hard they're trying to push into all these different industries, and now film. I'm hoping for, like you alluded to before, an incoming bubble. I mean, Meta has laid off a lot of...
00:21:31
Speaker
...people from their AI division. And I feel like I've heard of other companies, too, that are, if not shuttering that division, definitely downscaling it. The initial boom they thought was going to happen just didn't take. So I don't see how it's going to be profitable enough to match the money they've put into it, because they really do need it to succeed on that huge scale, and it just doesn't seem like it's going to happen. So that's what I'm hoping, fingers crossed, is coming. Well, the reason this whole bubble conversation started was that ChatGPT had its GPT-5 update, right?
00:22:11
Speaker
And that was the thing that inspired this whole pushback: the promise was that it was going to be a much higher degree, more talented, than GPT-4. And it didn't meet those expectations. It actually failed to meet them by a wide margin.
00:22:27
Speaker
So like I was saying before, there's a ceiling on how much this tech can evolve. And part of that ceiling is the power generation. Like, literally, those server farms that have to drain people's water supplies, water that's no longer for the people, just in order to work, right?
00:22:45
Speaker
If that's what it takes to make something that isn't quite there work, people aren't going to be on board. Right. And the interesting thing is, if this bubble does burst, you have all of these film studios who have invested heavily in this tech.
00:23:03
Speaker
The problem is that they're invested, period. So the only reason we're seeing them try this tech out is that they think they're pushing it forward. But really, they've hitched their line to a failing, sinking ship.
00:23:17
Speaker
It's as if, 10 years ago, a bunch of people had invested in the high-frame-rate tech that was being developed, and then it goes nowhere.
00:23:27
Speaker
That kind of did happen with the Hobbit movies and even Avatar. It doesn't catch on, because it doesn't really add as much as it implies it's going to. And if the audience can't feel that,
00:23:40
Speaker
and if there isn't enough incentive for the filmmaker to take it on, then it completely falls apart. That's why this AI conversation has been spinning its wheels this entire time.
00:23:51
Speaker
And every time this conversation comes up, you get the same kinds of arguments from people. We were talking about the nihilism, the people who say, what can we do about this? But what we've already done has pushed back pretty far, to the point where these films have to come out first, and then it gets revealed that they used AI.
00:24:11
Speaker
They're ashamed of it now; it has to get revealed. They know getting caught doing it is bad for them. Totally. And look, I brought up Brady Corbet earlier. Now he's so consumed with guilt about it that he went to a film festival and tried to say, oh, F.W. Murnau would have used AI. And that's desperation at that point. That's you trying to imagine how a filmmaker you consider great would have worked with this.
00:24:41
Speaker
And that's why we need to call out the Radu Judes; that's why we need to call out the people who made Late Night with the Devil. Because even if they're trying to do this on a very ultra-indie level, their roots are in corporate interest.
00:24:57
Speaker
Talk about major sellouts. I thought we hated sellouts. Let's fucking kick them to the curb if that's what they're doing. They're just moving corporate interests forward. Even if we allow it as commentary on exploitation or whatever... I haven't seen the film, but there's no point, in my mind, that necessitates that.
00:25:19
Speaker
Well, I can just picture the enlightened-centrist film critic who goes, oh, well, you haven't seen the film, so you don't know what point they're trying to make. And the problem in this conversation is that their point doesn't matter.
00:25:32
Speaker
They used the tech. Like we've already said, it's not a creative decision, it's a political one. So the moment you do that, you're signaling to something much larger, even if you were making commentary on it. It's like, I made a movie about how murder is bad, but then I still killed someone.
00:25:54
Speaker
Exactly. Yeah, why? Why couldn't that point be made without the evil being committed? You shouldn't be applauding someone who's taking part in the thing, even if they're doing it in a comedic sense. "I'm doing it ironically. It's just ironic, you don't get it. Am I even seriously doing it?" You know who's doing that right now?
00:26:19
Speaker
Fucking South Park. They're using AI in a sarcastic, satirical way. And if we're going to... Are they, for their writing or their animation? Which? For their animation. They've invested in deepfake tech, and in the newest intro of the season they had a Donald Trump deepfake.
00:26:38
Speaker
And it should also be noted that apparently that tech is going to be used in the film they and Kendrick Lamar are doing. So that makes me a little worried about that one. But anyway, what I'm bringing up with this is, we see South Park as cheap.
00:26:55
Speaker
We don't hold it in high regard. Even the fans of it look at it and say, it's satire, but it's lowest common denominator, kind of libertarian; it feeds into the same sensibilities sometimes.
00:27:07
Speaker
If Radu Jude, who is an arthouse filmmaker, is feeding into the same sensibilities South Park is, do they really deserve to be called boundary-pushing filmmakers? Do they deserve to be called somebody pushing the medium in a good direction if they're partaking in that same tech?
00:27:23
Speaker
And that conversation leads into somebody like a Harmony Korine, where he's built up this goodwill over decades making boundary-pushing art, and now he's just making shitposts.
00:27:35
Speaker
And people go, well, what would you expect from Harmony Korine? He always makes shitposts. The problem is, his shitposts in the past didn't carry meaningful harm in the making of them.
00:27:47
Speaker
Even a Trash Humpers is not the same degree of harm, in that sense, where you just have non-professional actors doing weird things on camera.
00:27:57
Speaker
It's closer in relation to Fight Harm, the unproduced film he made. These are things that could actually produce harm in what they do going forward. And people are giving it a pass just because that's their expectation of Korine.
00:28:11
Speaker
And that's a wrong instinct. We shouldn't be letting people slide because of our preconceived notions of who they are. If they're doing something that's wrong, and it's in line with their worldview, guess what? It's still wrong, and we should still talk about it and criticize it.
00:28:29
Speaker
People give them passes like that because, yeah, there's a precedent set of, oh, this is boundary-pushing, or they're provocateurs, or whatever. But there's also this fandom mindset. Some people like to think that if they're not Marvel or superhero fans, they're above it. But no, it's infiltrated every kind of media consumption, and that goes down to independent films. People make the brands their identity, and A24 encourages that. They have a subscription where you can get...
00:29:07
Speaker
...stuff. I don't know what you even get with the AAA24 membership, I guess merch, or maybe early tickets and screenings, stuff like that. Yeah, but it's like the wrappers from a movie or something, I don't know.
00:29:21
Speaker
Some bullshit like that. But people eat it up. That's definitely part of their whole thing; part of their success is that they've cultivated this loyal following. And that is dangerous, because then you get large swaths of people carrying water for these corporations, going, no, no, no, it's fine for these and these reasons. Most of the time they're not even coming at it from "I like this particular filmmaker, I want to see what they do with AI." It's...
00:29:52
Speaker
...purely just the brand, A24 itself. Because, yes, the other thing also exists, where people treat their favorite filmmakers like sports teams that can do no wrong and cheer aggressively for them. But coming at it from either angle is bad when it means you're being blinded to...
00:30:18
Speaker
...these bad practices, or just making excuses for them. Then you're not really a fan of the art. You're a fan of the consumption, and of how that defines your identity. That's the problem. So many people now are...
00:30:38
Speaker
And I don't want to make it sound like being passionate about something is bad. It's when someone makes it their full, soul-defining thing, when their identity is this company or this brand or whatever. That is scary. That's dangerous.
00:31:02
Speaker
Mm-hmm. But you know what it's like? It's like IKEA. A24 resembles IKEA more than it does a film business, because they focus so much on brand loyalty, as you've already outlined. When you buy an A24 hat, when you buy an A24 shirt, there's the implication that you're supporting the little guys. You're supporting a brand that supports smaller filmmakers, and therefore they're just, in your eyes.
00:31:29
Speaker
So there's that association; there are good feelings there. Bringing it back to IKEA: you go there and you buy furniture, but you also get the meatballs, and the meatballs are delicious.
00:31:40
Speaker
Same at A24: you buy their hats, or even the chocolate bars, and they're delicious. But then that distracts you from the underlying mechanics of the industry.
00:31:51
Speaker
At that point, you've bought into whatever they've sold you. And now, like you said, it's like a sports team, where there's that brand loyalty, and they have to push back almost instinctively, because they've treated it like it's homegrown. When a lot of the time it's distributed from film festivals, or, if it is produced under that umbrella, it's an original voice who's just told what they can do, right?
00:32:17
Speaker
The thing is, people have these troubles fighting back against a brand.
00:32:27
Speaker
Or they try to say, oh, the good they've done outweighs the bad they could do. The problem is that a studio is a ticking time bomb. It's a business. The harm they're going to create is inevitable.
00:32:43
Speaker
And it's about mitigating that harm through proper channels, through making sure the people in charge don't go down those paths. All corporations don't have to be evil, but most of them really are, because of the way they're run.
00:32:56
Speaker
So at which point does the bad outweigh it? If you wait until it reaches that point, it's too late. We need to be thinking about it now, before it reaches that point.
00:33:11
Speaker
And that's why I wanted to do this recording. Well, for me, the reason I wanted to do it is the nihilistic pushback. Because there are these people who come at it not from the perspective of wanting to defend A24, but who say things like,
00:33:29
Speaker
oh, well, there's AI in video games when you're fighting against NPCs, so to what degree is AI even a worthy conversation, right?
00:33:41
Speaker
And these semantic arguments, this whole question of how much AI is okay and what we can permit from the people we like, need to die if we want any forward momentum in this conversation. Not between you and I; I feel like we're in agreement on that.
00:33:57
Speaker
But in order for us to have honest conversations about these AI use cases, we actually have to talk about those use cases, rather than philosophizing about when the right case will come up.
00:34:11
Speaker
Because the problem is, that's a "we won't know until we see it" or "that comes after regulation" kind of thing. We can't apply that now. It's theoretical. It doesn't exist yet.
00:34:23
Speaker
Yeah. And also, we can't imagine what the right one would be from the consumer standpoint, because at that point we're doing the work of the studios. At that point, we're market testing for them, and that's what they want.
00:34:34
Speaker
So what we need to do is literally ask for more, literally beg for more integrity from their end, to know that they won't do these things. And it's not even a tough ask. When those things get announced, it's literally just asking simple questions, because those are the things that piss off the AI people the most.
00:34:56
Speaker
Just the slightest pushback makes them foam at the mouth and freak out. Right, yeah, and that's why we need to continually be applying that pressure and pushing back. Just real quickly, before we get too far away from it: we talked about the idea that A24 fans feel like they're supporting the little guy.
00:35:17
Speaker
They're not even the little guy anymore. They were just recently valued at $3.5 billion, and in that same profile, the one about this AI division,
00:35:30
Speaker
they say they're moving away from things of a Moonlight size. Moonlight, the movie that won them the Academy Award. That would probably also include things of an Anora size, another movie that won Best Picture. They want to go toward more blockbusters.
00:35:47
Speaker
And, devoid of all this other troubling context, sure, would I be interested in what A24 blockbusters would look like? Could that diversify that space? Because I think...
00:35:59
Speaker
...there needs to be some kind of revitalization of what a blockbuster could even be. Sure. But I don't trust A24 to be able to do that, given everything that's led to this scenario. And the fact that, like, the...
00:36:17
Speaker
The kind of movies that gave them their name, they don't seem that interested in making anymore. Stuff of that size now has its own division, called A24 Platform. Friendship, which came out earlier this year, was an A24 Platform release. So why is that the subdivision, when that should be the main...
00:36:37
Speaker
...thing? That's what made the company: doing movies like that. So if that's now a side quest for them, to me that's troubling. It goes along with the whole AI thing of, then what the fuck are you guys even about? Yeah, it's a problem of brand identity at that point. Because, like you said, there are years of built-up goodwill, but at the same time, they were never really our friends.
00:37:05
Speaker
Just like Neon isn't really our friend, or Mubi, or any of those companies. They're in it for the money, the cheddar, the dollar bills. And even if something makes no money, they can write it off and it's no big deal.
00:37:17
Speaker
So the thing is, these studios present a good alternative in the sense that they present as the outsider. They come in and say, we're here to change the game, we're here to ruffle feathers and change the way things have been done.
00:37:34
Speaker
And then when they're in power, they do the exact same things. And the reality is, it's the same problem in any industry. You can't change a system from within. You can't set those precedents if you're just playing within the rules the system already pushes.
00:37:51
Speaker
And in their mind, by trying to lead the charge on this AI move, they think that's where blockbusters are going. Fat chance. I don't think a Universal film is going to try to go all AI right now.
00:38:04
Speaker
I don't think even Sony or Disney would do that. The reality is, they're the only ones who are going to push that way, because they see it as a maverick move. They see it as pushing the little guys forward. They see giving a Brady Corbet $10 million to make an animated A24 film...
00:38:25
Speaker
...they see that as the game changer itself, in a maverick way, like a Moonlight, even if the goals are different. So what I'm trying to get at is, it's still within their mission. But that mission has now been warped into something so baseline, so fed by the ultimate cost-analysis-spreadsheet style of thinking, that it's transparently evil.
00:38:56
Speaker
It's something I don't think the audience is even going to buy, because it's such a departure from what they'd given before. And the issue with mavericks and disruptors, with disrupting for the sake of disrupting, is that, in at least all the examples I can think of, you end up with a worse product at the end. Like we mentioned Netflix: they said, we're going to disrupt not just the whole moviegoing experience but also the TV business model.
00:39:25
Speaker
And you end up with a lot of just shitty slop content. And yes, they do still occasionally give good filmmakers the means to make movies, but then those movies get buried. And we now end up with a streaming landscape that's more convoluted and probably more... I mean, I was never of age when you had to pay for premium cable itself, so I don't have the bills to compare, but it seems more costly now. It doesn't make any sense when they're all charging like 20 bucks a month, right?
00:40:03
Speaker
And piracy is free. The problem here is actually the commodification of art, that's what I'd say. And it stems into many aspects of the film conversation, because, like we talked about on a previous episode, Letterboxd is destroying the film criticism sphere.
00:40:20
Speaker
And that's the same problem we see in streaming, where, in theory, you get a very large collection of smaller filmmakers who can get their films greenlit because they work with these studios and cost a lot less to make. But then they get buried, like you said.
00:40:37
Speaker
And then there's this AI thing coming up, where they're like, oh, well, we can have an original voice behind this and it won't cost as much to actually shoot. They can just make it based on their ideas.
00:40:47
Speaker
In theory, the concept is that they're giving power back to the artists: you get a little less, and you can make whatever you want, and you're good. The problem is, it's being done in the service of corporations that only care about serving you an endless platter of stuff.
00:41:05
Speaker
And the key word there is stuff. You stop seeing them as pieces of art to consume and mull over and critique.
00:41:16
Speaker
Instead, it becomes this endless conveyor belt of things to consume, just for the moment. Yeah. Well, it's slop, but even the things themselves that are made, I'd hesitate, in the streaming sense, to call slop, because they can be made by people with true perspectives who can really do something with those budgets.
00:41:38
Speaker
But the problem is that, from the audience's standpoint, they don't give it the same respect. So in the effort of commodifying the art so it's for everyone, so anyone at any scale can make anything as long as they adhere to these rules, the corporations themselves are creating the framework that destroys the industry itself.
00:41:59
Speaker
This is how art collapses. And the only way people would be able to make their own thing, to do something on their own, is literally to work outside the studio system at this point. Which we're probably going to see more of. That will be the actual maverick filmmaking: completely devoid of any studio. Just do it. Shoot it yourself.
00:42:23
Speaker
Get it out there. And I think we'll probably see interesting things from that. I don't want a full, total collapse; ideally, these studios would be able to figure out in real time how to course correct. But it's probably going to take some major...
00:42:42
Speaker
...catastrophe. It's going to need to hurt them in their wallets. And that's partly where we come in. Well, the major collapse is going to come from Hollywood itself, the place.
00:42:56
Speaker
Hollywood itself, right? They've been talking for years about how they have these studio lots that just sit empty, and nothing is being shot on those lots, because nothing is being made there, because it's too expensive.
00:43:08
Speaker
And Hollywood and LA itself has this massive financial catastrophe on the horizon from a business standpoint. And when I look at these AI talks... There's an exodus; stuff just gets filmed...
00:43:21
Speaker
...either in, you know, Georgia or other places. Yeah, places with better tax incentives, or just overseas, like Eastern Europe. A lot of stuff is just shot there. Yeah, totally. And why wouldn't they? Hollywood is not the place where these things get done anymore. Instead, it's the place where the business happens and the deals happen. Yeah.
00:43:38
Speaker
And the problem is, those deals are getting worse. They're making less money through their own stupid decisions. So when I say the collapse of the industry, I'm talking specifically about the collapse of Hollywood.
00:43:49
Speaker
And that's on the horizon, I believe, because of these mismanagements that have been happening for decades. These attempts at cinematic universes with giant A-listers.
00:43:59
Speaker
And now, with this incorporation of AI tech that could possibly fail tomorrow, it sounds to me like these people are just buying into get-rich-quick schemes and not actually planning for longevity.
00:44:12
Speaker
And that's why, personally, I've abstained from a lot of Hollywood-produced stuff. Because the way I see it, those productions are the thing that's going to destroy blockbusters on a massive level.
00:44:23
Speaker
And the only way those things can be done is when people from other cities, other parts of the world, are able to fill that gap. And we're already starting to see that. Something like Parasite winning Best Picture is not a mistake.
00:44:35
Speaker
Right. That's a reflection of the fact that America's control over the monoculture is done. And it's not even because of their waning power on the global scale, though that certainly is an element of it. It's connected, but it's not the reason. The reason is that they're literally just not putting out the hits anymore.
00:44:57
Speaker
And it's because of their poor business decisions. And the more they do that, the more they'll point the finger elsewhere, and the more we can laugh. I'll laugh, but there will also be a sadness to it, because I love movies. I know the business itself is not the same thing as the art, but I feel like the art will suffer. I mean, even if we do see cool stuff outside the system, if a total Hollywood collapse happens, that's going to... And this idea that everything and everyone in the cast needs to be an A-lister? No, that doesn't need to be the case. Everything just looks like iPhone wallpapers now. And it just annoys me, you know.
00:45:44
Speaker
And the good news about a collapse of the Hollywood system is that it allows other people to step up into the power vacuum. If something like that were to happen, I'd imagine another city in America would take on that mantle for America.
00:45:59
Speaker
Like a New York or an Atlanta, or, you know, dozens of other places I could list that have major filmmaking hubs, even Texas. There are places where those scenes could flourish.
00:46:13
Speaker
And the reason Hollywood exists and continues to flourish today is just a consolidation of power. Also, something I wanted to get back to from the front of this conversation: we were talking about how this was a piece that came from The New Yorker.
00:46:29
Speaker
We've got to talk about what the purpose of that piece was, because it was written for The New Yorker. It's like a Socratic paper, right? A lot of squishy libs and one-percenters read that paper. And a lot of the article is not so much a fear-mongering thing. It's more something that's saying, from that perspective,
00:46:49
Speaker
look at the possibilities that can happen with this. It's actually an advertisement to neoliberal filmmakers who may be adjacent to A24 and may be interested in that program.
00:47:00
Speaker
So when we read that article, we have to look at it from that perspective as well: they haven't fully sold those artists on that tech, and they're still trying to court them in. And this is how they're doing it. And besides this profile doing propaganda for them, we've also seen that they've been going around to their various filmmakers, getting a read. They're trying to read the room: hey, would you guys be into this? Yeah.
00:47:32
Speaker
Which seems weird to do after you've invested the money. Maybe read the room first and then make the decision. You just jump into the pool and then check if there's water after? Like, I'm just going to dive.
00:47:48
Speaker
Well, think about the language, because they're just parroting the language of the tech billionaires. They're going, it's the future, it's the future; if you're not with it, it's going to leave you in the past. That's the language around it.
00:48:00
Speaker
So in the dumb studio headspace, they're going, oh no, if we don't invest in this tech now, someone else is going to. Exactly, right? And then they end up being the torchbearers. They end up being the people who are actually leading the charge. So from their own perspective, they may not even fully want to commit to it.
00:48:20
Speaker
But in their minds they see it as, we have to, because this is what everyone else is doing. When in actuality, they're the ones doing it, and they're the ones setting the precedent by doing so. So that's why, again, calling these people out is so important. Because...
00:48:34
Speaker
...if they themselves are tiptoeing around it, trying to see what they can get working, and they get loudly told, hey, this is stupid, cut it out, and it affects their bottom line enough, they're going to pull back.
00:48:47
Speaker
And yeah, the word boycott has been brought up already. The thing is, decentralized boycott movements aren't effective in the traditional sense. But what they are good at is advertising, in a propaganda sense.
00:49:04
Speaker
Exactly. So even if you call for a boycott and see no good results from it, if the conversation around the topic changes after the fact, then the boycott was successful, because you've created a more informed viewing base. And you saw that with Late Night with the Devil specifically, where the movie made money on its opening weekend, but now its reputation is really tarnished.
00:49:29
Speaker
Where if you bring that movie up, people have to kind of... It does. Yeah, right? And it's not considered a classic, really, when it's something that would have been. Now it comes with that asterisk: "I really enjoyed it, I really..." But then you watch it, and, you know, it's a bad movie.
00:49:48
Speaker
It's not... great. I mean, I don't hate it, but it's pretty underwhelming. And I love David Dastmalchian, and I was kind of wanting him to have a vehicle to show off his talents, but that it happened to be in that movie... No. Yeah. So I agree that even if it's just spread-out pockets of people saying they're not going to watch things, is that going to actually immediately affect A24's bottom line? No, but...
00:50:21
Speaker
...it moves the conversation in the right direction. And it can give a sense... I feel like there are still filmmakers and talent on the fence who maybe don't fully understand... even though, well, not the directors, who didn't strike, but even the actors struck over this, so they should understand what they're up against. But some of them, I feel, are still kind of like, well, I don't know. Are people into this? Is this a thing people want? Yeah.
00:50:49
Speaker
So if they see more and more that, no, it isn't, I feel like that can create some kind of momentum. I mean, there's already been... Ari Aster has been like, no, not into AI at all. He's referred to the way that people...
00:51:08
Speaker
...view it, the way they talk about it, like it's a god or something. And it is true, that is how those tech guys talk about it. And I think it was Daniel Kwan, one of the directors of Everything Everywhere All at Once, who is also hard-line, saying it has no place in the industry. And Everything Everywhere won Best Picture. Did they win Best Director that year? I think they did. They did.
00:51:37
Speaker
Yeah. So that's Academy Award-winning talent saying no to this. We need more of that. And I think the way to encourage that is from our end. We can't let this blow over, because not just online but in general, with the way news cycles work, things seem to blow over so quickly.
00:52:03
Speaker
Because there's just so much bullshit happening that it's hard to stay focused on any one thing. But to me, this is just so important because film is, you know, something I live and breathe. So I don't want this art form to be bastardized just for the sake of...
00:52:26
Speaker
Like we said, just some get-rich-quick schemes. And that's not art. Well, the biggest pushback to this conversation that anyone could bring is just a principled response, right? Like principled ideology and making sure it's consistent, right?
00:52:43
Speaker
And the the the good thing about being principled is that when people can tell that you mean it, it comes off as really cool, as really rad, right? When you actually stand for something and you don't care, like, and you're consistent, right?
00:52:57
Speaker
ah Other people can feel that passion and they join you along with it, right? And that's why these AI conversations are so interesting every time they come up, right? Because even though, like you said, you know, there's this endless onslaught of just bad news, know, like, JD Vance did what? You know, all that stuff, right?
00:53:13
Speaker
But when the consistent conversation around something like AI is, just every time it's brought up, someone goes, wow, this sucks, right? That immediately is doing everything you would ever want, you know? Like that is the perfect propaganda against it, right? And as long as that message stays consistent, as long as there aren't people who get wrapped up in it and they're like...
00:53:37
Speaker
I was a skeptic before, but this changed my mind because it was made by Terrence Malick, right? Like that's that's the thing that needs to be consistent, right? It's like you have to leave your perspective of an artist at the door and you have to recognize that an artist themselves are capable of real world harm in a political active sense, right?
00:53:57
Speaker
Because again, that's what this is. And when it comes to this... yeah, you go ahead. Yeah, no, I agree with all that. We need to stay consistent, even if it's our faves doing it, and keep our eye on the prize. The main thing is, yes, I do see the tide turning. I know online isn't fully representative of everyone, but the comment sections are increasingly like, I don't want this. It looks bad for AI stuff.
00:54:30
Speaker
But I feel like people can be easily swayed by a shiny new toy, and there's been this long-dangled promise of, well, this will give you, the viewer, the storytelling power: you want to see this version of this, you can do it with AI. If there...
00:54:52
Speaker
...becomes an on-demand version of that on Netflix, people will start playing with it and posting their versions, like, hey, this is my Stranger Things ending. But hopefully that's just a fad, like people play with it for a bit, the way when there's a new face swap app people post that but then move on. So I really want to believe that that's not going to be enough to actually...
00:55:19
Speaker
...distract people. But I don't know, people can surprise you with how dumb they are. So, yeah, I hope that's not the case. I mean, I remember there was some quote by Ben Affleck, because all these conversations are just about the potential, never about what the technology can actually do now. And he was saying, like, yeah, I don't know, the idea of, like, after I watched Succession, oh, well, what if there's an ending where...
00:55:46
Speaker
...you know, Kendall goes off with Stewy at the end or something. I can then see that with AI. It's funny that that's the version of Succession that's his what-if. He's like, I really want Kendall and his bro to be okay.
00:56:02
Speaker
They're good. They live happily ever after. That sounds horrible. I hope no one's actually swayed by that. Like I said, I think it's just a shiny toy. But I don't know, some people do seem like they can get distracted by keys jangling in their face. So I'm just saying we need to stay strong and focus on what the end goal is here.
00:56:33
Speaker
Doug, I've got a question for you. Have you ever met a person on the street and asked them what their favorite book was, and they gave you a choose-your-own-adventure novel? No. Right. And no one's favorite episode of Black Mirror is Bandersnatch, right? That's a fun episode, though, but it's not my favorite. No.
00:56:51
Speaker
No. Right. But the the the thing, like you said, it was fun. Right. You toyed around with that. Right. It existed as a platform more than a project. Right. And you saw the possibilities.
00:57:04
Speaker
The thing with that project was that there weren't more of them, right? There are not, like, Wednesday versions of Bandersnatch, right? If that format was so successful, we would have seen more of that.
00:57:14
Speaker
Right. And this goes into this A.I. conversation where it's like, imagine you're a blue collar worker, Joe Schmo. You come home from work. Right. The last thing you want to do to watch your entertainment is type it in.
00:57:27
Speaker
Make choices there. Yeah, it's already hard enough to choose the thing to watch. So then if you're going to tell me, like, oh, I clicked on this program, now I need to tell you the story I want to see? No, make the fucking thing.
00:57:42
Speaker
That's what I pay you for. Yeah. when when when When the layman comes home from work, they they turn on their box, they call it, right? And they they flip through a bunch of posters, right?
00:57:52
Speaker
They stop when a celebrity they enjoy appears, because they liked them in high school. Ooh, Ryan Reynolds. Okay. Yeah. There you go, right? Boom, they're done, right? And that's the thought process that goes into it, right?
00:58:05
Speaker
When you sell somebody on this idea that you have complete control over your entertainment, right? The problem is that they're not going to take you up on that offer. They don't want complete control. Audiences literally don't know what they want until they see it, right? And that extends into this conversation of AI creation in general.
00:58:23
Speaker
Because while many people see themselves as like, amateur film critics or amateur like film executives in the conversation, right? If you give them the tools to create something, they may not actually know where to start because they don't understand what the creative process entails, right?
00:58:38
Speaker
And this goes into that conversation where it's like they imagine that they want to make this project, they make it, and then they watch it and they go, well, that kind of sucked. That was a bad idea. Do you think they're going to feel incentivized to make another project?
00:58:50
Speaker
No, they're going to go back to Die Hard, because Die Hard's awesome every time you watch it, right? So just from thinking it out logically, the value doesn't make any sense. It only makes sense from the seller's perspective that audiences would want that.
00:59:11
Speaker
But from a consumer standpoint, there just is no actual utility for a tool like that. There's no utility, and it just seems very dubious that that, like, a workable...
00:59:22
Speaker
...consistent version of that would even be usable anytime soon. Like, you talked about how the ChatGPT upgrade was underwhelming, and I think this stuff is so much more basic than they're trying to say. Because like I said, they're selling it so much on the potential. Last month there was a tech guy on 60 Minutes talking about, oh yeah, within, I don't know, 10, 15 years, we could cure cancer because of the calculations AI could do, or some bullshit. It can't even do hands.
01:00:03
Speaker
It can't even help distinguish between the foreground and the background in an image for a Wicked: For Good poster. So why are you telling me that it's going to cure diseases? I just don't believe you. I don't believe you.
01:00:18
Speaker
No, they're lying. Right. And then also, on the note of just AI films, right: watching an AI short film, why are they always montages? Why are they always like slideshows? Right.
01:00:31
Speaker
And it's because the tech is just not there for them to do, like, conversational sequences, right? You can't base a film on a traditional structure of shot-reverse-shot around AI at this point, because literally the program is guessing what each image is going to look like. So if you were going to make a shot-reverse-shot in AI, you'd get wildly different perspectives just based on what the AI in the moment creates, right?
01:00:56
Speaker
So that's where a lot of that ironing out would have to come in from that, you know, creative lead that's overseeing this, right? And then when you see what these short films are, they are these uninspired montages where a voiceover narration plays over a bunch of, quote, "well-filmed" shots.
01:01:13
Speaker
And the problem is that with the filmmaking itself, when you're watching a cool sequence and it's shot really well, there's a metatextual layer for the audience where they go, wow, I didn't know the camera could do that, right?
01:01:25
Speaker
If you know in your mind that the camera doesn't exist, if you know that the camera can just float wherever it wants because it was made in a machine, it's a lot less impressive. It's like video games like God of War, right, where the entire game is shot as if it's one unbroken take, a oner, right?
01:01:42
Speaker
The camera never cuts is the idea there, right? That is impressive from an immersion standpoint, but at the same time, it doesn't add anything from a storytelling standpoint, right? So that's just an aesthetic thing, and it's actually something that detaches you from the reality of the world rather than immersing you in it.
01:02:00
Speaker
And that's what I'm getting at here, where it's like, it doesn't matter what the camera can do with AI, because of the format itself, there is an inherent like detachment that the audience is always going to have to wrestle with.
01:02:11
Speaker
And that's something that the people who are in charge of this tech just aren't coming to terms with. Yeah, I agree, and for many reasons. Because at the end of the day, AI is just an overhyped plagiarism machine. It can't think for itself. It can't create anything new.
01:02:30
Speaker
And that's why so many initial AI things were like, oh, X thing but as, like, medieval fantasy. It's just remixing things someone actually put the time in to do. Like, I actually have seen someone do an edit like that by hand, and I'm like, hey, this was made by a person, you know, fueled by this person's passion and the hours they put into editing it.
01:02:59
Speaker
And that alone gives it more inherent value than any AI thing could ever have. Totally, right? Like, on that note, right? Like, I think one of the best edits I've ever seen was, like, they took Breaking Bad and they put it in Mario Kart, right?
01:03:14
Speaker
Oh, on Rainbow Road? it like, the shots were... Yeah, exactly. Yeah. And like, it's so well done. Right. And it's just it's through simple like editing techniques. I could tell you what they were, but it would be far too boring.
01:03:26
Speaker
And the thing is, that itself, the creation of that, has more artistic merit than any AI project. Because even from the audience's standpoint, like you had just said, right, with that Pattinson Batman and Corenswet Superman, right?
01:03:41
Speaker
You're watching that and you go like, There's a character to that. There's a charm to that. And there's a human fingerprint on that. And also just on the note of like AI creation in its merits in that sense, right?
01:03:53
Speaker
How many fake movie trailers are there for, like, Adam Sandler in The Simpsons, right? As Homer, right? And that kind of sucks, right? But then there's that classic example of Homer singing Born Slippy from Trainspotting, and he's surrounded by all of those 3D models dancing, right?
01:04:12
Speaker
The reason that people like that clip, where it's literally just vocoded Homer singing with preset dance movements, is that everything about it is so artificial, from the timing of the edits to the dancing of the people in the crowd to the singing of Homer.
01:04:30
Speaker
The reason that people like that edit is because it is so inherently robotic, because it is of a machine and it doesn't try to hide that, right? You then compare that to the AI Simpsons thing, right?
01:04:42
Speaker
And people are grotesquely opposed to that, because immediately you can tell something's wrong. The uncanny valley is certainly in effect when you see, like, Homer-ish with Adam Sandler's face and he's going, in a robotic voice, oh, forgot to take Maggie to daycare, right? Because there's that artificiality and it's trying to pass itself off as reality, it's never going to reach that level, because of the inherent format. Even if the eye cannot catch the difference, even if you just see it as reality...
01:05:16
Speaker
The nature of its construction is so robotic, and you can see the wires to such a degree that it's alienating. Yeah, people will just intrinsically, instinctually feel off watching it, even if they can't clock that it, you know, wasn't made by a person. And that's just...
01:05:34
Speaker
...that's just an intrinsic failing of it. Because they're so obsessed with chasing these recreations of reality. I'm not saying that pushing the tech in a different direction would be better, but they've pigeonholed themselves by being like, no, look what we can recreate with this. It's like, why is that your end goal anyway? I mean, reality, you know, sucks. Why are you trying to recreate that?
01:06:05
Speaker
Why is that? Why is that? That shouldn't be an our artistic end goal, you know, like yeah the art in film can surpass what's capable in in reality. That's like the magic of it.
01:06:17
Speaker
Well, this is a great way to segue into those recently viral AI videos where it's like people from the 80s or 90s talking about, oh, well, things were much better back then because you had to live in the moment, right?
01:06:31
Speaker
And suspiciously, everybody is really white, you know, and they're just talking about getting back to some kind of greatness, right? And then it all starts to click into place. Propaganda. Yeah. That's just that RETVRN mindset, return with a V, trying to capitalize on this tech. They're creating a vision of a time that never existed.
01:06:58
Speaker
I mean, they're saying it's the 90s or the 80s or the 1950s, whatever decade of nostalgia. But what you're seeing is not actually that decade, because it was never the way that people have envisioned any of this. They're just trying to sell this idea of, you know...
01:07:16
Speaker
...a white utopia. You know, that's what they want. They want a white ethnostate and, you know, to suppress people's rights. But they want to put it in a package of being some wholesome thing that we strayed from, like, ah, things were so simple. If you went back to the 80s and you were like, here is my phone, and look at all the porn that you can just get like this, so quick, you know, all of those people looking directly into the camera saying, come back, they're going to change their tune real quick.
01:07:43
Speaker
You know, they're like, well, take me to the future. You know, let's invoke Woody Allen. That's the whole conversation in Midnight in Paris, right? The idea that your idea of what the past is, or even what reality is, right? It's not going to be the same as your experiences within the time that you grew up, because you yourself are a product of the environment that you are brought up within.
01:08:05
Speaker
Bring that back to AI. Right. Right. Like, yeah even if let's say we're talking about AI from the concept of, ok let's do a a scene in a mansion and it's spooky, right?
01:08:17
Speaker
Like the mansion itself will feel unreal because it doesn't like, it only has the idea of a mansion rather than what an actual mansion feels like. It'll have the idea of what ah the mansion lighting should be, but that's not exactly what it's like, right?
01:08:30
Speaker
And then they'll have the humans, what their idea of those humans and what they should look like, and it's not exactly quite there. The problem keeps going even if the tech gets one-to-one. The problem is there still needs to be something that's filtering all of this through.
01:08:43
Speaker
It's a prediction of an abstraction. Because, I mean, films are already dreams. You know, they're not real. Even a documentary, which is showing you some real event or people, is a narrative that that filmmaker has put in that context. So to then keep adding layers of removal from it, that's just going to be hollow. When you go and look for movies that are related to other movies, right?
01:09:15
Speaker
You look for things that feel similar, but aren't one-to-one copies, right? Like, let's say you really like Blue Velvet, right? You're not going to then go and watch a bunch of Blue Velvet knockoffs, Right. You're going to watch like neo-noirs that come from different perspectives that have their own things to say, but you still like them because there is some either thematic, stylistic echo that comes from Blue Velvet or at least it exists within the same grammar. Right.
01:09:41
Speaker
AI can't do that because it is a literal copy. Right. So it's going to attempt to recreate those things, you know, like um a fake Mona Lisa rather than like, you know, and an entirely new painting.
01:09:54
Speaker
And that's where the inherent flaw comes in, right? Yeah. I mean, we're saying the same thing. And I just hope everyone walks away from this episode... You know, I don't need everyone to be yelling. I'm not Howard Beale in Network. I'm not saying go to your window and yell, I'm mad as hell and I'm not going to take it anymore. Although it might feel pretty good.
01:10:20
Speaker
Maybe try it. Maybe your window. Maybe it would be good. One thing I will say, not from Network but from The Irishman, Al Pacino: solidarity. We need solidarity in this conversation. And that's something that I see missing often in this AI conversation, especially within the film community. Even if we're not all, you know, boycotting or skipping every A24 movie, I think we need to be together on how unacceptable this is. Because if we start shrugging and letting some of it slide, like, well, this example is all right, or this one wasn't that bad, and we've already had this and that, then we've already lost, if that's going to be the mindset.
01:11:17
Speaker
that that That is the the major crux and flaw I see from ah people online is they let their nihilism get in the way of these conversations where they... they The thing is, is that people think that they are powerless, but they're not, right?
01:11:31
Speaker
They have only been conditioned into thinking they're powerless because of the system that's overhead, right? And the the reality is, is that if we live in a society that's entirely ah propagated by corporate interests and the dollar is the bottom line, right? You need to do everything in your power to make sure they make as little money as possible.
01:11:48
Speaker
Right. And in doing that, you need to make sure that we are aligned all in the same goals of making sure that everyone hates the shit. You know, a simple A24 boycott. Right.
01:11:58
Speaker
Is a great way to start. Right. But we need to make sure that that principle carries over to every filmmaker, to every studio, because, again, it's not about the individual being the problem. It's about the greater systemic issue being a thing.
01:12:12
Speaker
Better Man. I loved Better Man when that movie came out. But I stopped talking about it the moment that I heard that AI was used in the creation of the voices, right? And that's the thing. I still have never seen Better Man. I didn't know that about the voices in there.
01:12:29
Speaker
Yeah. And that's the thing. It's not always like Late Night with the Devil, which we all talked about. Some of these sneak by. I did not see anything about it, and people love that movie. As many times as I've seen that one song posted on my timeline, do you think I would have heard people bringing up, hey, they used AI voices, what the fuck? And the problem is that most people in film discussion places would rather take the "well, what are we going to do about it" position, because it affirms their comfortability with no action, right?
01:13:07
Speaker
And the reality is, is like all they need to do is take the extra step and say, I reject this, right? Because the moment that you say, I reject this, right? That itself is a stance and you still did nothing.

Criticism and Media Literacy in AI Discourse

01:13:17
Speaker
You still just typed it out on your phone, right? And you can still like individual artists or things. I'm not even saying you can't like a Better Man. But if you actually believe that this is bad, you should be saying so every time you see it, calling it out. Like you said, that requires nothing. It's just a consistency of character, really, that we're asking for here.
01:13:48
Speaker
And then ah the conversation extends into talking to other people who are fighting against AI, right? If you are spending your time entering conversations where somebody is talking about how bad AI is and you're telling them to calm down, you're telling them to like, you know, I think you've gone a little too far in this conversation, right?
01:14:08
Speaker
What are you doing? You're helping the corporate interest in saying that, right? You have now moved the conversation into a different realm and you are not reckoning with what the original sentiment was.
01:14:21
Speaker
And I think that when people recognize that that urge in their brain to play devil's advocate can sometimes feed the exact forces that they are supposedly against, that's the moment people will have, I'm sorry to use this term, better media literacy. Well, that's just missing, and it's something we need. You know, to really drive home that point and everything we've been talking about — maybe I just have this movie on my mind because I was listening to another podcast about it, but I also think it's relevant, since it's Labor Day, or around Labor Day, when people are going to be listening to this.
01:15:00
Speaker
this This movie and filmmaker are definitely ah concerned with ah systems of capital and and labor. and in In the film, They Live, ah the aliens use the money to control the humans. And that's it's like...
01:15:16
Speaker
There's literally just a shot where, you know, when he has the glasses on, he looks at a dollar bill and it says, "This is your God." But the most insidious thing in that movie are the human collaborators, who always fall back on the excuse of, like...
01:15:33
Speaker
well, we're not going to get rid of them. We're not beating them. Like, so why, why can't I just live comfortably here? Like that, like they're willing to kill and betray other humans just so they can maintain their status quo.
01:15:47
Speaker
And, and, uh, that's, that's what you're doing when you're saying, when you're telling people who are anti AI to calm down, you're fucking, you're a fucking collaborator, man.
01:16:00
Speaker
and then Well, also, let's take that a step further. Right. Because like sometimes people in those positions, they either see themselves already as or are attempting to be like serious film critics or journalists. Right.
01:16:13
Speaker
And in their minds, they see speaking out on this issue as perhaps getting in the way of an opportunity down the line, right? And at that point, you're just a collaborator, like you said, right?
01:16:24
Speaker
You're staying silent because you don't know if somebody you admire personally is going to be in a position where they're doing something that you yourself are at odds with, right? And if, in that position, you say, okay, well, I'm going to let this slide because I really like X filmmaker and I want to have the scoop, or I want to support their film even if they've done this thing I personally disagree with...
01:16:46
Speaker
At that point, you have become an extension of that capitalist machine. You have just fed into the corporate interest of marketing, further flattening this whole conversation and this whole ecosystem, right?
01:17:00
Speaker
So that's why principled stances are important here. If you just take an absolute "this sucks" approach, you will raise those other artists to your level.
01:17:13
Speaker
You don't have to sink to their level. You don't have to play in the mud with these people because you want to get something out of them, right? Get them to race to your level and you'll actually improve their art. You'll improve the conversations around it, and everyone will win from that. That is a no-bad-outcome scenario.
01:17:31
Speaker
Yeah. I mean, what else do you want? Don't you want the best art possible and the best environment for that kind of art to thrive? If so, then this is what it requires. And it's really just...
01:17:47
Speaker
It's just asking for the minimum. I'm not asking you to go throw Molotov cocktails through A24's windows. Yeah, maybe that time will come. I'm not, you know, condoning any of the violence at this time. But, you know...
01:18:04
Speaker
My Antifa membership is in the mail. You know, they're on a jet on their way here, sending me some supplies. Where did you think they came from? Canada. That's where the jet came from. Ah, now I understand.
01:18:18
Speaker
The deep money interests of Canada. This goes back to all these other political conversations in these films we've been having. It's the solid gold Magikarp. These companies want us to turn on each other, waste time doing that, and meanwhile the house always wins, you know? They just get to sit back and accept the status quo that benefits them. Well, hey, we don't have to take that.
01:18:48
Speaker
Even if it's just saying, I don't like this. Well, let's be honest about this conversation, right? The reason this A24 article got traction was because I quote tweeted it.
01:18:59
Speaker
because I called it stinky and dumb and it got a bunch of likes, right? And it was on Twitter for six hours at that point. And it had like barely any, like it had like 40 likes on the tweet when I had done that. It had like...
01:19:14
Speaker
...three retweets, right? And the thing is, on that same day, everyone was arguing about Eddington. Everyone was talking about how all of these things were wrong with Eddington and stuff.
01:19:26
Speaker
And it's like that, as you said, is what that film is about. Like the problem is, is that we are getting into these semantic arguments about like who is worthy, who is who is the king of filmmaking. Right. Is this person a secret fraud?
01:19:39
Speaker
Right. And the reason that I don't like that kind of language, the reason I don't like those conversations, is because it's just finger pointing in the wrong directions. I don't think that pointing out that Ari Aster is secretly a bad filmmaker is productive when you have the industry collapsing due to corporate concerns. It's like, what are we doing here? At that point, you're just working for A24. You're going, yeah, this guy deserves less, right? And totally supporting their end goal, right?
01:20:10
Speaker
We have to remember the artist needs to be protected, even if they're an extremely privileged one, right? We need to make sure that an artist has the means by which they can tell their stories.
01:20:23
Speaker
And if someone like an Ari Aster is one of the few who is making things, the reason we protect the him is so others can also get their projects made. If we just keep on denigrating those figures every time they come up, we're just going to leave the studios with all the power.
01:20:37
Speaker
So like. We really got to think about what our actions are doing here, folks, because like when we are just like talking about these people as endless frauds every time they make something new, every time there's an original filmmaker, guess what?
01:20:48
Speaker
You're just helping the studio. No, I actually think that Oz Perkins is the problem. People can't see my face right now, but I'm kind of rolling my eyes and making a jerk-off motion, because I saw someone complaining that, you know, filmmakers like John Waters or Todd Haynes have trouble getting films made, but meanwhile Oz Perkins just...
01:21:16
Speaker
...blows his nose and another movie comes out. And it's like, yeah, because he makes horror films that cost nothing. His movies are cheap; they can consistently make a profit. Those other directors are not in that genre, and those movies are harder to fund. I mean, yes, there are other external problems. It doesn't help when you've got Joaquin Phoenix walking off your detective movie. But hey, that movie may be a go now; maybe Pedro Pascal has revived that one. So those are examples of people who do use their status to push those things over the line, which is heartening. And we should be doing everything we can to help those kinds of movies and those filmmakers, even from our perspective as the viewer. I'm not saying you have to like all their movies. You can not like Oz Perkins films. That's fine. You can not like any movie. You can not like every movie you've ever seen. I mean, I may have questions about what you even want from a movie if you're just like, these are all bad.
01:22:23
Speaker
I've never seen anything good. But to just act like he is the problem, instead of the environment in which these films struggle to thrive and get made, which is not caused by anything Oz is doing. If you wanted to talk about what Oz Perkins is doing as a bad thing, you would talk about his upcoming film that he shot in Canada during the writers' strike, right?
01:22:48
Speaker
Like that is a form of scabbing, right? What he did. And the thing is, is that... ah He gets a pass because he had these passes from these studios. But like, what did he do? he just scabbed. Right. And he got permission to do it. Right.
01:23:01
Speaker
And the thing is, is that ah when you base your fraud watch complaints around ah like subjective elements of a film, your footing is rocky, you know, because you're assuming that other people view your opinion as objective fact.
01:23:15
Speaker
Right. So when you get into these arguments of, you know, Oz Perkins sucks because of The Monkey, that doesn't hold as much weight as Oz Perkins sucks because he made a movie in Canada to subvert the writers' strike, you know?
01:23:28
Speaker
Yeah, but that's an actual issue. Yeah. Yeah. And I think that so many people, ah there they they have trained themselves into equating like their own opinions on movies as like objective reality.

Focusing Criticism on the Industry

01:23:43
Speaker
And that's the thing we have to nip in the bud. If we want to put people on fraud watch, if we want to call people hacks and stuff, let's look at their actions. Let's look at who they're working with. Let's look at who's funding their projects, right?
01:23:56
Speaker
Because like you said, you could hate all of those movies that Oz Perkins made, or Robert Eggers, or what have you, right? But when you look at the grander scheme of what is causing the problems in the industry, they're small potatoes.
01:24:12
Speaker
they if If you see them as bad, they are just a reflection of the industry that they were born within. So why not go after the people who made the industry that way, rather than the people who are products of that and industry? Yeah.
01:24:24
Speaker
Or just trying to get what they can through that, you know, broken and flawed system. I don't know how you can fault them for that.
01:24:34
Speaker
And while we're on the topic of filmmakers and these labor issues, I do want to make a... I think, you know, when we recorded the Weapons episode, we were talking about the Park Chan-wook scabbing thing, and like, what was the deal with that?

Controversies and Complexities in Filmmaking

01:24:50
Speaker
I mean, the facts weren't all out at the time. And, as the Dude would say, new shit has come to light. That said, it's a fuzzier thing, because he is...
01:25:04
Speaker
...a writer but also a director, and both of those guilds have different stipulations about what constitutes what. So I think what had happened was, wasn't he just, like, editing The Sympathizer? Like, it had already been shot or something.
01:25:21
Speaker
And by DGA guidelines, that's fine, doing post, like editing stuff. But by WGA rules, that's not only not allowed, it's banned. You're fucking out.
01:25:36
Speaker
You're done. You're done. like the The conversation we had was funny. like I stand by when I said, you know, the people don't need to sympathize that quickly, you know? You could have taken a minute minute to wait, you know? Right. but But ah that's speaking to like the absurd like ways that these rules overlap, and he understood that he broke the rule, and he he took the the firing from the position.
01:26:00
Speaker
Right? and And you know what? the The thing is, is like he did all he could in that position, right? ah the The problem, the rub in in all of this is literally that push to create, right? Like this idea that these things have to come out right away. They need to be like in front of everyone's eyeballs as soon as possible. And like, even if he had the best of intentions in mind, you know, there is a strike going on. He didn't need to be editing and that. and He could have shown solidarity by not working on it, right?
01:26:28
Speaker
So, like, people immediately excusing him, being like, oh yeah, well, he was within his rights to do that. At the same time, he didn't have to be working on it. He didn't have to be pushing that across the finish line when there was nobody else working on their projects, right?
01:26:44
Speaker
Yeah, like my end goal is not to say that Park Chan-wook is a scab or anything. I'm just saying, A, the rules need to be more clear, and B, the artists themselves sometimes feel emboldened to push forward when they don't need to, you know? And that was a case where they didn't need to. They could have just waited and they would have been fine.
01:27:06
Speaker
Yeah, he couldve he could have waited. i mean, that's, I mean, he did, to his knowledge, it seems like he didn't even understand that he was violating anything at at that time.
01:27:17
Speaker
So, you know, like, it's, yeah I can't fault him for for that. I also do feel like there is a mentality of, like, a lot of these guys just need to be working, like, on something. like Like, I think about, like, what the fuck did...
01:27:31
Speaker
Tom Cruise do when when whenever a production shuts down. Like, does he not, I don't think he, like, lives when a movie is not being made or he's not talking about a movie being made. though like, the does he just, like, power down, kind of like a robot?
01:27:47
Speaker
He goes to a docking station and they're kind of like, for they're like charging stations. That's why there's Scientology churches in all major cities in case Tom Cruise comes in. Yeah, because he goes everywhere so he needs to be able to charge no matter what city he's in Precisely. Especially Tel Aviv.
01:28:02
Speaker
Hey, he... I don't know what I was even going to say. The last Mission: Impossible said we need to disarm Israel. They showed those nukes. They were like, hey.
01:28:13
Speaker
That's true. Let's get rid of them. Which is funny, because that's a thing no one acknowledges, since they're not supposed to have nukes. I feel like people don't know that officially it's not a thing, but we all know they have them. Why isn't Iran allowed to have nukes? That's a topic for another podcast.
01:28:37
Speaker
We're going to solve the Middle East on one of these podcasts. I don't know which movie will do it under the pretense of discussing... What's that? Three Kings. Yeah, I was thinking Three Kings, or probably another George Clooney Middle East movie. What's the... Oh, yeah. Syriana.
01:28:54
Speaker
Yeah, we could do that one. You know what one I always think about? The Kingdom, with Jamie Foxx. You ever see that one? I saw it, but I couldn't tell you much other than I remember it being a Peter Berg movie, so the action and everything in it was Bergy as fuck. But if I had to make a guess, I would say the politics of that movie are probably not great. If I were to revisit it, you know, it's probably not great in that regard.
01:29:30
Speaker
It used to be like a really dumb, fun action movie I would throw on when I was bored. But the older I got, the more I was like, wait a second, this is total bullshit. Yeah, like, wait, what's going on here?
01:29:40
Speaker
What am I polluting my brain with? Oh, no. Yeah, so another thing that should be pointed out is that some people in this conversation feel as though what they consume is a marker of their values.
01:29:58
Speaker
You know, some people approach it like, I only watch good movies, right? Therefore, if I watch a good movie by someone who used AI, I'm just being objective, right?
01:30:11
Speaker
And I think that that is another thought process people need to kill off in their brain, you know? Because at that point, like we've said countless times by now, they're just lending that merit to the film, right?
01:30:24
Speaker
But there is that instinct where people think that all ideas are equal, that all ideas are worth considering, right? But the reality is, that's just simply not true. Yeah, I agree with all that. And even though the overall thesis of this discussion is, hey, let's make a principled stand on this, I also would say to a lot of film fans listening: don't take yourself so seriously. Maybe every movie you watch doesn't need to be good. Watch some bad movies.
01:30:52
Speaker
Who gives a shit? That doesn't have to be a thing where you're like, oh no, this wasn't a five-star masterpiece. Oh no, I accidentally watched something that wasn't kino. I need to, you know, atone.
01:31:07
Speaker
That's fine. You know, it's you could slop it up every now and then. Like, I'll roll around some mud. It's fine. You don't have to watch the the John Ford Westerns again. You know, you've already seen them, right? You can watch some bullshit every once in a while, right? And the thing is, is that is what it takes to be a well-rounded viewer of movies, right? To to get in the mud, as you said, to enjoy that crap.
01:31:32
Speaker
Okay. I think that makes you more well-rounded in terms of like understanding what makes a good movie as well as what makes a bad movie, right? And then also like, like you said, you don't have to take yourself so seriously, right? If everything you watch is somebody's favorite movie, right?
01:31:47
Speaker
Then you stop interacting with all of them as precious, right? You stop perceiving them all as these special pieces of art, but rather, all of these are, quote, "special" pieces of art that I'm consuming, right?
01:32:01
Speaker
So you need to have a delineation point. You need to have a point where you can say, I know that I enjoy this for these reasons, and it's good. And I also enjoy this for these reasons, and I know it's bad, right?
01:32:14
Speaker
It's okay to have that distinction and to enjoy both of those things, right? And I'm not saying this because I think that people have problems doing that.
01:32:27
Speaker
But rather, I think that some people put too much importance on when they are watching great films, right? In the sense that they are going, well, i ah because I consumed this, I am doing my part.
01:32:39
Speaker
You know, some people approach it that way, like that's inherently more important than watching, say, a Jason Statham movie. And it's like, no, those are both films. And especially with Jason Statham, you might miss out on some secret masterpieces if all you do is watch acclaimed stuff. You might not know that Crank and Crank: High Voltage are actually masterpieces. That's avant-garde filmmaking, you know?
01:33:10
Speaker
and There is daring, inventive filmmaking in the Crank series, right? Like, it's because it's a those are filmmakers who are in a like have no bounds in what they can put on the screen, right?
01:33:21
Speaker
And that's what it means to break the rules, à la Crank, right? Rather than breaking labor rules or, you know, creating an unsafe set environment, right? That's renegade filmmaking.
01:33:36
Speaker
Exactly, that is renegade filmmaking. And there's a reason why I say, you know, an unsafe set and AI are of the same cloth: because they represent the same existential fear, right?
01:33:49
Speaker
It's taking advantage in a, you know, workers' rights sense, you know? And that's what this conversation should always be centered around, right? And that's why a lack of solidarity, arguing over whose way of approaching this is the right one, and all that stuff...
01:34:05
Speaker
Everyone's got their part to play. As long as you're not fighting against each other, you know, trying to make these grandstandy "I'm doing it the right way" claims, or in the inverse, saying, I don't like the way that you're doing this, right?
01:34:19
Speaker
And you are, um you know, not being productive, you know. that That's infighting in a way that just doesn't move the conversation forward in any meaningful way. Be specific in what use cases you're talking about.
01:34:31
Speaker
Be specific in why you think that's wrong. Continue the conversation, even if it's just in terms of saying why it sucks. That's all we're asking. I mean, it's Labor Day weekend, and this all comes down to labor. None of these feats of, you know...
01:34:50
Speaker
of of artistry would be ah possible without the labor put in by everyone involved. And like, that's like, that's why this issue matters because all those people matter.

Artistic Insights into Labor Issues

01:35:03
Speaker
And if you still don't understand labor issues, you could watch Michael Mann's Thief. That'll tell you everything you need to know. Watch The Insider. That'll help too.
01:35:14
Speaker
Yeah, just watch Michael Mann in general. He gets it. He comes at it from a way that's subjectively cool, but also objectively cool. So you can get radicalized just sitting there in your sunglasses, smoking a big stogie, like, hey, what if Marxism also was neon-lit? And I'm like, hey, that sounds fucking cool.
01:35:34
Speaker
Now you're speaking my language, buddy. All right. Well, I hope everyone got something out of this conversation and you enjoyed hearing us yell at the clouds, the AI clouds.
01:35:49
Speaker
One thing before we end this episode: Free Uncle Runk. Runk. Freedom. You know, when I started that other side feed, I was like, yeah, that could be where all the major film Twitter discourse goes. I don't know.
01:36:10
Speaker
A regular person shouldn't even need to know any of this stuff. No one needs to know about Weapons being transphobic, right? But there's this account, Uncle Runk, and he's a filmmaker himself. So he's coming at this from a very sincere place of, this is unacceptable and we should all be uniting against this. And I really don't know what he said that pissed people off other than...
01:36:44
Speaker
...you know, pointing out people's apathy. Because there were a lot of responses to a post he made, like, yeah, there's all these articles about Late Night with the Devil, but no one's talking about the A24 AI stuff. And some people are like, well, that's a problem with your timeline, or, no one's talking about it? I've seen people talking about it.
01:37:04
Speaker
I think he's talking about, that's just a disingenuous reply because like, ah he's clearly talking about in terms of like the press about it. Because like that, we there's one article from this New Yorker profile, which is basically a puff piece.
01:37:17
Speaker
And how many actual articles are there from, you know, the filmgoing press? There are actual critics and people who write about this stuff on Twitter, and I think he's just asking, hey, is anyone going to say anything about any of this? I don't think he just means that individually we should give a shit about this, but also that the alarm needs to be raised at all levels. And that includes, like, this platform I have. We both have some kind of following. I think that's something that we could use for some kind of good, if it does something, or just makes people more aware of this thing.
01:38:04
Speaker
i had I don't see how anyone can have any objection to it unless you're a goddamn collaborator. That's all I guess. You're exactly right on that. Right. And I'm glad that you brought up the fact that Runk is a filmmaker himself. Right. So when he was faced with that apathy from other people. Right.
01:38:19
Speaker
They weren't just being apathetic to his talking points. They were being apathetic to his livelihood. They were saying, I don't care about this or what you have to say because of the way you've said it, rather than coming to terms with what his message was, right?
01:38:34
Speaker
they they They were ignoring the message to just talk about aesthetic differences, right? To talk about like things that ah they could, like easy punching bag points, right?
01:38:45
Speaker
Rather than coming to terms with what he's actually saying. And it's because it's it's coming from a place of intellectual superiority in a conversation rather than, ah you know, again, talking about the actual problem, right?
01:38:56
Speaker
And that's a fundamental issue with film Twitter. That's a fundamental issue that we need to leave in the past, if anyone listening to this is for whatever reason taking what we have to say seriously. Because the more that we lash out at smaller voices like a Runk, you know, who is literally as indie as you get, I guess, in that term, right?
01:39:17
Speaker
Right. If you're going to get mad at him for telling you to suck a dick because you're not listening to him. You're a collaborator at that point. At that point, you are like literally feeding into the thing that he's arguing against.
01:39:29
Speaker
And like you were saying, right, if there's only one article and it's from The New Yorker and it's a puff piece, you need to get investigative journalists on it. It's not about sharing somebody's take that already exists. It's not about retweeting what even I said, right?
01:39:44
Speaker
You need to take that extra step. Retweeting can be good in terms of pushing the general energy of people's outrage in the right direction, but not everyone is online. You know, your mom, or people who just read print media, they're maybe not going to be aware of these things. So that's where the press comes in. And it almost doesn't feel coincidental that as all these publications have laid off so many staff, and actual film journalism is kind of being snuffed out and replaced with influencers and stuff like that...
01:40:34
Speaker
...all these corporations are trying to push through more insidious things, because they're like, who the fuck's going to call us on it? You won't even know that we're doing it. And that's why it's important, while we have the time now, to push back, right?
01:40:49
Speaker
Because it's not set in stone. They will change their mind, right? Any time in history where a big change happened, it seemed insurmountable at the time.
01:40:59
Speaker
And it was done away with quickly, right? And the only reason that things like that go away so quickly is because of public support, right? The pieces are already in place. Everyone already doesn't like AI, right? Everyone already doesn't like the things that it creates and what it represents, right?
01:41:14
Speaker
So... Just take the extra step. Just keep leaning. Keep pushing. Right. Because if you do just that bare minimum. Right. Rather than going after other people. You know. You're going to find a lot more

Directing Criticism Toward Material Harm

01:41:25
Speaker
success than just kicking your feet and saying, oh well.
01:41:28
Speaker
I watched a Wong Kar Wai film instead. And I know that that's good. Right. Yeah. When you're doing that. You're just sitting on the sidelines. Right. Yeah, I don't know. Apply some of that snark to the actual people out there creating material harm in the world instead of to someone who just wants people to give a shit. Exactly.
01:41:47
Speaker
Yeah. And that extends to those smaller filmmakers we were talking about before, like Robert Eggers and Ari Aster, right? What material harm are they doing in the world by making those movies?
01:41:58
Speaker
None. Yeah. So why are they villains, right? I just don't like their vibe. I don't know, it's just... I don't like his face. They make me feel bad.
01:42:10
Speaker
He looks evil. He looks like he thinks he's smarter than me. Just by looking at him, I can tell how many books he's read. Yeah. I mean, you know Eggers has read a fuck ton of books. And that's why I don't like him. Because I don't even finish the books I check out from the library.
01:42:29
Speaker
I don't even read books anymore. I just feel them. I just, like, take the page and rub it between my hands like this, you know, and I get all of the information from friction.
01:42:40
Speaker
I mean, hey, that's more than a lot of people do, so I think you're getting something there. I'm just being sympathetic to people who read Braille. Why didn't Burgess Meredith go find the Braille books at the end? Just go ahead.
01:42:56
Speaker
You know, The Twilight Zone is supposed to be a morality tale, and a lot of times it has that track of, oh yeah, these people were their own undoing. His crime was, I guess, that he was happy everyone was gone? It's just that he had a shitty attitude about something that already happened. But what else is he supposed to do? The world already ended. So it doesn't matter whether he's sad at humanity being gone or not.
01:43:23
Speaker
If anything, he should be rewarded for having a can-do attitude, like, oh well, silver lining, get some reading done. It fucked over literally everyone else, but at least I got mine, right? And again, that feeds into the AI conversation, right? Where if you are only accepting the thing that exists insofar as it profits your bottom line or makes your life better, right?
01:43:48
Speaker
Then you're being selfish. It's not a principled standpoint, right? That's why The Twilight Zone is timeless, you know. It was about AI. I didn't even know. You know, actually, I think Rod's... I almost said Rod Stewart.
01:44:02
Speaker
Rod Serling. Rod Serling was here. Rod Serling actually said to me before he died, he said, Doug, this is a pencil. This is a tool. It's just like, just like any other tool. AI is a tool.
01:44:19
Speaker
Man, Natasha Lyonne. I don't know why we make fun of her so much, but she's coming up as a villain on this podcast, and it's really fun. She really showed her whole ass this year. I feel like people were kind of like...
01:44:35
Speaker
Yeah, I feel like that kind of schtick can only carry you so far, and it already felt like the energy was waning. But then the fact that, like, oh, you're actually full of shit.
01:44:47
Speaker
Because you have this power and platform, she's, you know, creating a studio and stuff, and it's like, that's going to be part of your thing? That's what you're using your power to do? It's just so incredibly lame, right?
01:45:01
Speaker
yeah and and And like you said, it ruined her social cachet. Right. And that should give people who are in the margins who are like, should i even care about AI? That itself, what happened around Natasha Lyonne should give you the entire reason to go against it. Right.
01:45:17
Speaker
She lost all that cachet. All of a sudden, nobody was watching Poker Face anymore. Season two just came out around the same time she made those comments, and I didn't see a single person talking about it. You know, no one saying how good it was beyond, like, film critics, right?
01:45:32
Speaker
So the thing is, they are taking the wind out of their own sails by doing that, and you do not need to protect them in any way, right? And watching Natasha Lyonne's own undoing, right, should be a lesson to everybody: you give them the rope to hang themselves with and they'll do it for you in a very fast, extravagant way.
01:45:51
Speaker
And we can all sit on the sidelines and laugh together because we're ideologically in opposition to their goals. It's a lot more fun this way, guys. Trust me. Actually, if you support AI, you won't be cool, and you want to be cool, right? You want to be part of the in crowd.
01:46:06
Speaker
Oh, come on and It's it's the easiest way to make this argument, right? It's like how ah so if you base your ideology around just like not being a dick to people, right? like You will, you will like fall into a lot of great arguments, right? Supporting great causes, right?
01:46:23
Speaker
Just based on that sole throughline. And because of that, you're also cool as a side effect of just not being a dick, right? The same principle applies in this, right? Where if your baseline ideology is true, if it comes from a sincere place that's actually not, you know, hateful or built out of spite...
01:46:42
Speaker
You don't even have to do the rest of the work. Well, you do. You should do the rest of the work. But you need to make sure that you have those principled ideologies in place when you're making these arguments, because that's how you're going to make any kind of progress forward.
01:46:55
Speaker
Amen. i mean, that's the I don't I don't have anything else else to to add to that. I mean, that's yeah, that's that's what I got. Happy Labor Day, everyone.
01:47:09
Speaker
Do less labor. Yeah, don't get confused because there's labor in the title. You're like, do I work today? No, it's kind of a misnomer like that. Like, if you play the game Look Outside, don't look outside, because you'll get a game over screen immediately. I made that mistake. You look outside, you could turn into, like, an eldritch horror or something. So don't do it.
01:47:35
Speaker
That's just bad game design. They told you to look outside, then you did it. Like at that point, they just lied to you. That's cheap, you know? A voice just tells you like, hey, look outside. I'm like, OK, the game game told me to do it.
01:47:47
Speaker
The window was highlighted in yellow, so I knew I should have done it. You know, although I do like that trend in games, especially when action games started being like, hey, why'd you just kill all those people we told you to kill?
01:47:58
Speaker
That was fucked up. Like, wait, you told me to do this. You put the game mechanic in. You made it fun, asshole. I didn't want to explode those dogs in The Last of Us Part II, but you gave me explosive arrows and the dogs are scary, so I don't know what the fuck I'm supposed to do about it.
01:48:22
Speaker
He's never going to stop being the juice.