
Toward a Luddite Pedagogy in the "Age of AI" w/ Charles Logan

E150 · Human Restoration Project

“Were we required to characterize this age of ours by any single epithet, we should be tempted to call it, not an Heroical, Devotional, Philosophical, or Moral Age, but, above all others, the Mechanical Age. It is the Age of Machinery, in every outward and inward sense of that word; the age which, with its whole undivided might, forwards, teaches and practices the great art of adapting means to ends. Nothing is now done directly, or by hand; all is by rule and calculated contrivance. For the simplest operation, some helps and accompaniments, some cunning abbreviating process is in readiness. Our old modes of exertion are all discredited, and thrown aside. On every hand, the living artisan is driven from his workshop, to make room for a speedier, inanimate one. The shuttle drops from the fingers of the weaver, and falls into iron fingers that ply it faster.”

This is how Scottish historian & writer Thomas Carlyle characterized Great Britain’s mechanized, steam powered industrial era in 1829. These changes in the human relationship to production rippled through the world economy with profound social, political, & environmental implications. One loosely organized group, the Luddites, emerged early on to smash the new machines and resist mechanization of the mills.

200 years after Carlyle’s “Age of Machinery”, we find ourselves sold a new Age, the Age of automation and AI, which promises another transformation in the way we live, work, AND learn, with similar social, political, and environmental consequences. At least, the AI-hype cycle is real. Sal Khan’s new book, for example, Brave New Words: How AI Will Revolutionize Education (and Why That's a Good Thing) promises to be “required reading for everyone who cares about education.”

But what should be the relationship of education, automation & artificial intelligence? Should there be one at all? How much power – not to mention student data – should educators cede to the new machine in the Age of AI? 

Or…should the answer be a 21st century Luddite revival and mass resistance to the vision of the future offered by Google, OpenAI, and Microsoft?

That, I suspect, will be the argument of my guest today, Charles Logan, a Learning Sciences PhD Candidate at Northwestern University, writing earlier this year for the Los Angeles Review of Books, “Ultimately, the Luddites’ militancy and commitment to resistance might be a necessary entry point for how laborers—and teachers, students, and caregivers—can take an antagonistic stance toward AI and automation, and create a new ‘commons.’”

Toward A Luddite Pedagogy

Should We Be More Like The Luddites?

Inspiration from the Luddites: On Brian Merchant’s “Blood in the Machine”

Learning About and Against Generative AI Through Mapping Generative AI’s Ecologies and Developing a Luddite Praxis

Record being placed on a record player.wav by HelterSkelter1114 -- https://freesound.org/s/409036/ -- License: Attribution NonCommercial 4.0

rope-making machinery running.wav by phonoflora -- https://freesound.org/s/201166/ -- License: Attribution 4.0

Transcript

Introduction to the Conference to Restore Humanity 2024

00:00:06
Speaker
2022 was a system reboot.
00:00:09
Speaker
2023 broke the doom loop.
00:00:11
Speaker
This year is all about turning vision into reality.
00:00:15
Speaker
Conference to Restore Humanity 2024 is an invitation for K-12 and college educators to build your joyful, reimagined classroom.
00:00:25
Speaker
Our conference is designed around the accessibility and sustainability of virtual learning, while engaging participants in an environment that models the same progressive pedagogy we value with students.
00:00:37
Speaker
Instead of long Zoom presentations with a brief Q&A, our flipped keynotes let the learning community listen and learn on their own time, then engage in a one-hour Q&A with our speakers.

Progressive Education and Global Equity

00:00:49
Speaker
Dr. Mary Helen Immordino-Yang makes the neurobiological case for progressive education rooted in her groundbreaking work in affective neuroscience.
00:01:00
Speaker
Dr. Carla Shalaby demonstrates the power of education as the practice of freedom, honoring young people's right to be fully human.

Workshops on Interdisciplinary Methods and Global Solidarity

00:01:09
Speaker
Dr. Sawsan Jaber elevates the voices of Arab and Muslim students as an advocate for global equity and justice.
00:01:17
Speaker
And Orchard View School's Innovative Learning Center showcases healthy, sustainable community learning spaces for teenagers and adult learners alike.
00:01:26
Speaker
Beyond our flipped keynotes, participants will be invited to join week-long learning journeys.
00:01:32
Speaker
Join Trevor Aleo on a journey to learn interdisciplinary inquiry-based methods to equip students as knowledge producers, communicating with zines, podcasts, and more.
00:01:43
Speaker
and understand the ripple effects of modern imperialism with a focus on Palestinian resilience and classroom tools for fostering global solidarity in our second workshop led by Abeer Ramadan-Shinnawi.

Conference Details and Logistics

00:01:57
Speaker
We're also featuring virtual school tours so you can see progressive practice in action at the Nova Lab, Olentangy STEM Academy, Community Lab School, and more.
00:02:07
Speaker
The mission of reshaping education systems
00:02:10
Speaker
of turning vision into reality is vital for a sustainable and just future.
00:02:16
Speaker
Conference to Restore Humanity runs July 22nd through the 25th, and as of recording, early bird tickets are still available.
00:02:24
Speaker
It's $150 for four days with discounts available, group rates, and parity pricing.
00:02:30
Speaker
Plus, we'll award certificates for teacher training and continuing education credits.
00:02:35
Speaker
See our website, humanrestorationproject.org, for more information.
00:02:39
Speaker
And let's restore humanity together.

Introduction to Episode 150 and Overview of the Human Restoration Project

00:02:58
Speaker
Hello and welcome to episode 150 of the Human Restoration Project podcast.
00:03:03
Speaker
My name is Nick Covington.
00:03:05
Speaker
Before we get started, I wanted to let you know that this episode is brought to you by our supporters, three of whom are Kimberly Baker, Simeon Frang, and Corinne Greenblatt.
00:03:15
Speaker
You can learn more about Human Restoration Project on our website, humanrestorationproject.org, and connect with us anywhere on social media.

Historical Impact of Machinery and Modern Automation

00:03:28
Speaker
Were we required to characterize this age of ours by any single epithet, we should be tempted to call it not an heroical, devotional, philosophical, or moral age, but above all others, the mechanical age.
00:03:42
Speaker
It is the age of machinery in every outward and inward sense of that word.
00:03:47
Speaker
The age which, with its whole undivided might, forwards, teaches, and practices the great art of adapting means to ends.
00:03:56
Speaker
Nothing is now done directly or by hand.
00:03:58
Speaker
All is by rule and calculated contrivance.
00:04:02
Speaker
For the simplest operation, some helps and accompaniments, some cunning abbreviating process is in readiness.
00:04:09
Speaker
Our old modes of exertion are all discredited and thrown aside.
00:04:13
Speaker
On every hand, the living artisan is driven from his workshop to make room for a speedier, inanimate one.
00:04:20
Speaker
The shuttle drops from the fingers of the weaver and falls into iron fingers that ply it faster.
00:04:27
Speaker
This is how Scottish historian and writer Thomas Carlyle characterized Great Britain's mechanized steam-powered industrial era in 1829.
00:04:36
Speaker
These changes in the human relationship to production rippled through the world economy with profound social, political, and environmental implications.
00:04:46
Speaker
One loosely organized group, the Luddites, emerged early on to smash the new machines and resist mechanization of the mills.

Critical Perspective on AI in Education

00:04:56
Speaker
200 years after Carlyle's age of machinery, we find ourselves sold a new age, the age of automation and AI, which promises another transformation in the way we live, work, and learn, with similar social, political, and environmental consequences.
00:05:13
Speaker
At least the AI hype cycle is real.
00:05:17
Speaker
Sal Khan's new book, for example, Brave New Words, How AI Will Revolutionize Education and Why That's a Good Thing, promises to be, quote, required reading for everyone who cares about education, end quote.
00:05:31
Speaker
But what should be the relationship of education, automation, and artificial intelligence?
00:05:36
Speaker
Should there be one at all?
00:05:38
Speaker
How much power, not to mention student data, should educators cede to the new machine in the age of AI?
00:05:46
Speaker
Or should the answer be a 21st century Luddite revival and mass resistance to the vision of the future offered by Google, OpenAI, and Microsoft?

Understanding the Luddite Movement

00:05:56
Speaker
That, anyway, I suspect, will be the argument of my guest today, Charles Logan, a learning sciences PhD candidate at Northwestern University.
00:06:05
Speaker
Writing earlier this year for the Los Angeles Review of Books,
00:06:09
Speaker
Ultimately, the Luddites' militancy and commitment to resistance might be a necessary entry point for how laborers and teachers, students, and caregivers can take an antagonistic stance towards AI and automation and create a new commons.
00:06:26
Speaker
Thank you so much, Charles, for joining me today.
00:06:29
Speaker
Thanks for having me, Nick.
00:06:29
Speaker
I'm excited to be chatting with you all today.
00:06:31
Speaker
So I think in the popular imagination, a Luddite is like an octogenarian congressperson who doesn't use email or someone who's like entirely technophobic, perhaps even culturally conservative, right?
00:06:45
Speaker
Preserving the old ways against progress.
00:06:47
Speaker
Yeah.
00:06:49
Speaker
One of the earliest pieces that you shared with me is from Hybrid Pedagogy, and it was written in 2014, fully a decade ago, right?
00:06:56
Speaker
And by an author going by Torn Halves, called Toward a Luddite Pedagogy.
00:07:01
Speaker
And I was reading back through that and saw that it makes no mention of AI.
00:07:06
Speaker
So it's clearly an idea that existed alongside tech for at least that long.
00:07:11
Speaker
So help us understand who the Luddites were.
00:07:15
Speaker
What were they about?
00:07:16
Speaker
And what can we learn from
00:07:18
Speaker
how they approached technological change in their own time.
00:07:22
Speaker
Sure thing.
00:07:23
Speaker
Yeah, I will do my best to sort of situate the Luddites in history.
00:07:27
Speaker
And I'm working on a piece right now with Phil Nichols and Antero Garcia that tries to track what we've come up with as three different waves of Luddites.
00:07:35
Speaker
But I think it's important to start with the OGs, the original Luddites.
00:07:40
Speaker
So as you mentioned, it's the early 19th century in England, and you have the long history of different sorts of cloth workers,
00:07:49
Speaker
who all of a sudden are faced with automation.
00:07:52
Speaker
And so you have early industrial capitalists who are building factories, who are shifting the nature of labor because of these technologies that are displacing people, are allowing capitalists to depress wages, and that are having these rippling effects on workers themselves.
00:08:13
Speaker
but also their communities.
00:08:14
Speaker
And so these automating technologies are threats to both the livelihood of these laborers, but also the dignity in which they conceive of themselves and their work and the great sort of traditions that have brought them to this place in history.

Reframing Luddite Legacy in Modern Context

00:08:30
Speaker
And so the Luddites are actually named after a fictional character, Ned Ludd,
00:08:35
Speaker
who, the story goes, was punished by one of his bosses, essentially, and then decided to destroy the automating machine, a spinning jenny or another kind of mill machine.
00:08:50
Speaker
And so emerges as this folk hero, again, in a part of the world where we're not strangers to folk heroes like Robin Hood.
00:09:00
Speaker
And then you have, as you mentioned, this loose, but also connected set of political projects that go under the name of the Luddites across England in the early 19th century.
00:09:12
Speaker
And you have folks who
00:09:13
Speaker
are strategically under the cover of darkness, breaking into factories, breaking into factory owners' homes, and destroying what they called the obnoxious machines, the machines that were threats to their livelihood, to their dignity.
00:09:31
Speaker
And so again, I will emphasize it is strategic sabotage.
00:09:35
Speaker
It is not haphazard.
00:09:36
Speaker
And so as you mentioned, over the last 200 plus years,
00:09:41
Speaker
Luddite has become a pejorative.
00:09:43
Speaker
I think I was looking at dictionary.com, which has those words of the day and a corresponding image.

Skepticism Towards Tech Giants and AI in Education

00:09:49
Speaker
And the corresponding image was, as you said, an octogenarian looking at her phone as if it were a fish.
00:09:56
Speaker
And so we've seen these sort of projects then to rehabilitate and sort of reframe the Luddites over the last 250 plus years.
00:10:06
Speaker
And so that happened with historians in the early to mid 20th century.
00:10:10
Speaker
And then you have moving into the 1970s,
00:10:13
Speaker
what Phil and Antero and I have called the second wave of Luddism.
00:10:18
Speaker
And that is a different group of people.
00:10:22
Speaker
You have folks like Kirkpatrick Sale, one of the leaders of these new Luddites, who are questioning, you know,
00:10:30
Speaker
technology at a time of the Cold War and the growth of nuclear energy and weapons — environmentalists and pacifists, Quakers.
00:10:39
Speaker
And so there's more of sort of this motley crew and this big tent.
00:10:44
Speaker
But unlike their predecessors, they were often avowed pacifists, as I mentioned.
00:10:50
Speaker
And so they're moving away from that original tactic of physical destruction of infrastructure.
00:10:56
Speaker
and more of a kind of set of beliefs to organize around.
00:11:01
Speaker
And then the, you know, the term kind of falls out of favor again.
00:11:06
Speaker
And it's only been, I would say, in the last sort of,
00:11:09
Speaker
Well, you mentioned Torn Halves' piece from 2014, and Audrey Watters picks up on that work from a few years ago.
00:11:16
Speaker
And then you also have this sort of growth in a lot of tech critics like Brian Merchant and Paris Marx and others who are reclaiming the name and the identity of a Luddite.
00:11:29
Speaker
It is a very fluid term.
00:11:31
Speaker
I think that is something to acknowledge that has changed over the course of time.
00:11:37
Speaker
Now, I think, as you mentioned, as we have the Sal Khans of the world who are doing their best to infuse our classrooms with their proprietary chatbots, I think we can look to the Luddites again and think about, well, what would it mean to practice this Luddite praxis, and what kinds of interventions, what kind of tactics, what kind of sabotage, what kind of organizing can be done and inspired by the Luddites here in

Critique of AI-driven Education Promises

00:12:03
Speaker
the year 2024?
00:12:03
Speaker
Yeah.
00:12:05
Speaker
Well, let's go ahead and bring in, since you mentioned Sal Khan, since we're kind of transitioning into talking about what would be considered the third wave in your take on third wave Luddism here.
00:12:17
Speaker
So you've been skeptical of announcements from OpenAI's Sam Altman, Microsoft's Bill Gates, and kind of working in tandem here.
00:12:26
Speaker
And you've been sharing your criticism of Sal Khan's newest book on social media, which I've enjoyed reading there too.
00:12:31
Speaker
So what even is that transformation that they're promising?
00:12:34
Speaker
What tactics of the new modern industrial capitalists are the third wave Luddites reacting to?
00:12:42
Speaker
Can you outline that for us?
00:12:44
Speaker
Yeah, I mean, a lot of it comes down to the same dog and pony show that Sal Khan has been selling for a long time, and that is personalized learning.
00:12:52
Speaker
And he has this real sort of superpower to forget any sort of history.
00:12:57
Speaker
Maybe he never learned it in the first place.
00:12:59
Speaker
He's been strategic about his own ignorance.
00:13:02
Speaker
And again, as an avowed Audrey Watters fanboy, she's written about the
00:13:07
Speaker
history of personalized learning, and begins her book — I believe the title is Teaching Machines: The History of Personalized Learning, MIT Press, go buy it now —
00:13:18
Speaker
And she starts with an anecdote about Sal Khan and how he, you know, sort of makes this entree into education after having been a hedge fund analyst, which, you know, again, I don't want to discount the possibility that someone could move from the business world into education and, you know,
00:13:36
Speaker
bring with them and develop a critical pedagogy.

Ecological Approach to AI and its Impacts

00:13:40
Speaker
I'd argue Sal Khan is not that person.
00:13:42
Speaker
And so it is, again, I think the story of personalized learning, of we can all of a sudden flip on this chatbot or open up the chatbot and here you have a personalized tutor who is capable of getting to know you, of providing real-time feedback, and all of a sudden you can achieve greatness.
00:14:05
Speaker
Reading the book, as painful as it is, I think that sense of automation of what it's like to be a student, what it's like to be a teacher, I think has the echo of early industrialists who are pushing similar sorts of automating technologies.
00:14:22
Speaker
And again, you see who's blurbed the book.
00:14:23
Speaker
You mentioned Bill Gates; the current CEO of Microsoft blurbed the book.
00:14:29
Speaker
Wendy Kopp blurbed the book; she's the founder of Teach for America, with its very sort of problematic approach to sustainable teaching in our most vulnerable and under-resourced communities.
00:14:42
Speaker
And again, technology being turned to as a kind of panacea.
00:14:46
Speaker
which again, there's a precedent for that.
00:14:48
Speaker
And that's B.F. Skinner and his teaching machines, who sought to bring these teaching machines into Harlem and other under-resourced communities as, again, answers for complex social problems that date back decades, if not centuries.
00:15:03
Speaker
And so this techno-solutionism as, again, the kind of panacea that Khan is pushing in conjunction with folks like — he refers to
00:15:11
Speaker
Sam Altman as Sam, and he went to go see Bill Gates right after he saw Sam.
00:15:15
Speaker
And so there's this very sort of like chummy group.
00:15:18
Speaker
And so, you know, reading this and thinking about Audrey Watters' work and others that have been very influential, and thinking about, well, this turn to Luddism, what does that offer us?
00:15:29
Speaker
Because I think it's important to acknowledge that, and this is something that I've wrestled with in thinking about, well, what are potential shortcomings of Luddism, is you can't easily smash a chatbot.
00:15:42
Speaker
You could smash the Chromebook that you're using and then have to turn around and probably pay your school.
00:15:51
Speaker
And so there's a real kind of limitation to what Luddism looks like in the 21st century that I don't think
00:15:59
Speaker
enough folks who have claimed that as an identity acknowledge.
00:16:05
Speaker
And yet, I think that that spirit of sabotage, of a more militaristic stance towards the technology, a more intentionally aggressive stance is helpful because there are these emerging technologies, Nightshade, I think is one of them, where you can
00:16:25
Speaker
upload images that essentially will poison a data set.
00:16:29
Speaker
And so there are these tools that exist that, again, I would argue are Luddite tools.
00:16:35
Speaker
Their makers may not claim them as Luddite, but I think we can view them through that lens.
00:16:40
Speaker
And then I also think that skepticism, that organizing, was an important piece of the Luddites' approach.
00:16:49
Speaker
The original Luddites are organizing under the cover of darkness, in these sort of backroom, you know, taverns, and trying to figure out, like, what do we do about this technology?
00:17:02
Speaker
How do we counter it?
00:17:04
Speaker
And so I think there's a lesson there for teachers, for students, who are all of a sudden facing
00:17:11
Speaker
these technologies and being told that this is the future of education, and that organizing is an important piece to pull out from the Luddites, as well as those tactics of sabotage, as well as the playfulness of the Luddites. So, you know, they're
00:17:29
Speaker
writing poems, they are inspiring poems.
00:17:33
Speaker
Lord Byron is, I think, the most famous poet who kind of came to the Luddite cause.
00:17:38
Speaker
They're sending these missives to various political leaders and nailing these threatening notes to doors.
00:17:45
Speaker
They're dressing in drag.
00:17:47
Speaker
And so I think there's this sense of deadly seriousness alongside that playfulness.
00:17:52
Speaker
And I think there's a lot to be said about that playfulness as sort of like a public pedagogy that
00:17:59
Speaker
we see in education. And I think also, you know, folks — I'm thinking of Emily Bender and Alex Hanna — have a wonderful podcast.
00:18:09
Speaker
It's like a live stream.
00:18:11
Speaker
It's called Mystery AI Hype Theater 3000, something like that.
00:18:16
Speaker
Yeah.
00:18:18
Speaker
And they basically just like, I don't know if I can say this on the show, shit talk.
00:18:24
Speaker
these latest bits of AI hype — an article or whatever it might be — across different disciplines, including education.
00:18:31
Speaker
And I think it's a profoundly like educative act to like have this show that they then put out in a podcast, in a newsletter, the video.
00:18:41
Speaker
I mean, it's not hosted on YouTube.
00:18:44
Speaker
So you're not putting money in Alphabet's pocket.
00:18:47
Speaker
And so, again, I think there are those tactics of sabotage, of organizing, of playfulness, and also acknowledging that even with the historical Luddites, the Luddites and their kind of projects were connected, but they were also hyper-local to the different regions.

AI Literacy and Ecological Impacts

00:19:01
Speaker
And so acknowledging that what we as educators are doing with our students is always contingent on context.
00:19:08
Speaker
It's always contingent on the people in the room and the constraints that we're facing, given who we are and where we are.
00:19:16
Speaker
And so I think that's an important piece to uplift from the original Luddites as well: a lot of these different Luddite projects, while loosely connected around, you know, sabotage and organizing to protect their livelihood and dignity against the incursion of these automating technologies, do not have, you know, uniform sets of beliefs or uniform goals.
00:19:39
Speaker
And I think that's something that we can also hold on to as we think about our own kind of like
00:19:44
Speaker
heterogeneity in 2024 and thinking about what would it look like
00:19:49
Speaker
to practice a Luddite praxis in response to this age of generative AI that we find ourselves in.
00:19:59
Speaker
And I don't even like calling it the age of generative AI.
00:20:02
Speaker
That, I feel like, puts it up on a pedestal where it should not be.
00:20:07
Speaker
We don't need to cede that ground to them.
00:20:09
Speaker
It's this flashpoint, this sizzle of AI in
00:20:15
Speaker
education that will more than likely burn out, hopefully sooner rather than later.
00:20:21
Speaker
And I think that's a great way to get into the obfuscation of the AI hype cycle — it's the age of AI hype, if it's not the age of generative AI, right?
00:20:31
Speaker
It's the age of the AI salesman or...
00:20:34
Speaker
You know, lump cryptocurrency somewhere in there, too.
00:20:36
Speaker
But, you know, as I'm looking at the blurbs for Sal Khan's book, right, from Arne Duncan, Tony Blair, you mentioned Wendy Kopp, formerly of Teach for America, now of Teach for All.
00:20:49
Speaker
Again, Sam Altman, Adam Grant, Bill Gates, right, all of these people.
00:20:53
Speaker
And in their blurbs,
00:20:54
Speaker
obfuscating something that's a little bit deeper.
00:20:56
Speaker
And I think it's something that you really had elaborated on in some work that you shared with me about understanding an ecological framework that actually helps us see past that hype cycle and understand the relationships between generative AI and how it's entwined with so many other aspects of life.
00:21:14
Speaker
And I think as far as third wave Luddism goes, I think those ecologies are really important to recruiting people to see past
00:21:22
Speaker
you know, that obfuscation and that veil of, hey, here's how it's going to transform education.
00:21:26
Speaker
But hey, how is that going to impact the environment or human labor or any of those other features?
00:21:32
Speaker
Could you explain what you mean by that ecological framework and why that matters?
00:21:38
Speaker
Yeah, I'd be happy to.
00:21:40
Speaker
And so let me start by contrasting it, I think, with the more common framework that I've encountered,
00:21:47
Speaker
when working with young people and teachers about AI and learning with AI, and that's AI literacy.
00:21:55
Speaker
And so I think AI literacy,
00:21:57
Speaker
One of the critiques of literacy as a framework that I'm familiar with is that literacy tends to focus on representative forms.
00:22:06
Speaker
And so I think that's where you can see, you know, the let's develop an AI literacy curriculum that is going to be based on the assumption that these technologies should be used.
00:22:17
Speaker
But let's be careful about how we use them because the technology, they're bullshit machines.
00:22:23
Speaker
And they're just, you know, mathy math that is going to spit out like predictive text.
00:22:28
Speaker
And that text is biased.
00:22:30
Speaker
And so let's attend to the bias.
00:22:31
Speaker
And let's be careful not to just take whatever comes out of a chatbot as the truth.
00:22:38
Speaker
So there's like that conversation.
00:22:40
Speaker
Obviously, there's the whole plagiarism conversation.
00:22:42
Speaker
And those are important conversations to have.
00:22:45
Speaker
But I don't think...
00:22:47
Speaker
they do the work that we need to really understand how these technologies are operating and then also don't open up enough space for interrogating and intervening in the technologies.
00:23:00
Speaker
So I'm all for an AI literacy that is expansive in its approach that pushes back against the assumption that AI should be used in schools.
00:23:11
Speaker
I'll also note that, and this is work that
00:23:13
Speaker
Phil Nichols has done and others around what they call the capture of AI literacy and the notion that literacy as a project is a speculative one of sort of thinking about what the world could be and a hopeful place.

AI's Role in Climate Change and Power Structures

00:23:27
Speaker
And so that's when literacy is at its best.
00:23:30
Speaker
But that
00:23:31
Speaker
big tech companies are able to use these literacy frameworks as a means of what Luci Pangrazio calls a soft power of governance, framing the technologies as necessary.
00:23:43
Speaker
Again, with some caveats of like, we know they're biased and we know they're potentially discriminatory.
00:23:49
Speaker
But, you know, that's for you all to figure out.
00:23:51
Speaker
And here's this curriculum you can use to figure it out.
00:23:53
Speaker
So that, you know, AI literacy, I've got a complicated relationship with.
00:23:59
Speaker
The ecological framework, I think, is something that I find more compelling in part because it moves beyond just, you know, the interface of a platform on your device.
00:24:10
Speaker
And it takes into account these, I just mentioned these obfuscated systems of labor.
00:24:16
Speaker
So you have, right, folks who
00:24:18
Speaker
Often in the global South, doing what's often referred to as ghost work — the notion of unseen labor — where they're training the data that makes AI AI, and AI is people all the way down.
00:24:31
Speaker
So they're doing this really gruesome work of labeling toxic material, traumatizing material.
00:24:38
Speaker
And so I think that's one dimension of technology.
00:24:42
Speaker
in that ecological framework that's important to note: labor.
00:24:46
Speaker
I said it's people all the way down, AI is, but it's all the way down into the very waters and earth and soil that make up our planet.
00:24:56
Speaker
And from the mining that goes into producing the computer chips, the materials for the data centers in which the infrastructure of AI is rapidly being spread throughout the world.
00:25:11
Speaker
And again, the amount of water required to cool data centers, to cool the servers that make not just AI, but the internet possible.
00:25:18
Speaker
So there's like a real material aspect to the technology that I think is important for students to interrogate, especially because these are the young people who are going to bear the brunt of climate change.
00:25:30
Speaker
And so I find it increasingly difficult to square the narrative that technology, AI technology specifically, is going to revolutionize education at a time when that same technology is increasing the
00:25:45
Speaker
effects of climate change and is further entrenching the power of fossil fuel companies.

Balancing AI's Educational Benefits and Ethical Considerations

00:25:49
Speaker
And there's a headline today about the US continuing coal production because of the energy consumption needed for AI production.
00:25:59
Speaker
So it is this, I think, hypocrisy and a real set of compromises that I think folks need to confront.
00:26:06
Speaker
And this is what I tried to do with my own students is like, if you're going to use these technologies, there are real material harms that they're doing to people
00:26:14
Speaker
to planet, to humans, to more than humans.
00:26:17
Speaker
And, you know, you have alluded to this, but we also see, right, the ideological and financial benefits that go to people like Bill Gates and Sam Altman.
00:26:25
Speaker
There's been writing about what's called the TESCREAL bundle from people like Émile Torres and Timnit Gebru, and this sort of
00:26:33
Speaker
set of ideologies essentially is like eugenics and the notion of AI as some sort of superhuman, you know, project that is going to — I'm doing a very slapdash version of the TESCREAL bundle here.
00:26:47
Speaker
We'll add some notes in the show notes, I suppose.
00:26:50
Speaker
And in short, I think what an ecological framework offers is a more expansive set of questions, a more expansive set of sort of lines of inquiry than these kind of traditional corporate
00:27:03
Speaker
AI literacy curricula offer.
00:27:05
Speaker
Yeah, I was just going back and reading your tweet in response to the plan to, or to unretire coal-fired plants as power demand from AI surges.
00:27:15
Speaker
And you wrote, AI is supposedly revolutionizing the future of education while making the actual future increasingly bleak.
00:27:21
Speaker
I don't know if there's like a sci-fi analogy or a monkey's paw or something like that, but it's literally as we're revolutionizing one space and immiserating the future in another one.
00:27:31
Speaker
It's just like there's a direct cost related to all of those things.
00:27:35
Speaker
I think...
00:27:36
Speaker
Perhaps like a criticism that I've seen of this Luddism when I had kind of posted a preview of our conversation that came up was about its potential to help with assistive technologies, the possibility to be a huge boon for disabled people, or to even be assistive in the sense of providing accurate translation services or transcription services.
00:28:00
Speaker
And I think I'll mix my literary analogies a little bit more.
00:28:03
Speaker
My brain is kind of thinking in the sci-fi space, but
00:28:06
Speaker
I went to almost kind of like this William Gibson neuromancer cyberpunk space in the sense where, right, you have
00:28:13
Speaker
this overarching corporate structure and evil technology sort of predominates, but you have these niches of DIY, mutual aid, and the like kind of coexisting alongside that, like that's invisible to the mainstream.
00:28:26
Speaker
I suppose that's not ideal.
00:28:29
Speaker
Or is it like the one ring?
00:28:31
Speaker
Should AI be buried deep in a mountain or thrown into Mount Doom because simply interacting with it has such a profoundly negative
00:28:40
Speaker
and transformative consequence for users and the environment as a whole, as you said, like embedding these ideologies at the cost of this invisible human labor.

Power Dynamics and Ethical Approaches to AI in Education

00:28:50
Speaker
And the immiseration of the environment.
00:28:52
Speaker
So I wonder, like, I just don't know how this is the question that I grapple with, right?
00:28:58
Speaker
Like, how do we, oh gosh, I don't want to say take the good, but, you know, like use it for where it could be, like for more humane,
00:29:06
Speaker
pro-human purposes?
00:29:08
Speaker
Or is it radioactive in the sense that it's better off just not to touch it, develop some other technologies?
00:29:14
Speaker
I don't know how to resolve this.
00:29:15
Speaker
I'm hoping you'll help me out.
00:29:18
Speaker
What do we do?
00:29:19
Speaker
I came here for you to figure out my problems, not for me to figure out your problems.
00:29:23
Speaker
Yeah.
00:29:23
Speaker
So, I mean, again, I would think, you know, what do the historical Luddites from, you know, the early 19th century tell us?
00:29:29
Speaker
And one of the things, again, to emphasize is that they're not, they're not anti-technology, but they're anti the concentration of power into the hands of a very few, you know, white men and corporations.
00:29:40
Speaker
And flash forward, and we look at, you know, one example that,
00:29:46
Speaker
I think can provide a kind of blueprint for the assistive technologies that you're talking about: there's an organization, Te Hiku Media, and they are based in New Zealand, which is the westernized name of the indigenous lands.
00:30:05
Speaker
And they produce Maori chatbots and other AI for language acquisition and language revitalization.
00:30:14
Speaker
And I just saw one of their, like the chief information or chief technology officer came and spoke at the university recently.
00:30:21
Speaker
And there's a great episode featuring him on the podcast, Tech Won't Save Us.
00:30:26
Speaker
And so, you know, a few attributes of this
00:30:30
Speaker
company and the technology that stand in stark contrast to the open AIs of the world is that they take data sovereignty very seriously.
00:30:40
Speaker
And so they have these from like cassette tapes to like CDs to, you know, digitized audio of several generations of Maori people speaking and telling stories.
00:30:53
Speaker
And they've approached these people and received their consent to use these stories and
00:30:58
Speaker
to train these models to revitalize and maintain their language and to teach people, Maori people, their language.
00:31:04
Speaker
And so that to me is one example of care for people, for their stories, for their data.
00:31:13
Speaker
It is not an attempt to scale.
00:31:16
Speaker
I think that notion of scale is so problematic in so many ways, especially when it comes to AI.
00:31:22
Speaker
but it is a community-based project where, yes, there are compute consequences, and I think they're wrestling with how can they use solar, how can they use other renewable energies for their projects, but it is done in conjunction with the community.
00:31:42
Speaker
And so they've been approached by...
00:31:45
Speaker
I think OpenAI and Google and others to kind of hand over their model or to help them train their other models, these automatic speech technologies.
00:31:56
Speaker
And they've said no.
00:31:58
Speaker
And so I think, again,
00:32:00
Speaker
if we're thinking about assistive technologies in schools or in other spaces, you know — and again, you mentioned these sort of DIY groups — let's fund the DIY groups and, you know, learn from one another in a more decentralized way, rather than concentrate the funding and the compute power in the hands of, like, three companies, or like NVIDIA being the only company making the, you know, chips to run these technologies.
00:32:28
Speaker
So,
00:32:29
Speaker
That, again, I think is in the spirit of the historic Luddites.
00:32:35
Speaker
And I think, and again, they may not call themselves Luddites, these Maori folks doing this work.
00:32:41
Speaker
They may not call themselves Luddites.
00:32:42
Speaker
And I think that's something to acknowledge, too, that if these sets of ideas are helpful, I think that's one of the arguments we make, then great.
00:32:50
Speaker
But also, you know, the Luddites — and predominantly the people who have claimed that name — are white men.
00:32:56
Speaker
And so that is something to acknowledge and something to grapple with.
00:32:59
Speaker
And even today, I mean, I think, you know, that the group has become more expansive over time.
00:33:04
Speaker
So I think that's important to acknowledge.
00:33:07
Speaker
But I think that those tactics, that the notion of playfulness, of organizing, of sort of community-based solutions to technology that embrace technology and work with people on the ground to think about what are problems that we as a group face and how can we use our traditional knowledge to address these problems without then
00:33:29
Speaker
having to rely on the Googles and the metas of the world.

Exploring Luddite Ideas for Modern Challenges

00:33:34
Speaker
That, I think, is a promising example — the best example that I've come across that I can point to to say it's not either we have AI or we don't, but it's, again, these sorts of very grounded projects in community that I think offer a third way forward, and that I think also is in the spirit of Luddism.
00:33:59
Speaker
I think that's a great point to bring it back to not the notion of technology, but the notion of power really situated in the work of the Luddites in really all, perhaps each of those waves.
00:34:10
Speaker
So I'm wondering — you've mentioned a whole bunch of different
00:34:14
Speaker
authors, podcasts, anything else, are there some highlights or are there like your top tier hits for people to go find out and learn more about these ideas?
00:34:26
Speaker
Where would you point people who want to kind of learn more, perhaps join the third wave Luddism?
00:34:32
Speaker
Yeah.
00:34:33
Speaker
So someone who's been writing about Luddites for a while is Zachary Loeb.
00:34:38
Speaker
And he publishes under the name of Librarian Shipwreck.
00:34:43
Speaker
And that is on social media and he has a blog.
00:34:47
Speaker
And he's got a few posts where he also has like a top 15 readings on Luddites.
00:34:55
Speaker
Because I'm not an expert, let's be clear.
00:34:57
Speaker
I've done some light reading.
00:34:59
Speaker
And so hopefully if a lot of experts out there are listening to this, please take anything I say with an asterisk next to it.
00:35:08
Speaker
Brian Merchant's book that came out, that Phil and Antero and I reviewed for the LA Review of Books, is called Blood in the Machine.
00:35:16
Speaker
I have it here in front of me, The Origins of the Rebellion Against Big Tech.
00:35:20
Speaker
That one's great.
00:35:21
Speaker
And that gives you, again,
00:35:22
Speaker
sort of on the ground historic Luddites, but then he fast forwards to contemporary times and looks at different sort of like the gig work economy and draws parallels between the kinds of organizing and resistance that contemporary gig workers are doing.
00:35:39
Speaker
Gavin Mueller has a book that came out a few years ago called Breaking Things at Work.
00:35:44
Speaker
And that's another one.
00:35:46
Speaker
So I'd say
00:35:47
Speaker
Start with those three.
00:35:48
Speaker
And, you know, if you really want to do deep dives, folks like David Noble, who's a historian and tech critic, and others, there's no shortage there.
00:36:00
Speaker
of work on the Luddites.
00:36:01
Speaker
And I think that is in part because they've become a source of interest, again, really only in the last 100 years, as sort of historians have sought to kind of rethink and reframe who the Luddites were and the kinds of political projects that they were engaged in.
00:36:17
Speaker
Because there is, I think, some debate about how organized they were and how much of, like, a political
00:36:22
Speaker
project around unionizing and as like a political class.
00:36:26
Speaker
And so you can, if you're, if you're into that kind of stuff, there's, there's certainly reading for you to find.
00:36:31
Speaker
Thanks so much, Charles, for joining me today.
00:36:33
Speaker
Yeah.
00:36:34
Speaker
Yeah.
00:36:34
Speaker
Thank you, Nick.
00:36:34
Speaker
I really appreciate it.
00:36:40
Speaker
Thank you again for listening to our podcast at Human Restoration Project.
00:36:43
Speaker
I hope this conversation leaves you inspired and ready to start making change.
00:36:47
Speaker
If you enjoyed listening, please consider leaving us a review on your favorite podcast player.
00:36:51
Speaker
Plus, find a whole host of free resources, writings, and other podcasts all for free on our website, humanrestorationproject.org.
00:36:58
Speaker
Thank you.