
The point is there ain't no point

S2 E2 · Online Education Across the Atlantic
161 plays · 17 days ago

In this episode of "Online Education Across the Atlantic," we discuss the extremes of attitudes towards technology in education, particularly around AI. Starting with Donald Clark's critique of an anti-tech keynote at the OEB conference, we dive into the 100% skepticism or 100% optimism we so often hear during keynotes. We explore the importance of organizing high-profile debates to tackle polarizing issues and reflect on the success of past debates. We also highlight the prevalence of critical voices in academia and the need for more balanced discussions on AI and tech in education. Join us for a rich conversation on the future of online learning!

00:00 Passenger saves taxi from crash in rain.

05:48 Discussion on tech skepticism and AI negativity.

10:17 Keynotes seemed unremarkable; celebrity status issue.

13:40 UK trends critical of edtech and AI.

16:02 Ed tech debate polarized, seeking balanced voices.

18:55 EU bloc critical towards technology developments.

22:43 Conference experiences amplify edtech divide, platform critical voices.

26:46 Debate got personal, especially regarding AI skepticism.

29:16 Perceived big tech takeover of higher education.

33:39 Tech development orthodoxy criticized for promoting bias.

36:07 Universities haven't fully adapted to digital era.

39:29 Debate planned between author and Stephanie Hall.

Transcript

Morgan's Eventful Trip

00:00:10
Speaker
Hello and welcome back to Online Education Across the Atlantic. We're here with our second episode of the second season, so we haven't gotten canceled yet. It's great to be here with Neil and Morgan. And actually, to get started:
00:00:26
Speaker
Morgan, you had this big trip set up: you're going to London, then the west of England, then Wales, then Berlin, then back home. Seems like things were a little more eventful than you expected while you were traveling. Yeah, yeah. So, you know, the trip from London to Plymouth, where I have a former colleague I went to go see, that went well. And sorry about my voice; I caught the plague on the way back.
00:00:55
Speaker
So it all worked well. I got on the train. The start was a little iffy because Paddington was jammed full of South African rugby fans on their way to Wales to trounce Wales in rugby. But I got on the train, it all worked well, I did some work, I looked at the lovely scenery, I got off in Plymouth, we were on time. It was all awesome.
00:01:20
Speaker
The next morning when I got out of bed, the wife of my host said to me, your train's been cancelled, because Storm Bert had blown in. The storm was disrupting the rail because parts of the line were flooded. So I walk into the station early, like two hours early, about 11 o'clock.
00:01:40
Speaker
The guy behind the ticket counter says, you've got to take a cab to Exeter, and I have no sense of the distance; it seemed like a long way. Anyway, he's not being helpful. So I walk outside and I'm thinking, oh shit, how do I get to Exeter?
00:01:55
Speaker
And how do I take a cab? Because every other person who's trying to take a train that day is also trying to take a cab. But then a woman standing near me said, are you wanting to take a cab to Exeter? And I said, yes. So she roped me in, along with two other women standing nearby, and she happened to have the phone number of one of the local cab companies in her phone, so she could call them directly.
00:02:23
Speaker
We had a cab come, a nice Welshman drove us all the way to Exeter, where we jumped on another train to Bristol, because they said they're going to route us through Bristol.

Nerve-wracking Cab Ride

00:02:32
Speaker
On the train they said, there's still one line running to London, we can maybe make it. We were calling ourselves the four Musketeers at this point. Anyway, we get to Bristol, the trains aren't running, we lose one of our number in the parking lot,
00:02:48
Speaker
and then one of the other people said, I'm not going to do this anymore, and she bailed. So we made two new friends and we caught a cab from Bristol to London, which, in case you're wondering, costs 450 pounds. But the adventure didn't end there. We get in the cab, but the A4, or whatever highway it was,
00:03:08
Speaker
is all blocked off because of traffic and things like that. So the cabbie turns off the main road and takes us through the countryside, and it's pouring with rain and it's black dark and it's a little scary. And then at a certain point I think he loses his sense of where he is on the road, because the road's covered in water. He was on the wrong side of the road, and a car starts coming straight at us, and he's just heading straight at this car. Fortunately, the guy in our little newfound gang was sitting in the passenger seat, and he grabbed the wheel and got us out of the way of the oncoming car. At that point, I felt sort of somehow
00:03:53
Speaker
looked after by some higher being, because not only was he a former policeman, he was a former driving instructor. So this was, to put it mildly, and to use a pun, directly in his wheelhouse. But for the rest of the way from Bristol to London, we all watched every single meter or yard or foot of that road, because it was terrifying. Anyway, we made it, and yeah, I saw more of England than I planned to, but it was all black and dark and wet.

Missed Meeting & 'Home Alone' Scenario

00:04:21
Speaker
Yeah, well, you guys got hit pretty hard in Wales too with that storm. Yeah, we did. Although it's one of those things where you get reporting about bad stuff happening in South Wales, but where I am it didn't actually affect us. But yeah, the combination of things: I was actually meant to meet Morgan last Monday. I got ill, and in any case, even if I hadn't been ill, I wouldn't have been able to get a train to London. And I don't fancy your kind of journey. I think what would have been perfect for your journey is if you'd had the Home Alone thing, where you're in a van with John Candy and a bunch of musicians making your way back to London. That would have been perfect, I think. Actually, it was even more perfect, because one of the gang was somebody who also teaches at one of the universities in London. So I was asking her about the VLE and about assessment, and I learned lots about assessment.
00:05:18
Speaker
How cool. As one tends to do on a weekend while holiday traveling. We were going with the full Planes, Trains and Automobiles metaphor for her travels. Although it's too bad that you guys didn't get to meet in person.

Tech Skepticism at Conferences

00:05:33
Speaker
We'll have to figure out the next step. Maybe getting you, Neil, out to San Diego for ASU+GSV.
00:05:39
Speaker
Yeah, that sounds good. Or, given the frequency with which you talk about Texas barbecue, maybe we need to meet there. Yeah, but then we'd have to go to South by Southwest. I'll go to Texas for a barbecue, but I'm not up for that conference.
00:05:56
Speaker
Well, then Morgan flew to Berlin, the other purpose of the trip, for the OEB conference. And that's actually what we wanted to cover today: some quick commentary that you had put in your post over the weekend. It was a short blurb, and I know you have full coverage of OEB that's going to come out before this podcast does. But you had talked about Donald Clark making some commentary on, basically, people invited to tech conferences who are not just "I have some questions" types but outright tech skeptics, and they almost get celebrity status; a lot of people looking at AI in particular but not looking at what it might do, just looking at the negative. So I'd like to pick up on this and get our discussion going: do we have an issue where people are getting too well known for being
00:06:55
Speaker
negative on tech, and AI in particular, while we're not really exploring what the potentials are as well as the downsides? So, just to have fun: Donald Clark's post on LinkedIn is titled "Doomster Futurist," and some people already appreciated the name "Doomster."
00:07:15
Speaker
He's talking about the opening keynote at OEB, by a German futurist, Gerd Leonhard, who literally wants to eviscerate AI. He does start out strong here: it was like a Nuremberg rally talk, massive widescreen graphics full of Sturm und Drang, encouraging us to embrace emotion, not reason. And then he gets into this whole approach. He made full-on attacks on Sal Khan, Mustafa Suleyman, and profiteering. But basically, Donald Clark's argument is that it became clear the keynote speaker had no idea what this technology was or how it works.
00:07:56
Speaker
There was a predictable attack on populism and capitalism. And then at the end, he closes with: not really sure why tech conferences keep on inviting people who just plain hate tech. You wouldn't get a keynote at a pediatric conference who hates babies.
00:08:14
Speaker
It was quite the post, and I would like to explore it. I sort of aspire to get to that point myself. I've told my wife, the older I get, the more I want to lose my filters and have this ability to say things a little more directly. But for now, I'll satisfy myself by reading it on LinkedIn. But, Neil, you said you were enjoying the back and forth. So tell us a little bit about the back and forth that was happening. Well, I have to confess, I don't remember the exact nature of it, but it was just one of those things where, to your point, Phil,
00:08:50
Speaker
you go online and you post something like that about a keynote, and then it's all tagged, and then the keynote speaker responds. To your point about being more bold around that kind of thing, that would terrify me: if I'd gone up and posted something, and then all of a sudden the person I'm criticizing is right there in my comments. So, yeah, I don't remember the exact back and forth, but I think it was, yeah,
00:09:15
Speaker
Well, I'll jump in a little bit. Clearly, Leonhard took issue; it really affected him. You could see the emotion in the responses. He was saying, I'm not a doomster. I'm not a pessimist. You clearly haven't read my work. You haven't looked me up. You haven't looked at my talks. I am not just a pessimist. And here are some slides; go look at them and tell me if I'm a doomster. That was the back and forth. And it wasn't just one comment: Donald replied to him, he replied back, so there was actually some real exchange.
00:09:55
Speaker
So Morgan, you were at the conference. The whole point isn't whether Donald is right or not, but I would like to use this as a jumping-off point. Let's start.

Ineffective Conference Keynotes

00:10:08
Speaker
You were there at the conference. What was your sense in the moment, watching the keynote? What was your reading? I didn't perceive it to be quite so negative. It didn't make a huge impression on me, to be honest.
00:10:26
Speaker
And I've got a broader issue with keynotes, which I've explored in my more substantive post that has already been sent to you for some edits. So I think there's a broader problem. It seemed okay, but his questioning of people who don't like technology speaking at conferences resonated with me, because I think they do get celebrity status sometimes. I don't see that with Gerd Leonhard, but I don't live in Europe.
00:10:54
Speaker
But I think we see it in other cases here in the US. So it didn't strike me as that negative, but in general, I found the keynotes to be a little ho-hum, to be honest.
00:11:08
Speaker
Well, tell me more. Even though you haven't published yet, and I haven't read the draft, give us some of your arguments; we can duplicate content here on the podcast. So, my problem: I really don't like keynotes, and neither, apparently, does Donald Clark. And I should have established the ground rules before you read out the excerpt. You had to do it in a Scottish accent.
00:11:32
Speaker
Wow, I'm glad you didn't tell me that. Because last year he wrote a similarly unfiltered critique of one of the keynotes, and they generally have three. But my problem with keynotes in general, and it's not just OEB, is that they aren't keynotes.
00:11:50
Speaker
Usually they're just some work the person has done recently, or "this is what's going on at my university," or something like that. Whereas a keynote is meant to set the keynote, setting the tone for the conference, establishing the agenda for what it's going to be. But too often it's just a random bit of work that you're meant to be impressed by. And the same was true this year. I do like the OEB practice of having three speakers at a keynote, so if one doesn't work out, at least there's another one you might find slightly better. So I like that. But in general, I think keynotes are not done well, and I don't know why people do them. Well, and one of the problems I have: ASU+GSV is one of our most important conferences to go to, and we're trying to convince Neil that's the place we're going to meet up.
00:12:44
Speaker
If he will, then we'll drive him to Texas to get some barbecue. But in any case, they go with the true celebrity standpoint: oh, you're going to come to the conference because we've paid huge money for a true celebrity. And usually there's some very tangential connection, like "I gave some money to a school," or "I have a foundation that donates to charity, and therefore you should view me"
00:13:12
Speaker
as relevant to education. It's almost like you really have to try to figure out if it's even connected to education. So at least here there was a talk on AI and education, so there was more of a connection. Yeah, there was a talk about what's happening at ASU and a talk about Monterrey Tech. So there was a close relationship with education, but it wasn't a keynote. None of them was a keynote. Yeah.
00:13:42
Speaker
I think I can identify with some of Donald's frustrations, because I'm thinking about this from a UK context, and maybe that influenced why he expressed himself more critically than you thought the keynote deserved. But I think we definitely have this in the UK, where it's not just keynotes; it's the general vibe in digital education and learning technology, at conferences and in people's activities.
00:14:23
Speaker
It's a lot more in vogue to be critical, big C, about EdTech and now AI than it is to find people who are willing, not even to present a polarizing, uber-optimistic slant, but even just to speak positively about the work they're doing in EdTech or AI and the difference it may or may not be making. It's really difficult to find those kinds of things. A lot of the conferences I see in the UK are really stuffed full of people who want to talk about the potential problems and a whole range of more political and social dimensions of these things. And no one's decrying
00:15:18
Speaker
the existence of those kinds of discussions. But for me, there's a real imbalance between that and who's platformed, particularly in keynotes at conferences, versus those people who are willing to stand up and say something more positive, or even just share something they're working on. I think the nature of that community makes those people more reluctant to stand up and present a more optimistic slant. And those opportunities to speak just don't exist, because there is a slight dominance of those who are willing to adopt a critical stance. It's almost a kind of religious thing, I think, sometimes, the way it manifests itself.
00:16:07
Speaker
And it is difficult, because when you're critiquing something that's critical, that's a difficult position to be in as well. There are problems with ed tech, and there does need to be the other side of things, and investigation. But I think, yeah,

AI Discourse: A Call for Balance

00:16:29
Speaker
I think it is a problem, and I think it's something that's become unbalanced. It's really difficult now to find even that middle ground, where you don't have the hypercritical. You can find the ultra-optimists too, and both are just massive turn-offs for me. I want to find someone in the middle, if I can put it that way, and it's really hard to find those
00:16:54
Speaker
voices, and for those voices to have an opportunity to speak at some of these events, really. Phil and I went to a virtual thing yesterday, or the day before, and it was, as Phil described it, a bit kumbaya, but it was a group of people who belonged to something called the problem solvers. I want the problem solvers of EdTech, not either extreme.
00:17:20
Speaker
Well, I will say that part of the issue with what we're describing is the difficulty of selling nuance. The more you're typecast, whether you're Sal Khan on one side or a massive critic on the other, the easier you are to understand, and then people want to invite you to speak, whereas nuance is pretty hard to sell. And I think the way keynotes are done is part of the issue. But to make this uncomfortable, let me take this a little further. You're talking about the UK and Europe;
00:17:55
Speaker
this isn't just a conference issue. I would argue that this is an issue in the whole society and its reaction to AI in Europe. It's like the European Union in particular, and I know you guys have left that. But there seems to be a whole thing, like the whole goal
00:18:14
Speaker
is to talk about how bad AI is and to come out with regulations before we've even seen what AI can and can't do. And so the whole regulatory structure that everybody's patting themselves on the back for in the EU is going to prevent most of Europe from even playing a role in the future of AI: what gets done, what's possible, what should be done. So this seems to me to go, at least in Europe, well beyond the conference arena. It's more of a political point of view, and I've seen a lot of criticism of that as well, but there seems to be a lot of back-patting in the EU about "we're reining in AI."
00:18:55
Speaker
Yeah, and I think that probably speaks to the way different groups organize themselves around that side of things, and obviously that's an example of a big political bloc. If I was going to characterize the EU in general terms, it's probably more on the critical end when it comes to technology, as you've alluded to, Phil. So if you think about the character of a government or a body, that kind of move does not surprise me. I can't necessarily put into words why that's the case, but just on the back of a feeling of their general disposition towards technology and new developments, and maybe other legislation that I can't bring to mind right now, it doesn't surprise me that that political bloc has acted in that way. Whether that's characteristic of Europe as a whole, I think, is debatable,
00:19:54
Speaker
because then you get into the extent to which that body is representative of Europe as a whole. And maybe that's partly to do with some of the problems we've had in our country around people's perceptions of it.
00:20:07
Speaker
But yeah, I think people are organizing around particular groups. And to some of the earlier points I made, I think because there's still a lot of uncertainty and people aren't sure about AI, to a certain extent it's human nature to want to identify yourself with a group.
00:20:31
Speaker
And just bringing it to UK higher education, the most prevalent, or most prominently visible, group is likely to be one that's more on the critical side of these things. So in a kind of maelstrom of AI, where people are still not really sure what it means and what stance they should take, that's the easiest route to identify yourself with in the UK, I think.
00:20:58
Speaker
Yeah, and part of what I think we're getting to: you have these cultural issues we're dealing with. It's not just ed tech, it's not just conferences, although we see it at conferences, but it's so easy to be part of the fully critical side. I love your phrase from earlier, "slight dominance." I'm going to use that at some point. But it's not just that it's polarized. I think what we're saying, or certainly what I'm saying, is:
00:21:26
Speaker
it's polarized between "AI can do anything" and "AI is awful," but there's a strong preference for the critical view right now. It's a lot easier and safer socially to try to shoot down AI. I forget who coined the phrase about it being avant-garde to be critical of AI, except we're no longer avant anymore, because it's actually the safer view to have. So it's polarized, and it's too critical.
00:22:03
Speaker
My argument is that's a problem. We're at the period where we need to figure out what AI can do, what use cases can help students, what the real problems are that we need to watch out for, and therefore how to find examples and see the technology develop in a way that can provide some benefit, because we're not going to get rid of AI. We need to figure out how to balance these competing forces. And I guess the point is that, certainly at conferences, we're not getting that. We're getting one extreme or the other.
00:22:43
Speaker
Yeah, I think it depends a little bit on the conference. Last week we talked a little about my experience at the Canvas and D2L conferences, and I know it's a slightly different thing, but I think that's part of the problem: if you only ever platform these things, and the people willing to talk about them, at ed tech vendor conferences, then it amplifies that divide, really. I was looking at some things happening in the UK; there are different trials going on right now. And in the context of this conversation, I think about some of the ed tech conferences here, and I think it's very unlikely that someone involved in that kind of project would present
00:23:34
Speaker
in a significant way at some of those conferences. If you had a choice between that and someone who is more of a critical voice, I think the critical voice is going to win out. So there are things happening, but it's a little bit about opportunity and what's in vogue at the moment.
00:23:54
Speaker
But I do think it's really, really problematic, and I don't know what the solution is. I can't remember which US politician used the phrase, but I sometimes wonder if there's a silent majority of people in higher education who really want to know what is actually happening around AI. There have been plenty of surveys saying lecturers want more training and information and don't feel supported. So I sometimes wonder if there is this latent appetite for something more practical and more positive about using AI to help in some way, and those presentations don't have to be rosy-eyed, and I don't think they often would be. I'd love to know, though I'll never find out, whether that's true or not. And maybe that was OEB's thinking, because there was Gerd Leonhard, and then there was Kyle Bowen, who is very much a pro-AI person. So maybe they were thinking that if you add two versions of the extremes, you get a middle ground.
00:25:07
Speaker
Well, then why not do a debate? This isn't a conference, but I recently posted on X about Ethan Mollick, who is really pushing the idea that personification of AI is going to change everything, that there are some great possibilities. He obviously has some nuance, but he's very much the optimist. But then you have Dan Meyer from, what's the name of Dan's company? Mathworlds. Yeah, Mathworlds,
00:25:36
Speaker
and it seems like most of what he writes is about the problems with AI. And he always quotes math faculty as if they're saints: oh, anything that math faculty don't like, well, take them as pure in heart, and AI is upsetting them, therefore it's bad. So I was pointing out a similar concept: why do we have to have the extremes? They both responded in my thread, but what I tried to goad them into did not succeed: let's have a debate. I'll host a debate; I'll moderate you two. So, going back to what you mentioned, Morgan, at the conference: not just two separate keynotes. Why not bring in Kyle and Gerd and have a true moderated debate between them? Would that be useful? Yeah, it would. And actually, OEB has a tradition of debate, so they always have a debate. This year, it was actually on data.
00:26:37
Speaker
Was it a true debate? Yeah. And it got a tad personal this year. As personal as Donald's LinkedIn post? Not quite that personal, but it got fairly personal. It was Martin Bean and Ellen Wagner on the one side and Jane Bozarth and the head of the Ada Lovelace Foundation on the other side. But it got a bit personal, and in some of the questions from the audience,
00:27:04
Speaker
some of the extremes would come out. But they have a tradition of that, so maybe next year they can do that. But still, going back to the Dan Meyer thing: I see it with him, but I see it mostly with writing faculty especially. There's a strong brand of "AI is the worst thing ever." One of the recent op-eds in Inside Higher Ed was "burn it down." There's a strong tradition of nihilism in people's response to it. And I remember that from the data analytics days, when that was the big issue. I remember having a
00:27:49
Speaker
knock-down, drag-out fight with a senior administrator at a major university here, who I deeply respect and who I know is wicked smart. But they were arguing that it would be better for students to fail and drop out of university without a degree than for us to collect data on them.
00:28:11
Speaker
Wrong position. I do think there's another factor as well, and I think this is certainly true in the UK, that undoubtedly influences lots of responses, not just on AI: the completely perilous state of higher education here. There are so many job losses and voluntary redundancies, and all of the stress of that. There is that sense where there are grievances, and people are looking for somewhere to put them, and some of the usual suspects come up: why are we paying all this money to a big tech company or a consultant? AI is going to rob us of things further. And so I think that's a very important context for all of this. I'm not saying that's the sole driver, but
00:29:02
Speaker
I don't think you can analyze what's going on without thinking about that wider context. You can even trace this further back to the pandemic and the challenges there. And it's interesting, the Inside Higher Ed piece that you

Media's Role in Tech Polarization

00:29:21
Speaker
forwarded round, Phil, and the response to all this, because that was an interesting dimension: big tech is suddenly in control of higher education. I've seen a number of articles and even research papers with that kind of narrative, where tech is overtaking higher education. And I've seen research where the claims have been grossly exaggerated, which is probably a kind way of putting it. Actually, some of the research has just been factually incorrect, which is staggering, really, given there's a peer review process. But
00:30:01
Speaker
there is that sense that tech is going to take over, and I think that is intertwined with some of the recent issues in higher education as well. But I read an article like that and I think, why can't we have a grown-up conversation about outsourcing? We all rely on third parties of some description in our lives and in our businesses and in our work. And I think Donald Clark may have even mentioned this in the past: we're critiquing the outsourcing of this thing, but we're not critiquing the outsourcing of catering in the university. And I know, obviously, there's a difference there. Of course there is. But I just think, why can't we have a grown-up conversation around this?
00:30:49
Speaker
So, going to the article in Inside Higher Ed: the article is "Big AI Companies Need Higher Ed. But Does Higher Ed Need Them?" Well, first of all, the whole premise, that universities can simply choose to avoid big tech and AI, is delusional. But if you read through it, it's talking about how important universities are for AI, and it argues, let's not ignore them, but let's go a different way. So the author ends up almost advocating: why don't we do university initiatives that are AI-based? And he mentions one, and sorry for the pronunciation: Te Hiku Media; somebody can correct me. It's in New Zealand.
00:31:36
Speaker
And he's talking about that approach, of people seeing themselves as guardians rather than owners of the language tools, so a more collaborative approach. I'm all for a lot of stuff like that. It's a Māori media organization.
00:31:53
Speaker
But part of why we can't have adult conversations, in my mind, is if you read deeper into this. He criticizes one group called Cogniti. It's a startup AI company, and they do have a position that seems naive: that AI can become a virtual instructor, that you can clone an instructor and it can do this. And I think those claims are going too far, but it's just one startup company.
00:32:21
Speaker
But he criticizes them as the reason we don't need big tech and AI. Cogniti is a small company that was started by educators, and they're the ones with the most outlandish claims that he's going against. So he's trying to critique these companies, but they're sort of doing what he's espousing: why can't educators take ownership of it? And then the second thing is, if you go look at the author, whose name is Colin Bjork,
00:32:52
Speaker
He's made sort of a mini-career of going around writing op-eds and giving talks about how awful big tech and AI are. So he's trying to get, in my mind, celebrity status as the person who's really against AI. And that's the type of person, circling back, that too often gets invited to do keynotes at conferences, because it's easier to do that. So part of my argument is: why can't we have adult conversations? Some people are making careers out of not being adults, to be quite honest, and it's too easy. And the second thing is, if you actually trace through what they're saying, sometimes there's not a coherent argument there.
00:33:39
Speaker
Yeah, and I think that's one of the interesting things, isn't it? People often want to present this idea of outside, negative bad actors, manifested in the private sector or the ed tech side of things.
00:33:56
Speaker
But when the people developing the technology are themselves coming out of universities, I think it shows it up a little bit for what it is: a very close-minded orthodoxy
00:34:11
Speaker
as a way of thinking. And I mean, it's ironic that a lot of this movement is under the banner of "critical," because I think they're critical about everything other than themselves, unable to evaluate their own position and be reflexive about it. Ultimately, that's what we should all be doing, because we all have biases of some shape or form. But that's why I mentioned it was almost religious, because there is that kind of doctrine, and a dogma around the doctrine, that makes a mockery of the sense that you're critical. And even people within your institution, you know, if they're not
00:34:56
Speaker
of the same mind, then they're lumped into this kind of evil bracket, really. I think it's all so polarizing and so unproductive. But to go back to the keynote side of things, and even this piece,
00:35:11
Speaker
You know, it's those polarizing positions that often get picked up by media, and they're areas of interest for keynotes, because there's a bit more of an edge to them than something that's more in the middle ground.
00:35:29
Speaker
But you mentioned religion; that also gets into followers too, and that's a big issue. These people get followers, and that makes it worse. I think also a big part of higher ed came of age in a period that was really an outlier, when there was a lot of money for higher ed and it was growing, and they're having a lot of difficulty adjusting to a different reality.

Higher Education's AI Challenges

00:35:59
Speaker
I think a lot of that nihilism comes from that perspective. It's like, we don't need AI, we should just hire more academics. Yeah, I think that's a really interesting dimension of it. I think some of this is the legacy of universities still not having adjusted to the digital revolution, if I can put it in those terms, and what that means for the academy. I also sometimes wonder, in terms of voices, and you mentioned followers, if there's a little bit of
00:36:36
Speaker
those that were optimists at the dawn of the internet, and now there's a little bit of bitterness, I think, that influences people's positions, because maybe that optimism didn't play out so well. I don't think that's a big thing, but it's a small thing that I notice a little bit in higher education as well. I think there's a whole melting pot of influences, but I agree with you guys that there seems to be a bit of an imbalance on these things.
00:37:17
Speaker
So as we wrap up, I guess, yeah, we're not going to change the culture. I don't think we have that kind of power within

Advocating for Conference Debates

00:37:24
Speaker
this podcast. So we can't just bellyache about it. If I take anything out of this conversation, given the realities of what's happening in Europe and universities and the polarizing nature of all this, one of the biggest opportunities, I would say, is: why don't we do more high-profile debates?
00:37:47
Speaker
So get people on two different sides at a large conference and have a good moderator that can really explore the issues. To their credit, last year Noodle had the post-OPM conference, and I got the chance to moderate between Bob Shireman, who's very much a huge skeptic of OPMs and has been behind the movements to put them out of business, and the other Bob, who's a lobbyist for conservative causes; I'm blanking on his last name, I apologize for that. But the other Bob very much is helping OPMs.
00:38:33
Speaker
And I moderated so that I could ask both of them questions and explore the ideas. That was a lot of fun. That's what I think could be done: have more real, engaging debates at high-profile conferences to explore the issues, because we're not going to get away from the tendency toward celebrity for big skeptics, but let's at least get the debates out there. The key is good moderation. OEB had a debate this year, but it was not well-moderated, and that's my unfiltered opinion; the guy could not even pronounce the names of the participants right. Oh well, what does it say about me that I can't remember the other Bob's last name from the conference? Well, I think you're just itching for some debate moderation gigs, Phil, to be honest. Oh, I did have a lot of fun doing that last year. I did enjoy it. He's a very good moderator, as we said. Yeah, I think you do a great job.
00:39:29
Speaker
So I do that. Oh, I am. And this one, I'm hoping for ASU GSV. Stephanie Hall used to be at the Century Foundation; now she's at the Center for American Progress. She's one of the main people, second to Bob Shireman, behind the regulations against online education and OPMs and stuff like that.
00:39:50
Speaker
And I had critiqued her online, and we had a back and forth on X, and I said, why don't we turn this into a debate? And they actually said they're going to put us at ASU GSV and have somebody else moderate myself and
00:40:07
Speaker
Stephanie. And I hope that happens. The person they're talking to about moderating is Goldie Blumenstyk, who just recently finished up at the Chronicle of Higher Education. So I'm hoping that happens. I guess that's what I'm advocating: let's have some more real debates on the tough subjects at conferences.
00:40:26
Speaker
So that's our call to action for people. But it's great seeing you all. Again, it's a shame that you two still didn't get to meet in person, but we'll work that out in 2025. I think it was something about the prospect of a hug. I don't know, that probably did it.
00:40:46
Speaker
Did you really take a cab, or was that an excuse to avoid the hug? I invented the whole birdie thing, yeah. I made it all up. We'll make it happen in 2025. Well, it's great seeing you all, and thanks for the discussion, and thanks to all of our listeners for being there. Thank you.