/design culture: too many cooks

The Forward Slash Podcast

How do you build user-centered digital products in a world obsessed with trends and AI hype?
In this episode, UX researcher and digital psychologist Dr. Nick Fine joins James to share expert insights on human-centered design, behavioral science, and the real meaning of user experience. Learn how to apply UX research methods that drive product success, avoid common pitfalls in agile teams, and use AI tools without sacrificing product integrity.

Transcript

Introduction to User-Centric Design

00:00:02
Speaker
How can you build anything without knowing, without having eyes and ears? So you shouldn't just open your eyes and ears once a week, or once a month, or every six months. You need to look where you're going all the time. It's constant correction. And that's the power of user centricity and of research into design done in an agile way.
00:00:20
Speaker
That's ultimately what we're talking about.
00:00:31
Speaker
Today we have with us...

Guest Introduction: Dr. Nick Fine

00:00:40
Speaker
Today we have with us... all three of those things in, like, one person, right? All three of those things in one person. Dr. Nick Fine joins us today. Nick is a digital psychologist and user experience researcher with over 15 years of hands-on experience shaping critical digital products across government, enterprise, and agency environments.
00:01:01
Speaker
Nick holds a PhD in human-computer interaction, but don't let the doctor fool you. He's the guy with his sleeves rolled up doing the work. From regulated industries and clinical systems to multi-million-pound...
00:01:14
Speaker
That's right. He's from across the pond. Joining us from... where are you at in England? I'm just north of London. Just north of London. Yeah. London suburbs. ...multi-million-pound government projects, and even the ethically gray areas of digital design, Nick has seen and done it all firsthand.
00:01:31
Speaker
He's known for blending behavioral science with evidence-led design, championing scientific standards in UX, and reminding the field that research isn't just about asking questions.
00:01:42
Speaker
It's about asking the right ones and interpreting them with rigor. When he's not deep in user research, Nick is sharing insights on stage, on podcasts, and with teams looking to make their products more human-centered and science-driven.
00:01:55
Speaker
Please welcome to the show, Dr. Nick Fine. Thank you very much for having me, James. Very, very pleased to be here. Yeah. So I've got to hear about the stage part.
00:02:05
Speaker
What do you do on stage? Well, no, it's just that I get invited to conferences every year. And it's just another channel to be able to share and educate and entertain, you know, at the same time, to try to move the needle forward, to try to improve our working lives: more job security, build better things, build things the right way.
00:02:27
Speaker
Gotcha. You're not belting out a song like Sweeney Todd or anything like that. Okay. You really, really don't want me to do that.
00:02:35
Speaker
That's fair. I have some talents, but singing ain't one of them. Yeah. I love that you kind of pointed out the "don't let the doctor fool you" in your bio. There is that perception that whenever anybody gets that doctor in front of their name: oh, they don't actually work. But you're the guy who actually does, who gets work done. Right.
00:02:59
Speaker
Absolutely. How people display their PhD has become a bit of a bone of contention. I've kept it that way, been Dr. Nick Fine rather than Nick Fine, PhD, as a signpost, for about 10 years on LinkedIn.
00:03:13
Speaker
And because I am a practitioner, that's why I have to say it, because there are lots of people with PhDs who aren't doing practitioner-level work. They're doing lots of leadership and management stuff. So I just need it to be clear that, you know, I'm not stuck away in some white ivory tower, or on a leadership committee, or whatever that thing looks like. I'm sleeves rolled up, on the ground, doing the work.
00:03:35
Speaker
Fantastic. So yeah, you're not that ivory-tower architect like we have in the software development industry, the ones so disconnected from the day-to-day stuff that their opinion gets less and less valuable over time. So that's good.
00:03:50
Speaker
That's good. I like that you're staying close to the work. That's fantastic. We were talking a little bit about agile and product management and those sorts of things. Those are really interesting topics to me, and very close and near and dear to my heart.
00:04:07
Speaker
But how has your experience been? You know, we've gone through this transition through the years: we did project management back in the day, and now we're doing product management. And everybody's very, very keen that we've got to make that distinction.
00:04:23
Speaker
How has your experience been as we've made that transition?

Critique of Project to Product Management Transition

00:04:28
Speaker
I think it's been an absolute dumpster fire. I think it's been an absolute... I don't want to swear, but it's been a very poor experiment, I think, a very poor public experiment, like a form of clinical trial where we thought: you know what, project management, okay, we've been doing that since forever.
00:04:45
Speaker
We've got PRINCE2 people, and we've got all these methodologies and ways of doing things. Let's just bin it all. Let's replace it with somebody young and inexperienced in a brand-new practice that has no heritage, no provenance, no rule set, no anything.
00:04:59
Speaker
And let's put them in charge of product teams. And as a practitioner, over the past five to eight years, I've worked for lots of different kinds of product managers, right?
00:05:12
Speaker
Some incredible product managers, like visionary entrepreneurs who roll their sleeves up and are truly kind of unicorn-y. But I've also worked for the majority, which tend to be... when I say entitled, I don't mean it like that, but they have this kind of need to be senior, right? And this need to be in charge.
00:05:32
Speaker
And it's a problem, because they don't have the practitioner experience. They don't have the product management experience. They haven't got the strategies. They're guessing.
00:05:44
Speaker
And it's really obvious. And if you're older and wiser and more experienced, it's very hard to work for those sorts of folks, because they ask you to do things which are ridiculous or inappropriate, like, what? So you're trying to educate upwards or sideways, and it's really hard, because egos get bruised. You know what I mean? The younger generations don't like older folks telling them that they're not doing the job right, despite the fact that they're not doing the job right. If everybody were doing brilliantly, making money, and building great, profitable products, then I'm just a grumpy old man.
00:06:22
Speaker
But that's not the case. That's not the case. The case is: we're not building the right things. Teams are being fired or made redundant in huge volumes, and we're genuinely, and generally, not delivering the value to the business that historically UX and product teams have delivered.
00:06:42
Speaker
And I think that's a fairly safe thing to say, too. Yeah, I think that's definitely fair. Now, this is an interesting concept. Do you attribute it more to just the way the world is today? Or do you think there's something there about that project management mindset, and the formality of that from the past, that we've lost a little bit of in this new product world? Is it a generational issue or a methodology issue, do you think? What do you attribute it more to? It's a multifactorial problem.
00:07:20
Speaker
So you've got the fact that there are generational learning differences, okay? So there isn't the knowledge transfer. People don't become apprentices or, you know, mid-weight. No one is mid-weight; no one has been mid-weight in recent history.
00:07:33
Speaker
Everyone just goes straight from junior, or intern, or unemployed, to senior within two years. So there's none of that learning and passing down of the principles and the strategies and the experience.
00:07:45
Speaker
So that's one. Number two is: management with a capital M died many moons ago. Many moons ago. When I got into management, I was made to go on courses, Dale Carnegie and these sorts of courses, to learn how to do technical leadership, sales leadership, all of these various modules that I did.
00:08:07
Speaker
Because if you're going to manage people and money, you should probably have a clue about what you're doing. Probably a good idea. Probably be a good idea. But these days, again, everybody, the youngsters, see themselves as the finished article immediately.
00:08:22
Speaker
There's no growth mindset. There's no learning from

Misconceptions of UX as Visual Design

00:08:25
Speaker
experience. And because management died, that means all of this bad stuff that's happening is allowed to happen.
00:08:34
Speaker
Right. Historically, if we started doing stupid stuff, somebody in the management hierarchy would stop it. Right. But now, it just... You canvass opinion, you politic people.
00:08:45
Speaker
And as long as more people agree with you, you can do the wrong thing.
00:08:50
Speaker
With backing, yes. Yeah, exactly. And it also means, and this is kind of core, there's a lot of finger-pointing and blaming that distracts from the lack of experience and the lack of productivity, the lack of outcomes.
00:09:04
Speaker
So by being in a committee, it safeguards the individual, because you're in the herd, you're in a swarm. And this is what we see a lot of today: people swarming in committees, and committees are killing our industry.
00:09:18
Speaker
And we talked a little bit in the last episode we recorded, I don't know if it's necessarily coming out in that order, but we coined the term "the Kardashian effect." So it seems that, you know, kind of...
00:09:32
Speaker
Our culture is very much: all we see is this fakeness through social media and everything, where everybody seems to be an expert in these things. As you said, they didn't roll up their sleeves and do the hard work to learn the thing, but they can put forward a front that makes it appear as if,
00:09:52
Speaker
oh, they are this expert and authority on whatever topic it may be, simply by controlling the message that goes out, because that's all you see. You don't see that they're not actually delivering things, or whatever the case may be.
00:10:07
Speaker
I think that's probably a big part of it as well. You know, we live in this post-truth world, and that's not anything radical or wild; that's actual fact. It's about controlling the narrative.
00:10:18
Speaker
Let's look at UX/UI. Those guys were all art and creative directors. They had nothing to do with UX and had no user experience, no experience whatsoever in that field. They were all art and creative directors.
00:10:34
Speaker
Then they saw the UX thing going on, were really jealous that these guys are doing design, and we should be doing design, and came steaming in. But they didn't listen. And the whole point is that they controlled the narrative in numbers, because there were more of them than there were of us.
00:10:49
Speaker
And so UX/UI was allowed to breathe and live. And we created this visual interaction design thing that was no different from the design that was already there.
00:11:01
Speaker
It was just a Trojan horse with the word UX in it, to get in. But UX/UI has a life of its own now. It should have died years ago. There are people who still wear that badge because the narrative, like flat earth, is still floating around.
00:11:15
Speaker
It's wrong. It's just flat-out wrong. But we can't stop them, and they won't listen. Well, maybe... So when you hear UX, because I'm guilty of this as well, because I've been ignorant, right? A lot of people assume, probably because of this phenomenon you're describing, that it's just making pretty things on a screen. Like, oh, make the button pretty, or whatever, right? That's a lot of the...
00:11:44
Speaker
the perception. So help me understand, or even help the listeners understand: when we talk about user experience, what does that

Defining User Experience

00:11:52
Speaker
really mean? It's not about the visuals. What is it? What is it, really?
00:11:56
Speaker
What is it really? And this is one of the hardest problems, and why we're in this state: UX was guilty of not having a clear definition. And it's very hard to define. So I've defined it. There are two levels to my definition, a high top level and a down-low level. I'm going to try to get this together for you. Basically, user experience is... sorry, it's the creation of optimal experience.
00:12:22
Speaker
Right? But that's the high-level one. The low-level one: it's the creation of optimal experience through the conversion of user needs into design solutions. Right? So UX, in a nutshell, is research into design.
00:12:35
Speaker
Okay. It's research into design. It's designing the user needs into whatever you're designing. That's user experience, about as close to a definition as you can find.
00:12:48
Speaker
Now, what makes UX, or user-centric UX, so powerful is that
00:12:58
Speaker
that research into design was unique, or used to be unique, in the entire design world, in the entire IT world, the entire digital world, right? There are lots of different forms of design.
00:13:10
Speaker
But UX as research into design was more than just design, a lot more, because it's behavioral design. Right. And so when you say it's about pretty things and it's about Figma and all this garbage, you're literally talking about the wrong practice.
00:13:27
Speaker
You're talking about visual design. You're not talking about my, or our, practice of user experience, of user-centric user experience. And that's where we're at today. And that's what we've been trying so hard to do for the past nearly 10 years now: just get back to user-centric UX.
00:13:42
Speaker
Stop with the visuals. Understand that it's research into design. Now, this is the important part. Historically, that was all in me, one person, in the UX architect or UX consultant role.
00:13:55
Speaker
But that's 10, 15 years ago. The market didn't support it, with skill shortages, unicorns, all this stuff. So UX/UI was one way of solving that problem, unfortunately.
00:14:09
Speaker
But it split. The role split. And so you've got user researchers and UX designers, and we work together. That's the UX team. Sometimes the researcher is the product manager, in a really small company, you know?
00:14:22
Speaker
Which is less than ideal, I should say, for the record. But that's really what's going on. Now, we've lost that research into design. We've lost, and I'll take it back to the original question, the cadence; we've lost the delivery methodology.
00:14:40
Speaker
There's no way to get this insight into the way of working, into the process. So you get naive product managers going, hmm, we should do some research.
00:14:51
Speaker
They either do some bad research themselves, or they say to the researcher, or bring in a researcher: do a thing, right? Which is event-based research, rather than just business-as-usual research, where we do research all the time.
00:15:08
Speaker
So you mean it's more of a continuum, a continuous activity throughout the product? Okay, I got you. Yeah, yeah. Like, when I say a rhythm: how can you build anything without knowing, without having eyes and ears?
00:15:19
Speaker
So you shouldn't just open your eyes and ears once a week, or once a month, or every six months. You need to look where you're going all the time. It's constant correction. And that's the power of user centricity and of research into design done in an agile way.
00:15:35
Speaker
That's ultimately what we're talking about. So this is an interesting thing. For me, my brain kind of goes: our world is highly, highly digital these days, right? Most of the things we interact with are apps on our phones, and websites, and all those things.
00:15:51
Speaker
Is this concept of user experience a new thing? Or did we do this sort of activity with physical products back in the day, before we went all digital?
00:16:03
Speaker
Ultimately, this is about iteration. And we've probably always iterated, but historically, all forms of development, be it digital or food or product or whatever, tended to be very waterfall-y, right? You tended to have a big upfront discovery-type thing where you

Transition from Waterfall to Agile Methodologies

00:16:19
Speaker
gathered all your requirements, and that took a year or two. You got those into a ridiculous document called a business requirements document, and then you got a functional specifications document, and there were toll gates, and it was... the reason I haven't left is because we pulled it all out.
00:16:33
Speaker
It wasn't a lot of fun, because in those two or three years of waterfall activity, the world is changing, right? And so waterfall doesn't support change management, and you have change requests.
00:16:45
Speaker
And that's when fights break out between client and agency. It's just: that was within scope, that's not within scope, but how can you build it without that? You know, it's just horrible, right?
00:16:58
Speaker
So agility, being agile, working in iterations.
00:17:04
Speaker
When I started in UX, agile wasn't truly born. It wasn't born in my world; it wasn't ubiquitous, right? So I was doing waterfall UX, but we still iterated like mad. You just didn't call it an agile iteration or a sprint-based iteration. It was: I do a round of user testing, I learn stuff, I fix my prototype, I retest it.
00:17:28
Speaker
Right, that's as formal as it gets, and you do that as often as you need to, until it tests well, you know. So, to answer your question, is that how we used to do things? I'm not sure, but I'm pretty sure that anybody building anything... like, you're prototyping a car, right? A lot of it has to happen inside, because it's secret, but then you dress it up to look like your competitor's car and take it for a drive. You're testing it, testing it in certain conditions. What I'm saying is, there comes a point where you have to, right? And you have to then do the iterative thing. You can't just build in a garage and then release to the world.
00:18:01
Speaker
And that's what we see today in digital: people building top-down, not seeing a user, releasing, and wondering why things don't go well.
00:18:12
Speaker
Yeah, the startup world seems to kind of get this idea. It seems like a lot of us software engineers want to do everything right. We've got to have 100% test coverage and all these things. We want to do everything to the nines, right?
00:18:25
Speaker
But in that startup world, you have to get that knowledge from your users. You need to get your product to the users as quickly as possible and understand: how is this helping them? Is this making their life better?
00:18:39
Speaker
Are people going to be willing to pay for this thing? Right. So the startup world, I think, is a little closer to this mindset than your big enterprise, right? Big enterprises tend to maybe go the other direction.
00:18:52
Speaker
You know, it really does depend. And I want to agree with you wholeheartedly, because that would be easy, and no one would really argue. But I can't tell you how many founders come to me every year, referred through my networks, saying, Nick, help. This person just spent two million pounds and the agency have screwed them. You know what I mean? They've gone live and it's all gone horribly wrong. And it's like, well, has it seen a user?
00:19:18
Speaker
No. So you literally went into your cave, you built your genius visionary idea, and guess what? It doesn't match the real world. Well, then you shouldn't have spent two million pounds. I'm sorry. That's just irresponsible.
00:19:31
Speaker
Right? But there are people who do it really well, and there are people who do it really badly. So, yeah. The startup world... I want to say there's some great stuff going on in the startup world, like really genius, next-level stuff.
00:19:46
Speaker
But I'd say the majority are probably building what they think, having a nasty wake-up call, raising more money, and then doing things the right way. Gotcha. Okay.
00:19:57
Speaker
So it's not a uniquely enterprise issue, and it's not a uniquely startup issue; they don't have it all figured out either. So it's a pervasive kind of issue, you would say.
00:20:08
Speaker
Yeah, man. I've been at some supremely agile, user-centric, big, big enterprise brands.

Challenges in Adopting Agile in Enterprises

00:20:16
Speaker
Right. And I've been at the same scale where it's like being in a backwater, and you're like: how can you be this size, with this amount of resources, and fumble things so legendarily badly? Epic levels of bad wastage, stupidity, doubling back, repeating work that another team's doing on the other side of the building, all the usual shenanigans.
00:20:39
Speaker
So it sounds like, I mean, at first we were kind of talking about this project mindset versus the product mindset. It does sound like you would agree that the notion of agility, not capital-A Agile, but the notion of having agility, is important, especially with UX.
00:21:01
Speaker
But, you know, we figured that out, right? We've got all this stuff figured out, the agility. We've got frameworks out the wazoo for agility. We know how to be agile, right? We've got the Fibonacci planning-poker thing. We've got all these things. That's all figured out. We've got that.
00:21:20
Speaker
We've got that all figured out. Yeah, right. So yeah, I know you're being sarcastic, and just for the audience listening, James is being highly sarcastic. Because Fibonacci poker... I can't tell you, there have been about three or four gigs of mine in the past five, ten years where I'm sitting down with a delivery lead or a product manager playing Fibonacci poker.
00:21:42
Speaker
And I'm like, is a seven bigger than a nine? And is that my time estimation? Like, the whole thing was garbage. So I work with John Kern, who is one of the co-authors of the Agile Manifesto, a legend of the field.
00:21:56
Speaker
And I said to him, John, man, what's with the Fibonacci poker? Because it doesn't make any sense to me. And he said to me, Nick, that's not part of the original manifesto. That's the sort of thing where somebody else somewhere has tried to say, hey, we're making our own version of Agile.
00:22:11
Speaker
And they try to own it and commercialize it, to sell it to their clients. And they add in all this other stuff, and it becomes a bit of a snowball; it's collected a lot of junk over the years. And so I think there's a purer form of agile that we can return to, a simpler form of agile.
00:22:28
Speaker
So, for the moment, I'm calling it Fisher-Price agile. And the reason I'm saying that is because the kids need the agility. They need the agile. And therefore, it needs to be Fisher-Price for them. I don't mean to be rude, but if you've got a different learning style but you need to be agile, you're not going to go through all the coaching, you're not going to go through all the mentorship, that it takes to develop an agile mindset.
00:22:49
Speaker
Right. Because agility is ultimately a mindset. All the ceremonies and rituals and all the stuff that we do, that's secondary, in my opinion. Right. That's just how you execute the agility.
00:23:01
Speaker
The actual thinking of agility, the decision-making, the behavior of agility, is all in your head. Right. And it does take a transition. It's not easy, because some of it's like jumping out of an airplane for the first time, and you're like, this

Agility as a Mindset

00:23:15
Speaker
feels weird.
00:23:15
Speaker
You know, like walking out of meetings I shouldn't be in, or not doing heavy documentation. It feels really weird and wrong. Am I going to get in trouble for this stuff? But once you get past all of that, you start to go faster as a team, and you start to respect each other, and to understand: I'm starting to understand what I need to do, not what I want or think I need to do.
00:23:36
Speaker
And that's amazing. For a product manager, that's the productivity gap that we need to fill in. So people can... I can't say that word... mess around at home.
00:23:46
Speaker
I did well there. I did so well. People can mess around at home, with whatever working-from-home productivity, hybrid stuff is going on.
00:23:57
Speaker
If you want, we can replace some of that massive loss with agility, with getting back onto some kind of Fisher-Price agile, user-centric rails. That's the hill I'm going to die on. That's what I truly believe in.
00:24:11
Speaker
And I believe that's the panacea for our world: for the youngsters, for the oldsters, for kind of everybody. That's how we work nicely together and ensure our jobs, build the right thing, keep our higher-ups happy.
00:24:23
Speaker
Job done. You know what I mean? Move along. So do you think all of this, I don't know, I'm probably using the wrong word here, regimented approach to agility, with all the cadence and the process and all of this stuff... I always use references from 80s movies. Is this like a Karate Kid thing, where Mr. Miyagi is like, okay, paint the fence, right? The outcome, the goal, isn't that Daniel-san becomes a fence painter. He's supposed to learn how to do an up block and a down block, those sorts of things, right? So the...
00:25:00
Speaker
it's this building of habits and muscle memory. Was that the intent of all of this, that the agility will follow? Like, now I know karate just because I sanded the floor, right? Was that the intent, do you think?
00:25:16
Speaker
It's a good analogy, because that's how we learned: the wax-on, wax-off stuff. Like, I don't really understand what I'm doing, but I'm going to do it until it makes sense. Right. Yeah.
00:25:27
Speaker
There's no tolerance or patience for that kind of learning anymore, right? They want it black and white: too long, didn't read, give me bullet points, I'm going to AI-summarize it, all of that stuff. And you can't get into the detail. See, I'm often on Debbie Levitt's podcast, once a month, and we do these Q&As. And it's remarkable. She's now got a graphic that says: it depends.
00:25:55
Speaker
Because we say it so many times. Like, there isn't a rule book. And one of my mantras, one of the things that you would have heard me say at conferences and on podcasts, anywhere I'm allowed to talk, is: there is no playbook.
00:26:09
Speaker
There's no rule book, per se, of UX, right? You learn it over time by being a practitioner. It's like saying, write me a rule book on how to be a surgeon, right? Those are learned skills.
00:26:26
Speaker
You can't necessarily put them down in a book, and you don't know when to pick up what tool to do what procedure. Same thing with UX, right? You need to learn these things through experience, ideally mentored experience, to know: when is it appropriate to do an interview?
00:26:43
Speaker
When is it not appropriate to do an interview? When do you do the things? How do I create the strategy and use the tooling to get the outcome that my client or my stakeholder needs?
00:26:53
Speaker
That's why they still call it practicing medicine. Even when you're qualified, right, you're still learning along the way. You're learning. It didn't all just happen in school.
00:27:05
Speaker
Okay. So you think there's an experiential element to it that helps sharpen those skills and build that muscle memory over time, and that just needs to be there.
00:27:16
Speaker
But bear in mind, right? You and I are older, right? Well, speak for yourself. Okay, of course. Yeah, it's fine. It's all good. But what I'm saying is, we are older.
00:27:29
Speaker
We probably sound like our grandparents to the kids, right? When we're talking about Victorian engineering and how the quality of materials used to be, you know: it's all plastic and high tech now instead of being, you know, wood or whatever.
00:27:45
Speaker
And I think... because young millennials and Gen Z and the alphas were all born into this instant world of just fast, instant, there.
00:28:00
Speaker
Everything that we're talking about, there's a clear line in the sand between generations. You're either kind of Victorian, old-school thinking, wanting to do things properly and learn and have a growth mindset, or you're young and you need it all instant, and you guys are all crazy for doing things the slow, dumb old way, because we're young and brilliant and we've got the new tooling, right? Yeah, exactly.
00:28:23
Speaker
And this, yeah, this is where it's at. And unfortunately, you need both. Yeah, I think so. And I think because of ageism, and because millennials are in leadership roles, they've kicked us all out.
00:28:38
Speaker
All I see on LinkedIn are gray-haired or bald or whatever UX folk with the green "open to work" banner, because no one wants to look at us. Well, not for good reasons: they're defending their own positions.
00:28:52
Speaker
Yeah. It's like: I can't have somebody who's got more experience than me telling me I'm doing a bad job. That would be a disaster, because it's all about optics. I have my kids. My oldest daughter is 26, so she didn't grow up with an iPad in her hand from her first day out of the womb, right? She grew up a little differently; she was during that transition period.
00:29:13
Speaker
Working with her through her homework and stuff was a different experience. I could sit down and try to explain concepts and things like that. But my younger daughters, if I'm working through homework, they're like, no, no, just tell me the answer. She kind of has that tendency to be like, no, no, no, I don't want to do the journey with you, Dad. Just tell me how to do the problem.
00:29:34
Speaker
Tell me the numbers, right? You know what I mean? There's this mentality of, no, just tell me what to do and let me be done with this thing. And is it a unique thing to where we are in human history right now, or is that how our grandparents thought about us too? I don't know. Dopamine is at the center of it all. Okay. Dopamine systems have been irreversibly changed by social media and by the internet, but accelerated and completely catalyzed and concentrated by social media, like to the max. We've gone from first gear to sixth gear and overdrive in a five-year period, type thing.
00:30:16
Speaker
As a result of that, the younger generations... and this is all personal opinion. There's no science or evidence behind this. This is just my personal opinion based on years of observation and dealing with a 12-year-old who is dopamine addicted.
00:30:30
Speaker
Easy-dopamine addicted. What we've done is we've changed the reward systems. And you've heard this from Fortune 500s, you've heard it in the media, right? And it's true. We've changed it because people now just want easy dopamine.
00:30:42
Speaker
But that means with learning, as you're seeing from people like your daughter or anybody else, they don't want the working-out part. Right. It's just "give me the answer," because that's what all of the systems and tooling do.
00:30:55
Speaker
Right. There's no effort. And so if it requires effort, I'm not interested. And that's a disaster for my son's schooling and learning. Because the minute he comes up against a maths problem and he can't work it out,
00:31:09
Speaker
they shut down. They literally shut down. And if you push from there, you're pushing deeply into mental health issues, because it's almost abusive. I'm asking them to do something that feels inappropriate.
00:31:25
Speaker
Now, this is the other thing I wanted to remember to say. In that Victorian analogy, right, we were much more draconian. Mental health wasn't even invented as a word. And we told somebody, you're going to stand there and do that for 12 hours because I'm not going to feed you otherwise, or whatever. Something barbaric that was socially acceptable back in the day.
00:31:44
Speaker
Right. People worked long hours and, you know, they learned the skills of their trade because they needed to. Nowadays, you don't need to learn the skills. You can just say you're skilled. You know what I mean? It's all that controlling of the narrative and who you are on social media.
00:32:00
Speaker
You don't have to be real. You don't have to be able to do any of the stuff. You just have to be able to say it. There's a very hungry audience of people willing to listen who have no filter for authenticity.
00:32:11
Speaker
And if you say, here, I've never done user research before, but here are all my prompts, buy my prompts and go and become a user researcher, people go, great.
00:32:21
Speaker
Yeah. Fabulous. They don't even stop and think whether there's any validity, whether these things are any good. Has this guy done it before? Because it jumps the effort gap. It meets the dopamine need, you know?
00:32:33
Speaker
Yeah. You're bringing up an interesting concept, and I don't remember what book I was reading that talked about this, but the neurochemistry that's built into our brains wasn't designed for the world we live in right now. It was designed way, way back, millennia and millennia ago. The evolution and adaptation has not caught up to our world, and it never will.
00:32:54
Speaker
Right. We will be fish out of water in our world forever and ever, because the world changes around us so quickly, and the evolutionary systems that are in place in our biology can never, ever catch up. So we're always going to feel out of place, at least neurochemically, in our world. So yeah, dopamine wasn't designed for what it's being used for right now. It was reinforcing different types of behaviors then, and that took millennia to get there, right?
00:33:23
Speaker
We should talk about AI though, right?

Impact of AI on Effort and Dopamine Systems

00:33:25
Speaker
Because with all of this stuff regarding generational learning, the dopamine stuff, AI has got a lot to say, right? Because it leans into the effort question and, you know, easy dopamine and doing stuff.
00:33:42
Speaker
I don't know how you feel about it. You think AI is going to catch on? You think it's going to be a thing? I don't know. At the moment it's a hugely overhyped misinformation machine, right?
00:33:59
Speaker
You can't really use it in anger. It's great for coding stuff, and there are some really great specialized, fine-tuned applications of it. Fabulous, good stuff, right? Everything else is trash. I don't care who you work for, everybody's been smashing AI into anything they can find. Absolutely. Right? It's like, here's my thing, let's put AI in it. Does AI make it better? No. Let's try again. No. Let's try again. And that's where we're at, instead of the user-centric view of how AI can be used to meet user needs and remove pain points.
00:34:31
Speaker
Yeah, especially in the product world, all the products are like, oh, AI-powered, AI-driven, now with AI. You know, there's all of this out there. Do you have any examples, or can you think of,
00:34:47
Speaker
examples that you've seen where you're like, oh, wow, okay, now that's doing it right? Where they did think, okay, how can I help my users and benefit their lives while leveraging AI, and it's done well? There are a lot of examples of doing it poorly. Do you have any thoughts on where we've done it well?
00:35:05
Speaker
The only example I've seen of cool AI usage is an app that I can't talk about, because it's internal and it's live and all of that stuff. It's company confidential. But one of our most senior engineers, who's also one of the youngest, a genius guy, has built this thing.
00:35:19
Speaker
And this thing converts one thing into another, but does it by checking and rechecking itself, internally validating and externally cross-validating the whole way.
00:35:31
Speaker
Right. And because it's a code-based thing, you can do that. It's much harder to do that with natural language and everything, right? And it's very, very impressive. It's really like science fiction. Very, very cool.
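[Editor's note: the pattern being described, convert and then validate the result before trusting it, is a general one. A minimal sketch in Python; every function here is a hypothetical stand-in, with no relation to the actual confidential tool.]

```python
def convert(source: str) -> str:
    # Hypothetical stand-in for the real transformation,
    # e.g. one data format converted into another.
    return source.upper()

def invert(converted: str) -> str:
    # Hypothetical inverse transformation, used for round-trip checking.
    return converted.lower()

def convert_with_validation(source: str, max_attempts: int = 3) -> str:
    """Convert, then internally validate by round-tripping the output.

    The point is refusing to return unvalidated output: a failed check
    triggers a retry, and persistent failure raises an error rather
    than silently emitting something wrong.
    """
    for _ in range(max_attempts):
        result = convert(source)
        if invert(result) == source.lower():  # internal round-trip check
            return result
    raise ValueError("conversion failed validation after retries")
```

The external cross-validation he mentions would be a second, independent checker alongside the round trip; the structure stays the same.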
00:35:42
Speaker
Anytime I see it, I'm like, I just don't know how you're doing that. It's really amazing. But that's pretty much the only example. I mean, I managed to get Gemini 2.5 Pro two nights ago to do some pretty amazing persona work for me using the deep research function.
00:36:03
Speaker
Nice. And it did what I've been trying to do all week. It created a data-driven persona by validating every single claim that it made in that document.
00:36:15
Speaker
Now, that's amazing, and that was what was breaking me, because it couldn't do that before. But somebody on LinkedIn said to me, Nick, try it with the deep research thing on a large-context model. And it took a while to do it, 15, 20 minutes, I think. But when it came back, I had like 30 pages of OMG.
00:36:34
Speaker
You know, like, this is super impressive. Then again, yesterday morning, Brad, my... when I say partner, my design partner, my product team partner... Brad, the designer, said, dude, there's a support engineer who's been using this thing.
00:36:50
Speaker
He might have some really cool insights, or, you know, let's go talk to him. So I Slacked him. He had a few minutes. I spoke to him there and then, did a quick half-an-hour thing, recorded it, took the transcript, put the transcript through Gemini 2.5 with my two-year-old standard prompt,
00:37:08
Speaker
and it came out with an amazing table of user needs and pain points from that discussion with the support engineer. So I spent the next half an hour watching the video again against the Confluence page with all the AI analysis.
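[Editor's note: his "standard prompt" is his own and isn't public. As a hedged illustration only, a prompt for this kind of transcript-to-table extraction might be assembled like this; the wording, column names, and the function itself are invented for the example.]

```python
def build_extraction_prompt(transcript: str) -> str:
    # Assemble a hypothetical prompt asking a model to tabulate user
    # needs and pain points, grounded only in the supplied transcript.
    return (
        "You are a UX research assistant. From the interview transcript "
        "below, produce a table with the columns "
        "'User need', 'Pain point', and 'Supporting quote'. "
        "Use only statements that actually appear in the transcript; "
        "do not invent or infer quotes.\n\n"
        "--- TRANSCRIPT ---\n"
        f"{transcript}\n"
        "--- END TRANSCRIPT ---"
    )
```

The resulting string would then be sent to whichever model is available. The grounding instruction is what makes the front-to-back check he describes possible, since every row should trace back to a quote in the source recording.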
00:37:23
Speaker
Pretty much word for word, absolutely spot on. Right. I checked it front to back. There were a couple of acronyms it got wrong, right? There's a programming language that we use in the Atlassian world called HAPI, H-A-P-I.
00:37:42
Speaker
But it put it down as HAPPI, H-A-P-P-I. Sure. So that's a completely reasonable and expected natural-language interpretation or parsing mistake. But semantically, it 100% nailed it.
00:37:55
Speaker
And I cannot say that about any other AI service to date, except for last night. And that gives me massive hope. Right. I think this has been like a massive early-stage clinical trial, or like a public alpha.

Critique of AI as Misinformation

00:38:11
Speaker
Right. Where we're helping to train the model, helping to do this. But until now, the past two years have been very overhyped crap. Right. But everybody's like, wow, this is amazing.
00:38:24
Speaker
So they're not really challenging it or considering it as much as they really should. You see? Oh, yeah. Yeah. Well, it's very convincingly wrong.
00:38:36
Speaker
That's the problem. Yeah. It's hard to tell, because it's very sure of itself when it is hallucinating, right? It's like Dunning-Kruger, you know what I mean? Yeah. And it's really sure of itself when it's absolutely wrong.
00:38:52
Speaker
And it just makes things up. But it sounds very convincing. So it's hard, right? It's very difficult. And you have to check it and make sure: are you sure about that assertion? Where did you get that statistic? Oh, I made it up.
00:39:05
Speaker
Like, it does that. And, yeah, "you're right to ask me about that, because I completely made it up. It was a placeholder." Yeah, but you didn't really say it was a placeholder, did you? Yeah, yeah, yeah. And so, for personal reasons and whatever reasons, I'm just,
00:39:20
Speaker
like a laser-focused validity guy. I just can't help it. I've had an unfortunate life where I've been conned or lied to, and I've got trust issues and other things, right? So that's why I'm the scientist that I am.
00:39:32
Speaker
And so I ask these questions. I will challenge the AI, and I'll go that extra ten miles, not just the extra mile, to double-check and to validate, because I'm not prepared to grow in the wrong direction or to, kind of, eat
00:39:48
Speaker
rubbish. You know what I mean? Consume misinformation. Sure. And with the misinformation, it's just: literally assume everything is wrong by default, and look for the signal amongst the noise, because that's where we're at now.
00:40:04
Speaker
AI has given a whole, let's call it lazy, generation a misinformation-generating machine. And we haven't got any control mechanisms or processes or leadership to stop any of this stuff.
00:40:17
Speaker
Yeah, it takes... I just wrote an article about how to use AI when you're generating content and writing while maintaining your authentic voice, you know? And so there's patience involved. You can't just say, write me this, and then just copy and paste it and be okay with it.
00:40:35
Speaker
Number one, it'll be wrong a lot, but number two, it's not your voice, right? So it does take a little time. So you're right. For people who are impatient, where they're built to be impatient, whose entire world says, be impatient, I'm just going to give you everything in your face as soon as you ask for it... AI doing that with incorrect information is kind of a recipe for disaster. So it's going to take discipline and patience to work with AI. You can get good information and good stuff out of generative AI, but it's going to take discipline.
00:41:07
Speaker
Definitely. One of the things I like about GPT is that it has the optimal level of verbosity, or brevity, right? And it's got colorful emojis and stuff to break it up. It presents information visually in a more consumable way.
00:41:22
Speaker
When you look at Claude or Grok... Grok in particular has got a line of code in it that says, make us lots of money by spending tokens with this explosive output, right? You ask Grok a simple question and you're like, wait, man, I just wanted to know, what's this in millimeters, or something, right?
00:41:42
Speaker
And I've got a three-page essay. And it's like, really, I didn't actually need that. Now, if you're a young dopamine person, you ain't reading all that. I'm scanning it at best, and I'm a good scanner, and I can't be bothered, right? So you've got to be able to consume the output. The effort is in reading what it's put out, which, I'm telling you, most people don't do; they just can't be bothered. And I'm guilty of this too.
00:42:06
Speaker
Sure. Right. We've broken all of our effort systems. And if mine are broken, then I fear for everybody else's. Let's just say that. Yeah, that's what I'm saying. It's going to take discipline. As you're saying, it's very easy to look at it and be like, wow, that's really convincing stuff, let me just copy and paste that into my document, or whatever you're doing with the information, and just take it as is. No validity checking, no cross-checking, nothing like that.
00:42:33
Speaker
So it's going to take disciplined patience, and fighting all of the urges that are in our world right now, to really use this stuff well, at least right now, for sure. Because you're right, it's often wrong and all that.
00:42:49
Speaker
Do you think there's a proofreader role coming up? Like, I can't be bothered to check the AI output, so I'm going to pay somebody to do that for me.
00:42:59
Speaker
That could be. That's the thing we're going to struggle with as a humanity,

AI Compared to the Industrial Revolution

00:43:05
Speaker
right? I mean, I do think it is that big. AI's impact on the world is going to be so big that we are going to be rethinking: what do we value? What do we want to pay people to do anymore? Because it's eliminating a lot of jobs that people are paid to do right now. So we have to think about that. But this happened before with the Industrial Revolution. We used to pay people to
00:43:28
Speaker
take a chisel in their hand and go beat on a mountain and cut a tunnel, where now we don't do it that way anymore. Right. They have to have other jobs. Well, we've been kind of technology-ing our way into obsolescence as the human race, right? This was our last bastion, thought work, right? And now it's like, well, computers do that pretty well too.
00:43:52
Speaker
Do we just all lay around and be fat, dumb, and happy and let the computers do everything? I don't know. I think that's the fear. If you take a really dystopian, Neuromancer-type view of the future... you know, Neuromancer, Blade Runner, not Star Trek, right?
00:44:10
Speaker
The not-Star-Trek version. There's a character in Neuromancer, I think it was called the Whale. And he's this guy floating in a tank of fluid, with a respirator and other tubes coming in and out of him to take care of him, right? So you can live effectively like a fish in this tank.
00:44:32
Speaker
So you consume content all the time, right? Without having to stop and do human stuff. Right. And that's a really extreme dystopian view of the world, but that's kind of the job for our children in the next 50 years. Because if they haven't got the skills to do stuff and the machines are doing everything for us... yet you still need an economy, and people need stuff to do during the day.
00:44:58
Speaker
Well, it's the advertising model, where you pay people to sit there and watch adverts and be consumers. I mean, it's really scary. You know, the other thing is that what young people accept as okay, even just in UX terms, right, is outrageous.
00:45:20
Speaker
They've been abused so badly that they've come to accept bad UX as normal, and that's okay. And when old people like me come along and say, that's awful, they go, no, it's fine. And you're like, it's not fine, but you've been, what's the word, gaslit into it almost, you know?
00:45:39
Speaker
Yeah. I mean, it's funny how my experience has been... and I'm not a UX person, but whenever I do get the opportunity to spend time with users, you look at how they're using the things you've built, and you're like,
00:45:57
Speaker
why are you doing that? You know, I could put together a multi-checkbox solution here, and you could just check all those things and say complete. And they were taking like 10 minutes to go and complete things individually, something like that. And they're like, well, that's just how we do things. It's funny how they can talk themselves into things where I would look at it and be like... you know, I'm a lazy developer.
00:46:17
Speaker
I look at it and I'm like, I'm not doing that. I'm going to put some multi-checkboxes here and solve this problem once and for all. But they're just like, nope, this is my world, this is what I do every day. And you're right, it's funny what you can talk yourself into.
00:46:30
Speaker
I just came back from a place called Topgolf. You've probably got it in America, like a driving range with a game, right? It's half term here. Took my son there. Long story short, we hadn't been there since the pandemic, so for all intents and purposes, it was my first time, right?
00:46:43
Speaker
Me and my 12-year-old. So I've got a 12-year-old who is completely sharp and super online and super digital, right? Between the pair of us, both ends of the experience and age spectrum, neither of us could work out how to do anything at this place.
00:46:56
Speaker
So I had to go ask for help. Yeah. How do you start it? How do you onboard? All of this. Had to ask the guy. And he said, don't worry, we have this problem with everybody. He said, it's really easy once you know it, but the first time is really hard. I wanted to say, could you just re-record that for me so I can put it on LinkedIn?
00:47:16
Speaker
Yeah. And UX isn't just digital checkout, e-commerce stuff, you know. It applies in the physical world, in footfall, in service design, customer service, all of it. It's just that people are so used to a lack of UX, or a lack of designing for people, for me or people like me. Yeah, absolutely. That's kind of like that question earlier, about whether we did UX before with physical things, and I think you're right. I mean, there's more to experiencing products than just a digital, you know, an LCD screen, right?
00:47:52
Speaker
Yeah, totally. You know what, I've just thought of something. Before UX. Okay.

Historical Methods in UX Research

00:47:58
Speaker
The first five to eight years of my UX life, from 2008 onwards, were spent in UX labs or usability labs, right? Before the pandemic, UX labs were where we did the work.
00:48:12
Speaker
There wasn't all this discovery stuff. It didn't really exist. We didn't do all this market research rubbish. We just went into the lab with people and got them to do a thing. Now, most of these labs we went into were also known as observation lounges or consumer studios. Unilever had one, and some of these big, big multinationals... Like P&G here in Cincinnati would have that kind of thing. Yeah, yeah.
00:48:38
Speaker
Absolutely. In fact, P&G's was one of the ones I think I was in. They became reappropriated, or multi-use, for UX folk as well as the consumer teams, right? Because that's where you've got a room that looks like... I mean, let's say Glaxo. They had one at Glaxo when I was there, GlaxoSmithKline, the big pharmaceutical.
00:49:00
Speaker
They had one room that looked like an IKEA room, like somebody's front living room, right? It wasn't a corporate cubicle-y type thing. It looked exactly like you'd walked into somebody's home. And that was the consumer lab, where people would come in and open the toothpaste or whatever, you know what I mean? And this stuff is really, really important, especially when you're doing clinical or medical stuff, right? I need to know that a kid can't open it.
00:49:23
Speaker
I need to know that an old person with arthritis can open their meds. I need to know this stuff, and I need to watch it and see it firsthand, in numbers, so that I can iterate and design it so that it does work.
00:49:34
Speaker
So, yes, it has existed, but it's been in that kind of specialist, kind of marketing area, if I'm brutally honest, because it's a form of market research when it's consumer research like that.
00:49:48
Speaker
But ultimately, you've got to actually watch and interact with the people who are going to consume the thing that you're building, and you need to be able to observe them. Don't just ask them. Asking is okay, but I think you absolutely need to be able to observe them interacting with the thing. That's why you build the prototypes.
00:50:06
Speaker
The takeaway for the audience listening is that what people say and what people do are generally never the same thing. Right? So, again, case in point: at the end of the lab session I just talked about, I'd say, okay, so you've just bought the red shoes. How did you find that? And they'll go, yeah, it was great. I loved it. It's a really good website. I love the colors. It was a really good experience. I'd use this, and I'd tell all of my friends to use it, right?
00:50:28
Speaker
But I've just watched them failing to buy the red shoes for 20 minutes. Stumbling around through the site, going all over the place. Yeah. You're in your head going, no, no, no, it's right in front of you. But they weren't able to, because it's our design problem.
00:50:40
Speaker
But they're trying to acquiesce. It's a known problem. It's a bias. They want to be a good test participant for me, the investigator. And that's a big bias. I don't want you to tell me what I want to hear. I want to know what's actually going on.
00:50:53
Speaker
That's why I'm a behaviorist, because behavior is, in my opinion, a lot more reliable and valid than self-reported stuff, which tends to be aspirational and biased and all the other stuff.

Biases in User Feedback

00:51:06
Speaker
And if you can't say anything nice, don't say anything at all. That's how we got here, right? All our parents telling us that for years. That's what got us here. All right. So for our next segment of the podcast, we have this little thing we do. We call it Ship It or Skip It.
00:51:21
Speaker
Ship or Skip. Ship or Skip. Everybody, you've got to tell us: would you ship or skip? The idea here is we'll throw out a topic, a trend, or some sort of topical idea, and we opine on it together. Then I'll ask your opinion. Is this something we want to do? That's ship it. Or, no, get away from this? That's skip it. So that's the idea.
00:51:44
Speaker
Kind of a hot-or-not type of thing. Okay, so the first one. What about using synthetic users? So artificially intelligent, generated users for doing testing of your products. What do you think about that?
00:52:01
Speaker
So what's that, ship it or skip it? Ship it or skip it. Take it outside, beat it up, shoot it, chop it up into little pieces, and then spray it across the ocean so that it can never, never, never work again.
00:52:13
Speaker
So that's not even a... sorry, that's probably a bit harsh. Sorry. Anytime anybody says synthetic users, I get really, really triggered, right?
00:52:24
Speaker
For anybody listening who doesn't know what synthetic users are: Synthetic Users is an AI company that's been created to create AI participants. So you don't need to recruit anybody. I can just ask an artificial construct, you know, and interview it like I'm doing recruitment.
00:52:36
Speaker
But as you can see, with the way AI is right now, and probably for the foreseeable future, that's never going to work. You need actual real user needs and real pain points in the actual context, right? I just said what people say and what they do are different things.
00:52:51
Speaker
And an AI saying something is off the scale. So quite simply, no. Forget it. I can't see any good use case for that coming up. I really can't. And I think it'll cause you more harm than good.
00:53:02
Speaker
And in time, I suppose, it will show itself to be true. If people invest in something that is based around synthetic user insight, they're almost certainly going to build the wrong thing, one with a negative or no ROI, which should get questions asked.
00:53:20
Speaker
Yeah, I would agree. I think you're just asking for trouble. So I would say skip it. If you're using synthetic users, you're going to get synthetic return. You're going to get synthetic delight.
00:53:38
Speaker
But your products aren't going to be used by synthetics, right? This isn't Blade Runner. We aren't synths, right? Or, what did they call them? Replicants.
00:53:50
Speaker
Replicants, yes, the synthetic folks. We are not replicants, right? You're not building products for them. You're building them for real human beings, and they need to be able to use your product. So why would you ask a synthetic entity about it?
00:54:02
Speaker
I love the way you said that, though. It's like, if you're building for a synthetic audience, then maybe this could work. But no one's building for a synthetic audience. Yeah. That would maybe be interesting, but we're not.
00:54:14
Speaker
Not yet, at least, right? Yeah. So what about maybe the other side of this? Instead of saying, okay, I have synthetic users that I'm asking, do you like my product?
00:54:25
Speaker
What about the other side of the equation: the thing doing the research, interacting with users, and getting information out of them? Have we been able to do that well?
00:54:38
Speaker
So this is interesting, because I'm really conflicted on this particular one. I think this is both a ship it and a skip it, depending, right? But nothing about throwing it in the ocean or anything yet? No, we're not doing anything barbaric, because synthetic users is in a class of its own in terms of stupidity and over-marketing.
00:54:55
Speaker
That's why I was particularly rude about that one. But that's a special case. You triggered me. You said the magic words. So the Luddite in me wants to go, no, of course it can't do a good job. But the realist in me, the realist who's been around lots of big enterprise gigs, knows that scale is a problem.
00:55:11
Speaker
Scale is a massive problem, and anybody that doesn't know that isn't experienced in it. You can get all kinds of hypotheses from small-sample work. But if you're trying to do big work for a big enterprise company, you need a big sample, because you've got different geographies, different locations, right? There are cultural differences, language differences.
00:55:30
Speaker
There are so many differences that sometimes you really have to do that, which means you generate your hypotheses and get your knowledge together locally, and then test them at scale to see if they work in different areas.
00:55:42
Speaker
Well, me, I can't be in all those places. I can't do that. And I can't train up a team to be as good as me, or, you know, one that I would trust, right? So there is definitely a case for having an AI-avatar-based service.
00:56:01
Speaker
Now, I need to contextualize this, because I'm sounding like a madman, right? This is for short, poll-based type questions.
00:56:11
Speaker
Right. So: hi, I'm Bob, the automated avatar. Let me ask you your three biggest pain points about using product X, right? Okay. Nothing much more than that. It's not replacing a human interviewer.
00:56:24
Speaker
It's just an easier, low-effort way to collect information at scale, opinions at scale, insight at scale, right? Gotcha. If people want to replace people like me with those things, yes, it's going to go horribly wrong. It's going to go bad, right?
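[Editor's note: the scripted, non-generative shape of that "Bob the avatar" idea can be sketched in a few lines. Everything here is hypothetical; the point is that a fixed script collecting verbatim answers leaves nothing for a model to hallucinate.]

```python
# Fixed question script: no generative follow-ups, so the collection
# step cannot fabricate anything. All names are invented for the sketch.
POLL_SCRIPT = [
    "Hi, I'm Bob, the automated avatar. Okay to ask you a few questions?",
    "What are your three biggest pain points when using product X?",
    "Anything else you'd like to tell the team?",
]

def run_poll(ask):
    """Walk the fixed script, pairing each question with the
    respondent's verbatim answer. `ask` stands in for whatever
    voice or chat channel delivers the question and returns the reply."""
    return [(question, ask(question)) for question in POLL_SCRIPT]
```

Analysis of the collected answers stays a separate, human-owned step, which is where his "it's not replacing a human interviewer" caveat bites.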
00:56:39
Speaker
There have got to be biases around it. Look, we talk about the Heisenberg principle, also known as the big brother principle, i.e., the minute you are being watched, or somebody sticks a camera on you, or there are eyes on the wall looking at you, your behavior changes.
00:56:55
Speaker
right Right. Right. That's a fundamental bias of of of our world. to qualify what you're saying as ship it or skip it, the ship, it would be, I just need to get like kind of answers to this, these questions. And and then it's a script. It's a, you know, so you can have the max headroom sit there and and ask somebody a question. So it maybe looks like a human being, but um glitches out every now and then.
00:57:19
Speaker
But the other style, where you're observing users and watching them interact with your product or your prototypes... that, no. I got you.
00:57:32
Speaker
I can't see that behavioral kind of research or analysis happening in our lifetimes. I could be really wrong, and it might take 10, 15 years, right?
00:57:45
Speaker
I mean, look at what we've got right now, right? It's not getting better at speed. I think we've got some fundamental problems with it. So, moving forwards, I can't see how it's going to interpret or parse or understand behavior. It's so nuanced. It's so subtle.
00:58:06
Speaker
I really can't see it happening, ever. But I could be wrong on that. I really could be wrong. I mean, technology does amazing things, but not in any kind of reasonable service lifetime.
00:58:19
Speaker
Gotcha. Yeah, I can't help but think of these stories about using AI to look at X-rays, for instance, and it's able to pick up on subtle little things: oh, there's a signal, there's a brain tumor forming on this MRI or whatever, right? So with imaging, I think it's
00:58:42
Speaker
particularly adept at image processing, right? Those are subtle little things, and maybe the human eye doesn't have the level of granularity, so to speak, to pick up on them, where a computer might see that differently. It experiences the world differently than we do.
00:58:58
Speaker
Yeah, totally. And what we're talking about there is a specifically fine-tuned model that's all based around machine learning for a very specific task. Yeah, great. That stuff's brilliant. That's what we should be doing. But this idea that we've got GPT, Claude, etc., this open book of here's a prompt, like a terminal window, off you go, with no boundaries or definitions or user manual or anything.
00:59:21
Speaker
It's just, here's a thing, we're going to change it in the background, it might be wrong, off you go. It seems really irresponsible to be using that in a live-fire exercise called work. Exactly.
00:59:33
Speaker
All right, so I'd say we'd both skip it on that one, or ship it contextually. Skip it on the synthetic users; ship it, with some context around where, on the second one.
00:59:45
Speaker
What about, and our worlds are intertwined, you being in the user experience world and me being a software engineer, but in my world you hear about this vibe coding. Is there an analog in the user experience and design world of, like,
01:00:02
Speaker
just interacting with an AI to come up with a product? Is that a thing for you all too? Well, no, but I put out a parody a couple of weeks ago called Vibe Research, just to send it all up, right? Because obviously you can't do that. Research has got more structure to it, and validity, and all kinds of good stuff.
01:00:19
Speaker
But the whole vibe coding thing, I mean, I'm going to slightly not answer your question. Because the vibe bit is like, I'm just going to go with the flow, man. And that's cool, right? I've got nothing against creative exploration, right? It's a great thing, a brilliant thing, and there probably isn't enough of it, right?
01:00:42
Speaker
But vibe coding, or even Figma Auto Layout and all the AI-supported stuff that's doing designy-type work, it's actually brilliant, right? So that's the part of the AI stuff that can help us test and iterate faster.
01:00:59
Speaker
So if your product manager or your designer or whoever it is can iterate (remember, UX is research into design), if you can get the insight into your product faster, then that's going to build you a better product faster.
01:01:15
Speaker
It sounds like you're saying to use it as an accelerator and not a replacement, so you can do more iterations more quickly. Yeah, but the point is that historically, Axure
01:01:28
Speaker
has been the prototyping tool, but it's a bit of a pig to learn. Its UX isn't great, and designers don't know it. So getting hold of UX resource to get an interactive prototype to test has been hard.
01:01:38
Speaker
Whereas it used to be ubiquitous. Everybody had it; you didn't work in industry without it. That's like being a designer without Photoshop back in the day. And things like Heroku and other prototyping kits were okay.
01:01:51
Speaker
But it's really about getting something testable that's either disposable or lightweight or just rapid. And all this AI design tooling is helping to achieve that.
01:02:04
Speaker
So you can vibe code together something in grotty HTML, something really quick, and then test it. That's fabulous, right? The reason we started off with Balsamiq, and did stuff in Axure, is because it's low fidelity.
01:02:17
Speaker
There's no typography, no textures, no palette, right? It's just interaction design on a pure level. And if you can get to that really fast, or even do hi-fi, I'm not really that bothered.
01:02:30
Speaker
But if you give me something to test that's truly interactive, that represents the real thing so I can see actual behavior, superb. I should be agnostic about what you give me.
01:02:42
Speaker
All I care about is that I can see behavior with it. When you give me a limited, image-mapped Figma prototype, I can't see real behavior, because I have to ask somebody, what would you do if you clicked on that thing?
01:02:56
Speaker
And that's self-reported, not observed. Right, right. And what they would do is not what they will do. Definitely. So with it being an accelerator, I know one of my coworkers, producer Ryan, talked about "stay ugly." I hope I'm not butchering that phrase: stay ugly as long as possible.
01:03:22
Speaker
And the reason we had to stay ugly is because it was almost cost-prohibitive to get to the pretty phase, right? It took too long, or it cost too much money. But now you're kind of
01:03:33
Speaker
accelerating the timeline where you can get to the pretty phase using AI, and so you don't have to stay ugly quite as long, so to speak. Is that kind of the idea you're talking about? Yes.
01:03:46
Speaker
Unfortunately, because of that influx of the UX/UI crowd, all those art and creative directors, Figma became the UX tool of choice, and it all became visual stuff.
01:03:56
Speaker
But really, imagine building a house, and as you're putting the bricks up, somebody's painting the walls. You're like, wait, man, this is probably going to change.
01:04:07
Speaker
You know, the architect's going to punch a hole through that. What's the point in painting it, right? And actually, it's not the painting. That's all a lovely aesthetic effect, but actually: can I walk through this thing? Can I find my way to the rooms?
01:04:19
Speaker
Can I get upstairs and down? You know what I mean? That's the important stuff. That's the DNA of your experience. Everything else on top of it absolutely moderates it, I'm not denying that.
01:04:31
Speaker
But it's not the core DNA. It can't exist without it. I can live with an ugly website that does my banking, but I can't live with a pretty one that doesn't allow me to check my balance.
01:04:41
Speaker
Right, it's the utility. Yeah, the usability, the utility of the tool is the most important thing, not the aesthetics, so to speak. Aesthetics can be changed. Yeah, and user needs
01:04:55
Speaker
are central to UX, right? If you're not talking about user needs and pain points, you're not doing UX. It's a really clear, easy way of testing the water, like an acid test.
01:05:08
Speaker
So that black-and-white, stripped-back, low-fidelity, ugly version you guys are talking about should have wall-to-wall user needs baked in,
01:05:19
Speaker
executed in a beautiful, usable, intuitive way. Then, when you pretty it up, you're just making it better and lovelier, right? You're enhancing things by color coding them, or giving visual depth or visual priority, all of those things that other layer does. And it's great, and I'm not disputing it, but it works to serve the layer underneath it.
01:05:42
Speaker
It isn't the layer that does the work. All right, I like that. That's what I love about this podcast; I learn a lot on this podcast. This is fun. This is cool. All right, so our next segment of the show we refer to as our lightning round. Now, this is the most important segment of the show.
01:06:04
Speaker
This is where we throw all the other things out; those were just filler topics. This is the meat of the show. We're really getting into the hard-hitting questions. This is really where we get down to brass tacks, so to speak, right?
01:06:25
Speaker
It's time for the lightning round. Rapid fire, don't slow down. Hands up quick and make it count. In this game, there's no way out.
01:06:37
Speaker
It's time for the lightning round. So I'll just fire these things at you, and I want to hear quick answers, you know, what's really on your mind, right? These are the things that are really going to impact our audience the most, right?
01:06:53
Speaker
So sourdough or wheat? Sourdough. Okay. How many hours of sleep do you need? Oh God, what I need and what I get.
01:07:06
Speaker
I get about five and a half to six, but I need about seven. Five and a half to six, that's respectable. Now, I don't usually use this one, but you may be particularly adept at this question.
01:07:23
Speaker
Can you say "g'day, mate" in an Australian accent? Not without an Australian coming for me with a bat. I don't do accents well. I probably shouldn't do that.
01:07:39
Speaker
ah What is your favorite car? Oh, wow. I'm not really much of car person, but I'd love... My first car was the original shaped Mini, the British Leyland Mini, if you can remember that, the classic Mini.
01:07:51
Speaker
Like, think of The Italian Job. Yeah. That's probably not my favorite car today, but it's a classic car that I love, and it's close to my heart. Did you have to beef up the suspension and throw out all the seats so you could carry all your gold like they did in that movie? No, no, no. But 16-year-old me drove it into... I wrote it off. Concertinaed the roof.
01:08:13
Speaker
It was my first and last accident. I've never had an accident since, so it obviously taught me something. Concertinaed the roof, so you literally peeled the roof off the thing. And that's pretty low. What did you go under?
01:08:26
Speaker
No, I tried to take a rising hill that goes down, in the wet, at two in the morning, coming back from my girlfriend's house at about 60 miles an hour on a 30-mile-an-hour road. I basically went sideways, mounted the pavement, and bounced off the front wall of somebody's front garden at 90 degrees.
01:08:45
Speaker
And I found my glasses on the back shelf. The police couldn't even believe that I walked away from this thing, let alone didn't even go to a hospital or anything. It was crazy. I was a lucky boy. Oh my goodness.
01:08:56
Speaker
That's crazy. All right, that's the answer we were looking for on that question, by the way. That is the acceptable answer. Climb a mountain or jump from a plane? I think maybe I know the answer. Climb a mountain.
01:09:08
Speaker
Yeah, climb a mountain, because nothing fills me with more fear than jumping out of a plane. I'm not big on that stuff. Yeah. Have you ever done it? No, I'm a retired scuba diver. I've got 160-odd logged dives, with a maximum working depth of about 60-odd meters, which is deep.
01:09:26
Speaker
um So I'm more of a go deep kind of guy rather than go high kind of guy. One of our canned questions, in you and this will be a gimme for you then, is what does the acronym SCUBA stand for?
01:09:38
Speaker
Self-contained underwater breathing apparatus. Yes. Polka dots or stripes? I can't really give you that one; they both have their benefits.
01:09:50
Speaker
I was going to say polka dots just for now, but stripes are cool too. So, hmm, yeah, I can't really answer that one. Sorry. It's tough to have a real hard opinion about that one. I mean, you've got to be really convicted if there is one. You're going to have a strong opinion if there is an opinion to be had.
01:10:05
Speaker
Let's just say it depends on the application. If it's a shirt that I'm wearing, it's stripes. I'm not wearing polka dots, right? If it's wallpaper, it's probably polka dots, not stripes. Actually, no, I don't know. I'm talking garbage.
01:10:21
Speaker
All right. What size bed do you prefer? God, the biggest one possible. Also because my wife is like a radiator; if I can't get far enough away from her, I melt.
01:10:33
Speaker
Yeah, big. We've got a queen size at home, but if I could have a California king, I would. What's your favorite martial art? That's a really good question. I should say MMA, because I'm a long-term UFC guy.
01:10:49
Speaker
But actually, personally, Tai Chi. I studied Taekwondo for a few years, but I studied Tai Chi for a year when I was an undergrad doing my psychology degree.
01:11:01
Speaker
And I had a particularly amazing teacher who did particularly amazing things. He was like a Mr. Miyagi kind of guy, but a young Yorkshireman rather than an elderly Asian gentleman. It was the meditative side, the spiritual or mental side, that mattered as much as the physical; they call them dances, but those kata, those movements that you do.
01:11:21
Speaker
And what's also super cool is, if you watch Jackie Chan, and I saw the new Karate Kid film yesterday with my son, which is great but too short, a lot of it is Tai Chi just sped up.
01:11:33
Speaker
You know, it really is wax on, wax off. It really, really is.

Conclusion

01:11:37
Speaker
Absolutely. All right, well, that does it for our lightning round. You passed.
01:11:44
Speaker
You scored very well. Thanks, guys. You did a great job. A plus. Good job.

Future of UX in the Age of AI

01:11:51
Speaker
Any closing remarks you want to share with folks as we wrap up here?
01:11:59
Speaker
Yeah, I would say, look, for anyone who's in UX or research and design, or product, we have to stay up to date with this AI stuff, right? Because it's the biggest thing since social networks, since the internet, right? Since the Industrial Revolution.
01:12:14
Speaker
It's a massive, significant part of history. But it might not go well, and it might not turn out as expected. So keep your human skills up. At the same time, don't expect that this AI thing is going to turn out well anytime soon. Assume that it isn't, that it's going to be an ongoing work in progress, and that some areas are going to get a lot better and more specialized and some areas won't.
01:12:36
Speaker
But at the same time, we've still got to build the right thing. We've still got to make money for our brands. We've got to wash our own faces, you know what I mean? We want job security. We want to add value. And that value isn't only going to be in wrangling AI or having a set of AIs that obey you.
01:12:53
Speaker
Keep your human skills up. And remember, my ultimate value as a practitioner is in my ability to find the gold. I'm like a sniffer dog for valuable insight.
01:13:07
Speaker
Okay? That's all I am. And, you know, I just don't want to overplay it. You don't need a PhD or any grandiose crap. You just need to be able to talk to people, or observe them in numbers, pick the signal from the noise, report it back to your product manager, and iterate, iterate, iterate.
01:13:24
Speaker
That's literally the way to do things. It's not difficult, and everyone can do it. And if you can get AI to help you go faster, great. It's fantastic. Anything to promote, or anything like that?
01:13:37
Speaker
If you like this sort of stuff, I'm a noisy guy on LinkedIn. I've been noisy for a long time; I'm probably notorious. In which case, please do follow me on LinkedIn. I've got about 30-odd thousand followers.
01:13:49
Speaker
I'm not in this for the fame, and you'll get that very quickly from my feed. I'm not a people pleaser. I'm here to educate and share my opinion of the world. But it tends to align with lots of other people's, and I think they feel that I'm speaking for them. So that's a good thing, and I feel good about it. So follow me and join the party.
01:14:09
Speaker
Awesome. That's fantastic.

Closing Remarks and Appreciation

01:14:11
Speaker
Well, this has been great. As I said earlier, I learn so much, especially when I invite folks that aren't in the software engineering, nuts-and-bolts world, right? I learn a lot.
01:14:24
Speaker
You've edified my brain. I appreciate that. This has been fantastic. And I've already started following you on LinkedIn, so I'm really excited to tune in to some of the content that you're pushing out there, for sure.
01:14:38
Speaker
Thank you so much for coming on the podcast today. I really appreciate it. Well, thank you, both to you, James, and to Ryan, because Ryan and I have a lot of interaction on LinkedIn.
01:14:50
Speaker
Ryan and I are very much cut from the same cloth. You know, we're user-centric advocates, and there aren't enough of us in the world. So follow Ryan, follow me, and let's make the world a more user-centric place. It will mean that we have greater job security, fewer mental health issues, all good stuff.
01:15:09
Speaker
Yeah, that's awesome. If you'd like to get in touch with us, drop us a line at theforwardslash at Caliberty.com. See you next time. The Forward Slash podcast is created by Caliberty. Our director is Dylan Quartz, producer Ryan Wilson, with editing by John Corey and Jeremy Brown.
01:15:26
Speaker
Marketing support comes from Taylor Blessing. I'm your host, James Carmen, and thank you for listening.