
From Stories to Strategy: Research that Moves the Needle

S1 E1 · The Qual Point of View
45 plays · 6 months ago

Welcome to the first episode of The Qual Point of View, a limited podcast series featuring expert voices from the field using qualitative insights for real-time decision-making.

In this first episode, Ooloi Labs co-founder Akshay Roongta sits down with Kevin Shane and Anna Lawton from Noodle Research + Strategy to reflect on what happens when you really listen to the communities you're designing for, and how those conversations guide thoughtful products and strategies.

Explore the complexities of qualitative research: navigating survey fatigue, breaking down data silos, the role of AI in insight generation, and honoring the responsibility that comes with asking people to share their experiences.

Whether you're a researcher, strategist, or business leader, this episode invites you to slow down, listen, and consider the human side of data.

Transcript

Introduction and Background

00:00:01
Speaker
Welcome everyone, and thanks for tuning in to The Qual Point of View, a podcast series by Ooloi Labs. I'm Akshay, one of the co-founders of Ooloi Labs. My background is in design research and strategy, and over the last 16 years I've worked in the social impact space across various domains.
00:00:20
Speaker
Today, we're talking about the power of data in informing decisions, but not the kind you'll be familiar with. We're diving into the world of qualitative data. When we think of data, we often prioritize numbers.
00:00:34
Speaker
Qualitative research brings something just as crucial, the why behind the what. It captures human stories, experiences, and nuances that go far beyond numbers, helping us truly understand the issues we're trying to solve.

Qualitative Data Power and Noodle Research

00:00:51
Speaker
On today's episode, I'm joined by Anna Lawton and Kevin Shane, both partners and co-founders of Noodle Research + Strategy. Anna is a trained social anthropologist and applied ethnographer.
00:01:05
Speaker
She has extensive experience tackling challenging problems both nationally and internationally. She is adept at applying innovative research methods to uncover deep human insights, and possesses a strong background in human-centered design, user experience research, and market research, enabling her to bridge academic rigour with practical application.
00:01:29
Speaker
Kevin specializes in strategic foresight, design, and qualitative research, co-creating innovative, human-centered solutions. He unearths deep user insights, translating complex experiences into actionable strategies that improve products, services, systems, and processes.
00:01:49
Speaker
Kevin and I go way back from when we worked at Quicksand in 2012 and 2013, and I'm so glad that both of them are joining us today to share their experience and insight from years of practice across various domains.
00:02:03
Speaker
Let's jump in as I ask them a few questions about their work.
00:02:10
Speaker
Thank you so much for joining us for this. A lot has happened in the last three or four months, but I've been very glad to see that you all have been posting a lot more about your process and some of the things that you've been doing. And I was reading one of your recent blog posts about...
00:02:27
Speaker
the poster that you all set up at the National Contraceptive Meeting, especially presenting a point of view around male contraceptives. And what struck me about the poster and your blog post was that it was really setting up the power of qualitative insight and data in a very foundational way. So I think it was really nice that you broke it down like that.
00:02:53
Speaker
But I just want to hear a little bit about what that setting was like. Why did you feel the need to set up something that feels really foundational, really basic? Talk to me a little bit about that.
00:03:05
Speaker
Why is that important? And what has that experience been like, trying to bring these sorts of processes and methods into spaces which are usually about research and development and more medical health solutions, that kind of thing?

Challenges in Male Contraceptives Development

00:03:22
Speaker
Thank you very much. It's very thoughtful of you to read our blog posts, as well as to invite us to be a part of this podcast. It's a privilege to chat with you. In that setting, and in the interest of full disclosure, I worked with the Male Contraceptive Initiative prior to starting Noodle.
00:03:38
Speaker
And what I learned there, and what we carried forward with Noodle, was this understanding that you have these critically important missing therapeutics, male contraceptives, that can really make a significant societal impact, not just in empowering individuals to have reproductive autonomy, but also in addressing things as pressing as unintended pregnancies; about half of all pregnancies in the world are unintended.
00:04:03
Speaker
So this will have a huge, huge impact globally and touch on many, many different things. But what you also realize is that in this kind of product development space,
00:04:15
Speaker
it's kind of far future, right? A lot of the research and development will take years before they actually get a product to market. So to give you a quick down-and-dirty example of why that is: the clinical trial process for drug development is rigorous and robust.
00:04:34
Speaker
So let's say you have a long-acting contraceptive method that is effective for a year. Well, you have to test it for as long as you are going to claim that that drug is effective.
00:04:47
Speaker
So if it's a year long, then you have to go through the four different phases of the clinical trials. They're very expensive, and you have to keep increasing the number of participants. A lot of the developers working in this space are scientists, bench scientists who are focused on that scientific challenge: how do we meaningfully disrupt fertility in a temporary manner to empower people to prevent causing a pregnancy, if they so choose?
00:05:13
Speaker
So they're thinking solely about the science side of this. And it takes years, and depending on the metrics you look at, it can cost upwards of a billion US dollars to bring a new drug to market. So it's a massive, massive capital investment. Where we come at this challenge is thinking, okay,
00:05:32
Speaker
it takes years to develop these products, they're going to have a massive impact, but it's going to cost a considerable amount of money to create them. What we argue for, and what we try to represent through Noodle, is that in parallel you should be investing in understanding things like user preferences, pain points, fears, and concerns. And in the example of male contraception, there's a very specific challenge in that
00:05:58
Speaker
the understanding of male fertility and male reproduction is quite limited vis-à-vis what is understood about female contraception. And that's scientifically, not even behaviorally or attitude-wise. You mean scientifically, it's not very well studied? Exactly.
00:06:15
Speaker
Oh, okay. And that's just a historical lack of funding or investment in that research, basically? Yes, and one of the primary drivers of that, too, is the Hippocratic Oath that all medical doctors take, right? And that's: first, do no harm. So one of the challenges that that entire field has is that, in the case of female contraception, women who are experiencing a pregnancy face physical risk, up to and including death. So you have a higher threshold for side effects, because ultimately, if a side effect falls short of causing death, you're preventing a greater harm from occurring.
00:06:57
Speaker
Whereas in the case of a quote-unquote male, or a sperm-producing individual, there is no physical risk from a pregnancy. So that historically has led to some of the focus being on women, because they are the bearers of the children.
00:07:13
Speaker
So what we really argue for is that as you're developing these therapeutics, you need to think about the method in which they are consumed, whether it's a pill or a gel or something that's injected into the body, and really understand what people's threshold is for those different methods of taking the therapeutic, but also how it works and how long it works. Really developing that robust understanding of what people want, why they want it, and how you can meet that as you're developing these products is critically important, so that when the first methods make it to market, you're not only then starting the sensitization and the product profile work; you've actually been doing it in parallel. So as you're developing the drug, you're also doing the messaging, the education, the marketing and sales that's required to really get people excited.
00:08:06
Speaker
And then, when you think about limitations in funding: by also building up that interest and that knowledge, you can create kind of a movement behind it, and you can champion and show that there's a market, so then funding will follow the market.

Cultural Insights in Biodegradable Implants

00:08:20
Speaker
I've got a specific example, if I can just add it. Akshay, I was working at FHI 360. This is actually about female contraception in this instance, but I think it echoes what Kevin was saying a little bit.
00:08:33
Speaker
In our experience as researchers, I personally do find that I often come across developers who have a fixed idea in their head of what they think will work and what the market needs. And then you sometimes end up trying to retrofit that according to the market. And also remember, the market is very diverse.
00:08:49
Speaker
LARCs, so long-acting reversible contraceptive methods, are very popular, including the IUD, in Southeast Asia. They are not in Sub-Saharan Africa. There are many cultural nuances that need to be understood when designing, developing, and, as Kevin noted, marketing. So a few years back, when I was still at FHI 360, and even after we started Noodle, I remained on this particular project, we were looking at the acceptability of biodegradable implants in Sub-Saharan Africa. In this case, it was Kenya
00:09:19
Speaker
and Senegal. And there are two different products being developed. One is quite similar to the existing implant, which, to be clear, needs to be removed.
00:09:32
Speaker
One is a little bit more like that, in its shape and the quality of the material. The other is a far more porous, kind of brittle material that biodegrades, with different pellets that are sort of lined up under the skin.
00:09:44
Speaker
So, different kinds of products. And we went to groups of women, and we talked to service providers, some doctors as well. But again, they were creating these products because they felt, in their experience, that removal is a huge barrier to use.
00:09:59
Speaker
Women find it expensive. It's difficult to remember when they need to go back to get their implant removed. And doctors often find it difficult to remove. So you have this whole 360 view of challenges arising in the provision of this particular product. So product developers said, well, if it biodegrades, then it doesn't need to be removed. Problem solved.
00:10:19
Speaker
And what we found when we went to Senegal and Kenya is that the fact that this product biodegrades caused many concerns for many women, for many reasons. What may seem logical and a problem-solver to one particular product developer, or in one particular market, may not work for others. And so we did find a lot of hesitancy around this product, both from service providers worried that they weren't going to insert it correctly, and from women worried about its impacts on their bodies: what is that material that's dissolving in my body? What might it do to my hormonal system and my fertility long term? So
00:10:54
Speaker
it was a really interesting product. And these products are at quite a far stage at this point. So it's also interesting, as Kevin said; our advocacy is always about trying to bring in that user research far earlier in the process, because a lot of money goes into this. And I'm sure there are going to be products that maybe don't go to market because they haven't really been tested in the market. So that was a huge part of this poster that we presented. And it was certainly, I think, a little bit isolated, in that most of the posters at this particular conference were more scientific; they were on products or case studies, et cetera. But we're trying to remind people how important this piece is to product development: understanding user preferences, acceptability, as Kevin says, and, as we talked about, this idea of adoption.
00:11:36
Speaker
People are more likely to adopt something that they have co-designed with product developers. But if you do that in a vacuum, you risk a lack of adoption and a lot of money being lost.

Value of Qualitative Research

00:11:47
Speaker
Speaking of adoption, what was the response?
00:11:50
Speaker
I'm always curious about that, right? Putting something up like that, what did you feel the response was? What was your expectation, and what actually happened? I think the one challenge you always have is convincing people why qualitative versus quantitative, right? And having a poster like that, where you can really quickly demystify or elucidate the process, showing this is something that you do and this is how it matters or why it's important,
00:12:19
Speaker
always helps. But also, the interactivity of a poster presentation is super helpful, because you can do that one-on-one and address all of the issues. And one of the examples that we actually used, which helped not just justify but really validate the approach, is that there was an international market research study that looked at all these different countries, and it was all quantitative.
00:12:40
Speaker
And what it found was that in Bangladesh, people really wanted this injection, this vas-occlusive gel injection. And that was an outlier from other countries. In other countries, the preference is more for a daily pill or an on-demand type of pill.
00:12:55
Speaker
And so you get the what from the quantitative, but you don't really get the why. So what we then did was follow up with folks who are representative of the people surveyed in Bangladesh. What you discover is that in that ecosystem in Bangladesh, the quality of pills is very low.
00:13:13
Speaker
So there's a lot of distrust for that specific form of therapeutic. Being able to show a specific example of how, if you had not done the qualitative piece, you would have missed out on an incredible insight,
00:13:27
Speaker
and would have just treated that as an outlier in the data, because you're looking at global data and said, yeah, that's okay, everybody else is conforming to this, that's an outlier. And that's a whole country.
00:13:39
Speaker
A total anomaly. Yes. Yeah. So that's the thing. I mean, above and beyond, when we say qualitative research, I think we should expand upon that. At Noodle, we're thinking of things not just as in-depth interviews and focus groups, but also ethnography, anthropology. You know, how do you get, as I was saying, that 360 view, that holistic understanding of the context, right,
00:14:02
Speaker
that you're trying to work in? And that's where you get this kind of grounded theory. You don't go into it with a hypothesis; you go into it seeking a depth of understanding and seeing what drives these behaviors. And something that may not even be capturable in a quant survey could really be teased out by spending time in a community and literally walking in a user's shoes.
00:14:25
Speaker
I mean, going back to the poster as well, I think it helps when you start bringing in things like journey mapping and persona development, not just for that strategic ideation across a journey map, but also for understanding the touchpoints that drive what people do, and when, and why.
00:14:41
Speaker
But then, when you start layering in things like personas, you have the opportunity for people to really do some interesting product ideation and wear different hats. It's like forced empathy or something. You have to try and think: okay, if I was this person, in this context, with these realities, how would I actually behave? And so I think that's where you start.

Client Engagement and Insight Application

00:15:00
Speaker
And that's already such a shift, right? I think Anna was mentioning that a lot of product developers tend to come in with a sense of: I know what's going to work, what the right format is, what the right go-to-market is, et cetera.
00:15:13
Speaker
And I think by kind of setting up all these different ways, they're almost like lenses or costumes you put on to be able to put yourself in that situation. So I'm curious about the next step in the process.
00:15:28
Speaker
You finish your research, you have the data, the insights, and you have your outcomes in these various formats. But then there's this handoff to your client, right?
00:15:38
Speaker
And then it's their responsibility to go from insight to action. And you have limited agency there. And so I'm curious to know how you navigate that. How do you ensure that action happens in a certain way?
00:15:55
Speaker
I can jump in first, Kevin, if you want, only because my mind went straight to: a lot depends on the client or the audience that you're visualizing the insights for, if they are visualized. I mentioned my work with FHI 360 earlier, and a lot of the outputs that they desired, and that funders desired, were publications, which typically are very text-heavy,
00:16:16
Speaker
light on infographics, with maybe a chart or a graph here and there, or an illustration of a particular product if it's a contraceptive method sort of study. But there's strong legitimacy in that particular piece, because it's peer-reviewed.
00:16:29
Speaker
It has been thoroughly vetted. It typically follows a very systematic structure in terms of its development and the analysis of the data as well. I think I may have mentioned to you in previous conversations, Akshay, that I was brought into FHI to help, in some ways, temper the rigorous, robust, lengthy, and expensive nature of the qualitative research that they conducted, and to inject a bit more human-centered design and rapid ethnography: a more innovative process that's a little faster, a little more creative, and a little less, again, robust and rigorous. So in some cases our outputs weren't necessarily a publication, but in most cases, 99%, that is one of the preferred outputs because, as I said, it has such legitimacy and power in that particular community. But,
00:17:13
Speaker
as Kevin can attest, there are clients that might be in the private sector, or even in the nonprofit sector, that need the insights in order to disseminate them amongst their stakeholders. With them, we tend to find we are working a lot more with visualized data: personas, journey maps, infographics that encapsulate a lot of the rich qualitative data that we've collected and the insights that we've developed, in ways that are visually powerful and succinct, unlike a publication in most instances. So it's not always very thorough. It's often quite distilled
00:17:44
Speaker
and bulleted: sort of takeaways, insights, or an interpretation of the most salient points. But I'll let Kevin piggyback on that, if you have thoughts on different types of outputs that we are asked to produce.
00:17:56
Speaker
Yeah, absolutely. And I think one thing that's really an inherent challenge we always have, and what you just said too, Akshay, is a fairly common thing: clients may come and say, we have a specific idea in mind, or we want you to test this product
00:18:13
Speaker
that we've got a prototype of; or, we think this service innovation is going to really move the needle with our customers, so get their feedback and provide that to us. And what we try to challenge, and what we push back on, is to say: all right, first of all, let's pause. Before we talk to anyone outside of your organization, we need to speak to all of the different representatives, whether it's marketing, sales, the C-suite, customer service, or the engineers who are developing the products.
00:18:42
Speaker
And really sit down with them and say: okay, what do you need out of this? What information do you need out of this? And then also, to Anna's point about the presentations we give: we always have these detailed findings reports, which are 50, 60 pages long, distilled into something like a 10-slide presentation. And what we always include at the end of these presentations are some ideas, based on our understanding of those internal stakeholder needs, on how this research can be operationalized by each of those pillars, as well as collectively.
00:19:14
Speaker
Sometimes that's seen as some sort of sales and marketing thing on our part, like, oh, we're always kind of going after new business. So what we've been really mindful of in the last few months has been to preempt those concerns by starting an engagement by saying: we can do one-off research for you.
00:19:34
Speaker
Yeah. However, the greatest benefit is to look at this as developing a continuum of understanding of the stakeholders you're most interested in. So again, going back to grounded theory: let's go and understand what they have to say. Let's unpack that with you.
00:19:51
Speaker
And then let's iterate on that. If you think about the five-step process of design thinking, sometimes we get stuck after just the empathizing, right? We've done the research, here you go, and it goes into a black box that we don't have a window into, and we hope that the client operationalizes it.
00:20:10
Speaker
But what we really try to push, more and more, is: let's just go through iteration. Let's start with a small catchment of people, maybe 10 to 15 folks, hear what they have to say, and let's expand upon that and potentially go back to folks and say, hey, we heard you.
00:20:25
Speaker
This is what we think you said. Here's what we think could be a solution from this. And to Anna's point, if we can do that with the users or the stakeholders we interview in a workshop environment, then that's golden.
00:20:37
Speaker
So then they really have an active voice. And there is power solely in going back to someone and saying, hey, we heard you. And having some of that: wow, my God, we get these surveys and these interviews all the time, and we don't know what happens with them.
00:20:53
Speaker
You actually came back and are showing that we were heard. But the hazard of that, and this is what we always warn and caution our clients about, is: once you show that you care about what they have to say, and they've shared, they've been vulnerable with you, you'd better do something with that.
00:21:08
Speaker
Because otherwise it's a slap in the face, and you're just paying lip service to the challenges. In a corporate environment, that may be a frustration about a product feature failure or a service innovation.
00:21:21
Speaker
But in the development sector, that can mean lives, right? We're not listening to you, we're not incorporating that, and so we're not going to roll out interventions that are rooted in your lived experience. And that's everyone's fault and failure, right?
00:21:35
Speaker
Yeah. I was just going to add, because I feel like part of the thrust of this conversation, and of conversations we've had in the past, is: what is the value of qualitative research?

Breaking Organizational Silos

00:21:44
Speaker
And it's different for, again, different audiences and clients. And I think we talked earlier about how we often find product developers who don't necessarily think they need it, or maybe they bring it in a little bit too late.
00:21:55
Speaker
And this is for contraceptive product development, but in product development in any sector, or voice-of-customer work, we're still getting people coming to research with, in some ways, a fixed idea of what they think the solution is and what they think the scope of the particular question is.
00:22:09
Speaker
And the other thing you find, which I think is probably par for the course across different organizations, be it an INGO or a global corporation, is these siloed roles and divisions and departments, et cetera. And they typically are the ones funding research. And so their scope for that particular challenge is limited to their very siloed role. And what we find in 99% of those challenges is that it's so much broader than that. As Kevin and I were saying earlier, there's this need to pull out and see all the various actors and stakeholders impacted by the challenge in question.
00:22:41
Speaker
If you're doing voice-of-customer research, well, of course you need to understand what sales and marketing are experiencing when they're working with a customer or fielding frustrations, or account managers, et cetera. And you often find a bit of a myopic, siloed view of that, and a limited view as well: that if we just pinpoint this little piece and do a little bit of research on it, that'll be enough. And we often find ourselves advocating for a broader view, not because we're necessarily trying to upsell everything, but because we really see the value in that. If you're just going to go quick and dirty and go shallow, what's the point?
00:23:17
Speaker
You may as well try to get buy-in across all the various departments and divisions and stakeholders that are part of this larger question. I mean, integrated development and research would be ideal, but it's hard, I'll say in my experience, to find that kind of enlightened client, customer, audience, et cetera. And it's expensive. But looking at it a little more holistically, and understanding all the various actors that could benefit from co-designing a solution, is part of the work we do. It's educating folks on the value of qualitative research to see the bigger cultural picture at play.
00:23:52
Speaker
I think this idea of silos seems to be a theme, whether you're talking about development sector work or the private sector, and it seems to span regions as well.

Facilitation and Consulting in Research

00:24:07
Speaker
It seems to be a function of just how we manage, how we organize things, from this sort of industrial way of doing things: you break it down, and everybody's building widgets and that kind of thing.
00:24:20
Speaker
Henry Ford and all of that, right? Mm-hmm. But how do you then overcome that? I'm especially curious how it compares between the development sector and the private sector, and even within the development sector, between large funding bodies or larger research organizations versus smaller nonprofits working on a particular issue or programmatic
00:24:45
Speaker
kind of work. How does that differ in terms of their appetite to try some of this? And where does your role then stop being qualitative research and start going into all these other sorts of roles: facilitation, organizational development, change management?
00:25:05
Speaker
What have you? Consulting. I mean, more broadly, consulting. Yeah, yeah, yeah. It's funny. When we first launched, we just referred to Noodle Research, right? And now we've added Noodle Research + Strategy, to kind of call out the fact that it is important that we're not just data collectors and data synthesizers, but we're also stewards of the insights that we generate.
00:25:30
Speaker
No, that's beautiful. I like that phrase. Yeah, thank you. I'll just put that on my visiting card. But, you know, in a best-case scenario, what we strive for is to launch any engagement with a client with a facilitated workshop experience, with as many internal stakeholders as possible, to unpack, almost establishing a baseline: where are you at?
00:25:51
Speaker
Where do you want to be? And how does this research facilitate achieving those milestones, getting you to that ideal state? But Kevin, just a point there, right? Like, "where are you at?" I feel like what y'all have also been saying is that you're sort of poking there and saying, where are you really at?
00:26:10
Speaker
But in my experience, that forces a certain level of discomfort. And then how do you navigate that? Because people aren't used to being asked questions below the surface of the RFP, or whatever scope of work they've put out. And sometimes people will say, yes, we want this.
00:26:29
Speaker
But then when they're in it, it's a different experience. Sort of like how I feel on a rollercoaster. How do you navigate that? A lot of times, and I hate to be so reductionist, it comes down to thinking about cost and value. When we propose research to clients, it's always externally facing: hey, to do 10 to 15 IDIs and unpack this, it's going to cost you X. However, we're also going to do these internal stakeholder conversations for free.
00:27:00
Speaker
So when there are no costs associated with that, there's a greater appetite, there's less friction in entering into it. And then all of a sudden they realize, oh God, now you're asking me very uncomfortable questions. But we've shown in the past with some of our clients that it actually reframes the research mandate sometimes. Because again, you're helping break down hypotheses before you're even engaging with external stakeholders. So hey, you may think that there's a product feature failure out there, but it may actually be a customer service issue, or a sales and marketing challenge. Yeah.
00:27:31
Speaker
If you improve the packaging, you might make the product more appealing. So it helps with that. It's like, hey, we didn't have to go through three months of research to identify this. We've now helped address it already, because you hadn't had those internal conversations yet.
00:27:46
Speaker
You're seeing more and more of these kinds of centers of excellence that help champion this stuff internally. But we also wear that hat sometimes. Go ahead. No, I was just going to say, we often speak in terms of research literacy, assessing a client, a partner, a collaborator in terms of their research literacy, meaning: how much do they see research as being important, and, again, a little more holistic and pervasive? And it depends on the client, but I think something we have noted is that sometimes it takes time, right? It's a relationship you build through repeat work. And that's when
00:28:17
Speaker
you can start to work on their sense of the value of what you're doing, and maybe pull out a little bit and include other things, and hopefully, in the end, save them money by developing a more comprehensive research approach, as Kevin said. So one of our repeat clients has a center of excellence, and there are tons of silos and tons of departments and tons of products and divisions, et cetera.
00:28:38
Speaker
And they have all these very siloed research projects, and their mandate is to somewhat aggregate that data and use it more broadly across the organization and create more of a global or pan-organization strategy. And then they also tend to take on the internal piece. So their interpretation is that they're better off talking to their employees than an independent researcher would be, although I think we would certainly argue differently.
00:29:02
Speaker
But it's that piece, too. It's educating your clients and trying to convince them of the value of research, broader and deeper research. But it's not easy. It really depends. And having a center of excellence is already an indication that they do value a little bit of that kind of integrated approach, aggregating data and doing research. We've experienced it with other clients as well: when they have a consumer insights department, you see a little bit of positive hope that there may be more value associated with research. But it can be very difficult, and qualitative in particular, because it takes longer.
00:29:35
Speaker
It typically is more expensive, and it's a little bit more squishy. It's quotes and it's insights that we've developed; it's a little more esoteric, the process by which we come to those findings. I think with quantitative data, they love it because it's statistically proven, there are hard and fast numbers, and they like their little bar charts. But as Kevin was noting, what we struggle with so much is, but why? You might say X, Y, Z consumers or end users want this, and there's never that why piece, which is maddening. So I feel like a blended mixed-methods study is always a great idea, depending on the particular challenge at hand.
00:30:10
Speaker
It's not always required or necessary, but yeah, to answer your question, part of it is just developing that relationship, and time. Yeah. I just wanted to jump in here and say, I also feel compelled to share that when we first started having conversations about what you guys were doing with the Open Knowledge Framework, allowing access for more participants or players, our internal stakeholders, to unpack data: what we often see with one of our big repeat clients is that there's research exhaustion. We'll talk to participants and say, God, we just spoke to someone just like you who shared these same exact sentiments. What's being done with that? And it's expensive to do these constantly, the redundancy of research. So when you think again, like,
00:30:54
Speaker
Using the development sector as an example, someone may go into a project focused on, you know, maternal health. And they're focused more on healthcare

Redundant Research and Insight Sharing

00:31:05
Speaker
provisioning. But they're not talking to the nutrition team. They're not talking to the water and sanitation team. They're not sharing the information they've gleaned with those other teams. So the serendipitous moments of innovation are lost, because you have a singular perspective and you're only thinking of it from that perspective, when someone who's wearing a different hat may look at that same data and go, oh my gosh, that's an incredibly important insight for the work I'm doing.
00:31:30
Speaker
So again, that's where you start showing value. And the population is the same. Totally. Yeah, exactly the same. And in some cases, people are going and meeting literally the exact same people, right? Literally, yeah. And that's also an experience.
00:31:45
Speaker
Like, there was this other group that came and met me a month ago and asked me similar questions. What happened with that? There are silos across organizations in the same domain, across domains in the same organization, all of that.
00:32:01
Speaker
Yeah, yeah. Well, I think about when you and I, years ago, worked on that project, Zaman, the big sanitation project. When I was in Bhubaneswar for all those years, I would actually get asked by some of the funding organizations to help researchers from other countries navigate the slums in Bhubaneswar and help sensitize them to the area.
00:32:21
Speaker
And you have to empathize with research participants, or these populations that are the focus of these various research initiatives, because it turns into a human zoo.
00:32:33
Speaker
You're in people's communities, you're in their neighborhoods, you're in their homes often. And you're expecting them to bare their souls to you and be extremely vulnerable and share this information. You better respect that and have a deep understanding of how intrusive that can be, and how challenging it is to be that vulnerable with a complete stranger who's going to swan in, when you know nothing about them.
00:32:54
Speaker
They bare their souls to them, and then a few weeks later, you look around and, well, what happened to that guy? What's going on? And so I think the respect for the people who are giving you these incredible insights has to be at the forefront of your mind, along with thinking about how you maximize the information that you gather in any one research initiative. So again, thinking about your Open Knowledge Framework and the kind of impact multiplier it represents across so many different stakeholders. And again, we give multiple presentations of our research findings. Mm-hmm.
00:33:30
Speaker
And half the time people are like, ah, sorry, I only have 10 minutes, I've got to go. And so we'll record it and share it. Maybe they'll watch it. But you also have people who are really into it, who really want to dig into the data. So the more that we can create these resources that allow for asynchronous exploration of data and research, and kind of serendipitous cross-departmental collaboration, that's where you're going to see real impact and real innovation.

AI's Influence on Research

00:33:57
Speaker
That's where the magic is, right?
00:33:59
Speaker
I'm curious about this, you know, the last two and a half years of people engaging with the explosion of these AI tools, right?
00:34:10
Speaker
After ChatGPT went public, I think it's changed the way people engage with large bodies of text. I mean, there's a lot to worry about on that front, and a lot of things that aren't going in the right direction. But do you see a shift in terms of how people approach
00:34:31
Speaker
the raw data that you might share, or even somewhat processed data like you're talking about, Kevin? Even for the larger group of people in the middle of the bell curve, as it were, do you feel like there's a shift in the appetite to engage with that richness, even if it is by summarizing it and things like that?
00:34:49
Speaker
What are your takes on that? Because we are experimenting with integrating AI in very specific ways, so as not to flatten, but to engage, if I can put it that way. So, just your thoughts on that.
00:35:06
Speaker
I was thinking, Kevin, we recently completed a project, and we found that one of our clients, our point-of-contact client, the main client effectively, was in some ways doing concurrent or simultaneous analysis. I think in some instances, our insights and findings weren't coming quite fast enough for some of the top lines that they needed to share internally.
00:35:27
Speaker
So we knew that in some instances, they were effectively taking the transcripts that we always dutifully shared, day-of, as soon as possible. We uploaded those to a shared drive, and they were putting them into ChatGPT, I think it was actually ChatGPT in this instance, and doing their own rough, quick-and-dirty analysis.
00:35:45
Speaker
And no worries about data security and all of that, or? No. The beauty is that with our transcripts, we tend to anonymize them and de-identify them. So in this instance, there wasn't, I don't think, too much concern about that. But it also didn't seem like there was too much concern about the quality of the insights being generated, or about going back and double-checking. I'm fairly certain, having actually conducted the interviews, and that's part of the beauty of us staying so small, we become so immersed in the data,
00:36:12
Speaker
myself maybe to a fault, that I can immediately see, oh, that person didn't say that, or no, that's not a correct through line or takeaway. So we are sometimes in a better position than someone who is just taking the data and quickly assimilating it. But that's an important point, right?
00:36:29
Speaker
Because, I mean, heck, I do this with some stuff, like where it's too long to read: let me throw it in, and I'll take what comes back as truth. But I think what I'm also hearing is that there is something lost when you're just working with transcripts.
00:36:43
Speaker
And that is the human eye, having that context. Especially using a standardized tool like ChatGPT, which is doing everything from diagnosing your cough to writing essays for grad students to what have you, right?
00:37:01
Speaker
Writing novelty rap lyrics, right? How do you then navigate that? Does the other person then have the confidence to say, no, the ChatGPT insights are right? And how are you navigating that now?
00:37:13
Speaker
Yeah, it's a challenge. I think that's something all research organizations are grappling with right now. And, expanding beyond research organizations, everybody is grappling with what the most appropriate use case is for each of these tools. There's an anecdote about some lawyer who used ChatGPT in a case, and he's citing cases that don't even exist, because he doesn't understand how this thing works.
00:37:37
Speaker
I mean, honestly, from day one with us, we are overly transparent. And we even had a client say that we're over-communicative. When we're in projects, we do daily top lines. So if we're doing interviews, we're sharing the takeaways that we generate.
00:37:54
Speaker
And then we're also sharing the recordings and the transcripts. And then we have a weekly update of all of the learnings from that week. And then we typically have weekly meetings. And I always say that before we get to that final presentation, you will know everything that we've learned, because we're going to share it as we go.
00:38:11
Speaker
It's like those movies where the trailers have the entire plot. Exactly, exactly. I don't need to see the movie now; I know what happens. Yeah. Hey, we're a company. We're putting our expertise on the line and saying that we can do this work for you.
00:38:27
Speaker
We welcome the challenge of people wanting to unpack it. We love it when people go through the data and we have these conversations collaboratively, you know, they can question our insight generation, our suppositions or whatever, and that's a nice dialogue. And at the end of the day, the most important thing is to get the information correct.
00:38:49
Speaker
But what can be challenging, to your point, is that sometimes someone may throw things into an AI tool and then start questioning, well, AI says this. And so it's like, well, you need to immerse yourself in the data to the same degree. Maybe not as much as Anna, because I don't get to that degree either; she becomes a real subject matter expert.
00:39:09
Speaker
And if you meet us with that same level of rigor, then that's a welcome, collaborative thing. If it's some sort of monitoring and evaluation tool over the quality of what we produced, I think it's fraught, because your understanding is superficial vis-a-vis what that AI tool is generating for you.
00:39:31
Speaker
So it's hard for you to substantiate its findings when you have not immersed yourself to the same degree as we have. So I don't know. I think it's a brave new world, right? I will say, in the interest of full disclosure, for a small organization like us,
00:39:46
Speaker
AI has been quite revolutionary in helping, whether it's being that sounding board, or producing marketing collateral or blog posts or things like that. Or summaries.
00:39:58
Speaker
Yeah, summaries. It's a good kind of reference point. So it does help small organizations be more nimble. I just think that its application needs to be understood. I think where we should get to is that organizations should have some sort of an AI policy:
00:40:15
Speaker
When do we use this? Why, for what reason? What is it helping us avoid or achieve? And which tools? Yes, exactly right. The thought I had when you used that law example: I think ChatGPT as a general-purpose tool is fantastic, and like you said, for small teams like ours as well, it's fantastic for getting that first draft, so that you're not looking at a blank page. It gets you the first draft.
00:40:42
Speaker
Sometimes you just change everything, but at least you're not staring at a blank page. It gets you off the blocks. But I think in the law example you took, using ChatGPT for that is flawed. Then you have a whole bunch of companies now in the legal tech space, Harvey is a big one in the U.S., which are use-case specific, and they know how to build the guardrails, they know exactly who's using it for what. So it becomes important also to ask: are you creating access to the right sort of tools? Yeah, yeah.
00:41:17
Speaker
It reminded me a little bit of that critical lens, or the critical mind, that one needs when looking at social media content: to really think, is this true? Is this real? Where did this come from? And I don't know how much that happens with, well, even researchers, but certainly non-researchers, who may take it at face value and assume that whatever has been generated through the AI tool is correct. And I know one of the other things I find it very helpful for is finding quotes. As Kevin said, I'm often like, I remember this person said something about this, and I want it to be illustrative of a particular insight or finding.
00:41:50
Speaker
So I seek it out. And there have been many times, depending on the tool I'm using, where it fabricates a quote, and then I go in to find it in the transcript and I'm like, they never said this. And then I'll even challenge the tool on it, and it's like, oh, I'm sorry.
00:42:04
Speaker
That was my mistake. And again, I just worry that that level of oversight, and double-checking and triple-checking, doesn't happen in a lot of circumstances. And I know we're not all curing cancer with every research project, but it's important to get it right.
00:42:16
Speaker
Yeah. As a researcher, I feel like there's integrity in this and it needs to be vetted. Yeah. Currently, the tools are not sophisticated enough to be able to do things without, I would argue, that lens of checking and vetting.
00:42:28
Speaker
Let's change tracks a bit now. I'm really curious about these larger organizations, which have usually relied on very traditional methods and processes of research. How has your experience been around bringing in human-centered design tools and methods, and what has really worked in getting those accepted?
00:42:57
Speaker
I'd love it if you can unpack that a bit. Mm-hmm.

Human-Centered Design in Global Contexts

00:43:00
Speaker
Well, as I mentioned, among funders, some of the main ones in the global development space, USAID and the Bill and Melinda Gates Foundation, et cetera, there was this growing appetite for human-centered design, which was growing in usage in the tech sector and certainly in digital spheres. So there was this idea of bringing in these innovative, more creative methods as well. So, I mean, it's not just...
00:43:22
Speaker
It's about eliciting potentially more unconscious, different kinds of information. It's less surface-level. One could argue it goes deeper into latent thoughts and perceptions and demands. So we were both kind of selling it as quicker,
00:43:37
Speaker
cheaper, more creative, a little bit deeper, a little bit more effective in, again, really kind of getting at the various questions at hand, whatever they may be. But there was a lot of friction. There remains a lot of friction.
00:43:48
Speaker
It wasn't desirable or appreciated or considered a valid form of research by many people. I think it still isn't, primarily because there's a decreased level of documentation. There's less of a group process involved. So again, we talked about intercoder reliability, which is a big thing when you're coding transcripts and using in vivo coding software. I'd be interested, actually, for you to tell us about the various coding tools that you've integrated into your software. But it's very laborious, and it tends to be teams of several people who then come together to double-check that you all interpreted every line of every transcript the same way. And I mean, it's very time-consuming and it's very thorough, and that's wonderful, but I don't think every research challenge or question needs that level of thoroughness, of being tracked and detail-oriented to that level of minutiae.
00:44:35
Speaker
And the other argument, one of my arguments, was to have embedded teams. As an ethnographer, I always like to be in the field as part of a team collecting the data and doing the research, so I was always advocating for that. What we often found in global development, prior to working with human-centered designers or design researchers, is that you would typically train people working in service provision, nurses, what have you, frontline workers who have no qualitative research background. You do a quick training on how to ask open-ended questions and give them 40 questions to ask, like a survey, and it's not open-ended. There are no follow-ups, there's no probing. So again, the quality of it was frustrating.
00:45:15
Speaker
There's no following of the thread. No, there isn't. Because you have to understand why the data is being collected. Exactly. And you're just, well, on to the next question. And half the time you've probably already asked it, because they answered it ahead of time, but the particular researcher isn't skilled or experienced enough to know when to pivot. So those were a few of the arguments that we were bringing to this more rapid, creative,
00:45:37
Speaker
user-centered approach that's iterative as well. And one thing I'll just mention briefly, because it is a big challenge with global development and the kind of research that typically is conducted is the oversight of, say, the review boards, the kind of ethical review boards.
00:45:50
Speaker
That's very necessary. It's very important, but it's also very restrictive. So things that are approved, that have gone through that long process of approval, are considered set in stone. And if anything changes or pivots, one needs to go back into that process, which adds time. The thing about human-centered design, as we all know, is that it's quick and nimble and iterative. So there was a lot of tension with the process, the means of collecting the data, a lot of things. But I do think we came quite a ways towards explaining its value, or showing its value, looking at market research and the heretofore more typically private-sector applications of it.
00:46:24
Speaker
And I think they're still doing it today. Right now, you see a lot more human-centered design projects in global development. And they seem to have found a nice way of measuring as well. So I feel like there have been shifts on both sides. They've both found a middle ground that is generally satisfying for people.
00:46:40
Speaker
I don't know if that answers your question, actually, but it's been a process. Certainly, certainly. And I guess it's an ongoing one, as organizations also mature: as they do things, some things work, some things fail, champions move on.
00:46:56
Speaker
That's a big part of it as well. And I think one of the trends that we've spoken about before, but I'd love to hear your thoughts on now, is that there can be a neat binary: either somebody comes into a country and does the research,
00:47:13
Speaker
or you're training nurses. But the way I've seen some development organizations and research organizations navigate this is to partner with trained qualitative researchers, because now, even in so many countries in the Global South, there are trained designers and researchers and studios with equally rigorous processes that have come up in the last 20-25 years. And I think we're starting to see that. 100%.
00:47:40
Speaker
What's your view on that? What's working? Where does it break down? Just curious to hear your thoughts on that. Oh, that's a great question. I was just going to say, actually, many of the last projects I worked on, particularly during and post-pandemic, naturally shifted in this way. As my time at FHI continued, we were noticing more human-centered design agencies cropping up in the countries and regions we were working in. So we wanted to leverage them, and we would do the same sort of procurement process where you're vetting research vendors,
00:48:12
Speaker
and found ourselves working more closely and directly with these human-centered designers located in those areas, which is obviously ideal, because they're culturally sensitized to the particular context, but they're also skilled researchers, not just in traditional qualitative research, but in these more innovative and creative methods we're talking about. So we did work with them in many instances, and had to.
00:48:34
Speaker
I still would have loved to have been on the ground with them, because I value so much my own kind of visceral, bodily data collection, being immersed in the context and having those conversations with my co-researchers. So I still value in-person research and first-hand field research when I'm able to access it, but it was a game changer for us in many ways. And what I loved about it as well is that many of them did come, in some instances, from a more design background.
00:49:00
Speaker
And again, just as I had to find the middle ground and was building capacity within FHI 360, we were in some ways building capacity within these organizations to work with INGOs. So it was kind of a crossover, obviously, but we were both helping each other understand this new domain that was being forged between the two worlds. So it was great. I think sometimes there was discomfort there, because we were asking things of them that they didn't usually provide as a design agency. There was a level of transparency and a desire for things in a manner that maybe they weren't used to. So there was friction there, as maybe you'll have with any kind of agency or vendor. But I did find that there was more engagement than they were used to. I think typically, if they're working with clients, it might be a little bit more hands-off. But
00:49:43
Speaker
in the development sector, and certainly with INGOs like FHI 360, we really sought to be partners and have as much access and transparency as possible. And it was great: the language and the cultural skills and context, and all the various knowledge that's critical to have in those kinds of circumstances, is there in those researchers. So it's the ideal.
00:50:03
Speaker
I think, though, that there was a huge pendulum swing, right, from this whole decolonization of research, shifting away from this model of people from the Global North, or wherever, going into contexts and doing the research themselves and unpacking these things,
00:50:19
Speaker
then swinging to the other extreme of, no, it's all local research, all local agencies doing that research and feeding it back to whomever the funder is. And I really do think that we need to have a hybrid approach to this, right? And I think that's where it's really powerful to have people who understand the local context and the local culture and rituals, and ways of navigating conversations and certain sensitivities, but also to have that neophyte presence, right, of someone who can step back and challenge presumptions, challenge
00:50:52
Speaker
just these ingrained notions of how things are done. And I think it's a huge thing, especially when the funding agency is coming from a different country and operating within another context. It's just that working style.
00:51:06
Speaker
What is the expectation? What is the level of reportage that's needed or the level of engagement back with the funder? So, you know, we often say this, that we strive to be both a teacher and a student always. Okay, say more.
00:51:20
Speaker
So, being someone who could come in and possibly provide that training. In the example there, you're training someone who has not done research before, who has not worked in a human-centered design capacity before. Sure. So you're educating them, but they're also educating you on how best to interact with the community. And I always think about the first few times going into the field doing research in India, where we have this time constraint.
00:51:50
Speaker
We're going to meet with so many people. We're going to do some rapid ethnography, but we only have about eight hours in this community. Certainly in the case of refugee camps in Africa as well, it's a very, very tight window to do a lot of things.
00:52:03
Speaker
And yet, when you first go in, it's like, nope, we're going to go to this community leader's house. We're going to have a chai, and they're going to talk... And I remember having these moments of anxiety, looking at my watch, going, oh my God, we're losing a lot of time.
00:52:16
Speaker
But we're not, because that activity opened up a lot of doors and got you that necessary buy-in. If you don't understand the local context and culture, then you would have potentially offended people, and then your research is out the window, because you violated unwritten rules that you weren't even aware of. So I think that kind of teacher-student hybrid dynamic is important. And it's also critically important as a researcher to meet research participants' vulnerability with your own.
00:52:45
Speaker
Be able to say, I don't know the answers. I need you to help me better understand this, because I'm here to try and help address a challenge that's impacting you directly. And because you're more impacted by this, who cares what my concerns are?
00:53:02
Speaker
Let's focus on what you need to share with me. And I think that's where it's very powerful. And, I don't know, it's scary too, though. This is an anecdote that may not fit into this

Successes and Challenges in Diverse Projects

00:53:11
Speaker
conversation, but it just stands out, because when I moved back to the States,
00:53:15
Speaker
I was invited to speak on a panel at a university, to a bunch of undergrad students who were learning about human-centered design, qualitative research, and an iterative approach to design.
00:53:25
Speaker
And I was on a panel with three other HCD practitioners. At one point, we were all talking about our work and taking questions from the students. And of course, at the end of the session, a student raised his hand and said, okay, I think I get it.
00:53:40
Speaker
Let's say I'm working in a context and I know that the solution is an app for a smartphone, but this community I'm working in, they've never even seen a smartphone. So what you're saying is that I should first train them on how to use a smartphone.
00:54:00
Speaker
And then I can layer in my solution, which is an app that works on the smartphone. Yeah. And the other three practitioners kind of agreed with him and said, yeah, that's the tension, and you have to understand that sometimes you have the solutions, but you need to kind of lead them there. When it came to me to comment on this, I said:
00:54:22
Speaker
What you just said is an example of why HCD exists. Because your idea is the problem: you have come up with a solution in a vacuum, and you're trying to retrofit it into a community, and if that community does not value the solution, it's their fault. I'm telling you right now, there's no way that that community, if you're practicing human-centered design, would bring up an app if they don't know what a smartphone is.
00:54:48
Speaker
So that's the thing. You have to hand over the responsibility, and you have to empower them to be the solutioners. And you have to have big ears and listen to everything they say, and make sure that you get it right, and go back to them.
00:55:03
Speaker
And that's the magic of HCD, that iteration. And I just think that's part of the challenge too: not only are you, as a qualitative researcher, a steward for the insights, you're also a steward for the process.
00:55:16
Speaker
And that is incredibly true for human-centered design. You have to ensure that you are true to its ethos, its approach, and its outputs. So you have to set your biases, your preconceived notions, aside, and just be a blank slate that they can write on, and then you take it back.
00:55:35
Speaker
So I'm just trying to think back over this hour or so that we've been speaking, right? And I feel like there are a few things that are coming out as trends in what you're seeing and how you engage with organizations, whether they're in the development sector, INGOs, or the private sector, right?
00:55:57
Speaker
And I think we've talked about age-old problems: working in silos, insights not necessarily being able to drive action. But then there are these newer trends, right? The decolonization and the swinging of pendulums there. And I think the pandemic was a turning point for this sort of thing.
00:56:17
Speaker
But the use of AI and AI tools, and that augurs all kinds of good and all kinds of horrible, right? So we've got this whole mix of these sorts of things, these trends, happening.
00:56:29
Speaker
And I think my question as we close this out, and maybe you can take it one at a time, is a couple of things, right? So what advice would you give to our listeners in terms of how they navigate this as they go forward?
00:56:46
Speaker
And especially given that we're looking at a whole bunch of systemic challenges also, how do they navigate that? And it really comes down to: how many organizations are actually asking the why?
00:57:00
Speaker
Given tight timelines, a sense of urgency, and all kinds of funny things happening with funding, how many organizations are actually asking why, and how do you help them along on that journey?
00:57:12
Speaker
The two of you are out there fighting the good fight, but hopefully there's a whole bunch of other people listening in. How do we build that muscle as well? What's your advice for other practices like yours? Because I imagine there are a whole bunch out there.
00:57:27
Speaker
Yeah, that's a great question. It's a very multi-layered question, and it's kind of challenging to answer. But when I think about my own personal journey, my professional journey, which is very, very diverse, I worked in

Continuous Learning and Collaboration

00:57:38
Speaker
many different fields that were seemingly incredibly divergent, but the through line has always been, one, boundless curiosity. Always be willing to question your preconceived notions. Do not stagnate.
00:57:54
Speaker
Always seek out new opportunities to learn and evolve. And one thing that we've always benefited from greatly is collaboration. Again, that requires a sense of professional vulnerability to admit that you don't have all the answers. And I think, again, something that sells the value of human-centered design is that you may have cracked the code with an amazing service or product innovation today,
00:58:18
Speaker
but that product or service is not going to resonate a year or two from now. So if you want perpetual growth, it's perpetual learning and perpetual engagement with the people and the communities that you're seeking to serve.
00:58:32
Speaker
And your baseline understanding of those communities is not going to be permanent either. So why are you doing what you're doing? What is your solution? And who are you solutioning for?
00:58:44
Speaker
And make sure that those three prongs are always refreshing. One bit of advice that I got from our former colleague, Ayush Chauhan at QuickSand: he was critical of me because I always wanted to work only on development projects, the social impact projects. And he chastised me because, he said, you're really missing out on an opportunity to grow, because of the analogous inspiration you're going to get from working in a diversity of fields. Something you may learn in UX research, or working in financial systems, may be the catalyst for really new and divergent and innovative thinking in the social impact sector. So always seek out those opportunities to break out of your comfort zone and constantly learn.
00:59:27
Speaker
Yeah, and hopefully some of my thoughts complement what Kevin has said. But thinking back to that challenge you mentioned, the trend around navigating silos:
00:59:40
Speaker
One thing is you don't have to convince everyone. Find those champions within the organization or the agency or whatever it may be that you're working with, find the person that ideally has some pull, and work with them, work on convincing them of the value of what you're suggesting. And then another simple thought: people may assume that when an RFP comes out, or they're responding to a proposal, that you kind of regurgitate back what the client has requested,
01:00:06
Speaker
and you don't have to do that. They often say, please, if you have other ideas, add them. They may not be taken and they may fall flat, but it can be helpful to do that. I mean, it shows that you actually understood and internalized the question,
01:00:19
Speaker
and that you're being creative. Give them options: here's option one, option two, option three, and maybe option four is what they asked for. But also just advocate for what you believe: we see where you're going with the particular approach you've proposed, but we really think you should consider this, et cetera. So don't be scared to push back, or to use your own experience and expertise, elevate that, amplify it, and say, this is how we think you should approach the challenge.
01:00:44
Speaker
And then another thing, just around AI, is to talk to your peers. I've learned many tools from you, Akshay, that I don't feel we're seeing as much in domestic circles here in America among qualitative researchers. Ask your peers on your social networks, on the Slack channels of professional associations: what tools are you using and for what? What are you finding successful and useful? What should we be wary of? So do some crowdsourcing around how tools are evolving. And really, just recognize that you are a subject matter or methodological expert in this instance, and feel confident in that when you're speaking with someone who perhaps is a non-researcher, to underscore the value of what we do.
01:01:25
Speaker
It can be a hard sell, but certainly a worthwhile one. Yeah. Well, just maintaining that open-mindedness. I think a lot of times people are so terrified of failure that they will just hang on and go down with the ship of whatever hypothesis they approached a research project with.

Reframing Failure and Navigating Unknowns

01:01:44
Speaker
So just be willing to be open-minded and say, hey, I'm a smart person and this was my idea and it fell flat. Yeah. The F word is not something to be afraid of.
01:01:55
Speaker
It's something that is just reality. So reframe it not as a failure, but as a learning opportunity and a growth opportunity, because ultimately, what are you trying to achieve? Not proving yourself right. You're trying to solve a problem, and solving that problem is what matters most. Yeah.
01:02:10
Speaker
Yeah, I think you spoke to that quite nicely: it requires a certain level of vulnerability to invite collaboration. But it's also about saying, hey, I was out here to learn. That's the point of research, which means that the hypothesis or the idea or the product idea that I started out with may be proven wrong.
01:02:34
Speaker
And I think that's the big thing I'm taking away from this conversation. There are these trends we talked about, but I feel like what you both did today, in this last hour or so that we've spoken, is really peeled back
01:02:50
Speaker
the layers, even as a glimpse behind the curtain, around what it takes to run a practice like this in 2025, what that journey has been like over the last two years, and, sitting here five months into this year, what the future may hold. And I feel there are sort of equal parts trepidation, hope, and vulnerability in saying that we don't know what the future brings, but we're going to keep learning. I feel like the lessons that both of you have spoken about, you embodied them right through this conversation, and that's something I'm going to take away from it. So thank you so much for making the time for this podcast. And yeah, I'm excited to see where Noodle Research + Strategy goes in the future, and to collaborate.
01:03:44
Speaker
Yeah, yeah. Absolutely. Oh, you stole the words right out of my mouth. Thank you so much for the opportunity to share our story and to be a part of what you guys are building, because I think the more bridges we build, the more connections we make,
01:03:56
Speaker
the greater it's going to be for everybody. So shifting beyond singular objectives and needs, and thinking about collective goals and objectives, is going to benefit everybody. So thank you guys for what you're up to. And yeah, it's great chatting with you.
01:04:10
Speaker
Yeah, thank you for asking. It's been a pleasure chatting, and it's always fun to talk shop, so it's nice to do so with our peers. Thanks. And that's the episode.
01:04:23
Speaker
I want to leave you with a question: what's a decision you're making that might just change if you had the story behind the numbers? Give it some thought. Anyway, thank you for listening, and do check out the other episodes in the series.
01:04:37
Speaker
Our first season focuses on qualitative data and research and we have a fantastic lineup of guests. If you found this valuable, please subscribe and share it with a friend who might find our conversation interesting.
01:04:50
Speaker
If you'd like to learn more, you can visit our website getdots.in or write to us at hello@uloilabs.in, and follow us on LinkedIn for updates.
01:05:00
Speaker
Until next time, I'm Akshay signing off.