AI in Marketing by 2026
00:00:10
Speaker
It is 2026, and every marketing team is at least talking about AI at this point. Like, they'd have to be, I don't know, on a nonprofit higher ed marketing team to maybe not be talking about AI. I'm trying to think of the furthest-behind industry there is. But even those people are talking about AI when it impacts marketing right now.
00:00:32
Speaker
If your team isn't already playing with AI, isn't already using their own ChatGPT account, it's one of those things where every marketing leader is thinking about how to implement it.
Equipping Teams with AI: Insights from Courtney Baker
00:00:41
Speaker
So that's what we're going to be talking about today is how to actually help your team adopt AI wisely.
00:00:48
Speaker
And I'm here with Courtney Baker, and I think this is her third, maybe fourth time on the show. I don't know. You've been on here a lot of times, but it's been a while. It has been a while. Courtney, it's good to have you again. I know you are the CMO of KnownWell, which is a tech company that focuses on the service industry and helping them scale their businesses. But today, I'm looking forward to having a conversation about what it takes for marketing leaders to actually equip their teams with AI and do it well.
00:01:16
Speaker
So let's dive into the
Integrating AI into Workflows
00:01:17
Speaker
first question. It is: what are the most common ways AI adoption goes wrong with marketing teams? Because everybody's tried it. Many have failed and, say, blamed it on AI.
00:01:30
Speaker
So I'm wondering, what are the most common things that you're seeing across other marketing leaders implementing these kinds of tools out in the space? Yeah, I would say right now, I think I still see a lot of people stuck in experimentation for experimentation's sake. And I think there was a season where that was awesome. It was great. And everybody should have been experimenting. But I think we've moved past that phase, and your team is going to get tired of just experimenting. What we need to focus on now is taking problems,
00:02:06
Speaker
documenting what those problems are, and then saying, can we use AI to solve this problem? I would say that's one area. The second area where I see AI adoption fail is not bringing the AI to your work streams. For example,
00:02:25
Speaker
with people, I always ask, hey, what's your meeting cadence? What drives your marketing team forward? And then, how are you taking AI and applying it? It could be a lot of different things. It could be AI-driven intelligence for your pipelines, for example. How are you bringing those into those meetings, so you're driving the action, the cadence of your operations, with AI? Instead, it stays totally outside of the natural workflows of your
Moving Beyond AI Experimentation
00:03:00
Speaker
marketing team. And so then it's just, you know, remembering to go use those AI tools, or trying to build new individual habits. But when you build it into the fabric of your marketing organization, it's going to make it much easier for your team to adopt AI and for it to become a default that your team uses. So those are two areas where I've seen it go wrong lately.
00:03:31
Speaker
It's funny. I think I saw Ethan Mollick post recently about the best practices from a year or two ago, of like, set up your team, give them some tools, give them some time to experiment. People are now doing this. Most companies are now in the experimentation phase, and we're all looking at it. At least those who are in it all the time would be like, yeah, you're behind on this one. Yeah. Totally. Experimentation's not enough. If you missed that period, you now have to catch up, because you're now behind by about a year, maybe two years. I don't know. It really depends on which industry you're in. And listen, I do get it, because some of those experiments failed or were hard. I remember the first time I was trying to build an agent in Make.
00:04:15
Speaker
You know, I was like, this is terrible. This is hard. And I obviously work with a lot of very smart engineers and developers, and they were just doing things that I dream about doing. But I think when you get into these experiments and they don't work, you're less inclined to do it again. And so if you get stuck in that phase, I think it's always helpful to go back to the problem, find one big problem that you can solve using AI. And I think that can get you back on track.
00:04:53
Speaker
Yeah. I remember at the beginning, even social media was experimentation, right? I remember in the early days of social media, people were like, I don't want to post what I ate for lunch. Right? Well, it kind of took us a while to define the best practices of what became standard 101 content marketing and social media: posting things that are helpful, posting things that are maybe resonating, posting things that are maybe newsjacking if you want to start to follow the trends. All these things became standard. They took a while. Some came from other industries, like the news cycle. Others had to just be invented.
00:05:32
Speaker
But I feel like we're just getting past that early phase of AI where we don't have any idea what we're doing. And now we have some good parts in place. So I have to ask you, what does success look like?
00:05:46
Speaker
If they're now doing it, what are the teams that are doing well now doing? How do you know AI has really become part of a team's day-to-day work?
Organic AI Adoption in Teams
00:05:54
Speaker
Yeah, I think it's really a lot of not having to drive that change from leadership. When that is happening organically within your team, when your team starts to come to you with ideas or even, hey, here's this AI platform that I, you know,
00:06:14
Speaker
want to use, here's the use case, here's the outcomes that I'm looking for, versus you having to say, hey guys, let's get this problem, let's get this team, let's drive this action. I think that's when you know, hey, this has shifted. And then I would even go back to what I said earlier about your meetings. When the intelligence that you're using to run your marketing, when you start to have AI involved in those cadences, I think you can start to say, hey, we have made that shift from experimentation to really utilizing AI to drive our marketing function.
Efficiency Gains with Custom GPTs
00:06:55
Speaker
I feel like one of the big things that we've run into is this custom GPT thing. That's kind of been tried and true. Still, not a lot of people are using it, or using the instructions and projects, or using Claude Projects, or using Gemini Gems, which are all generally the same thing, just different branded versions.
00:07:11
Speaker
But I almost feel like I could probably go to a team member and be like, what's the most normal thing you do that is time-consuming? Something you do routinely and have a process for. Maybe you don't even understand what the process is, but let's figure out how AI can help at least speed up something you do daily or weekly,
00:07:25
Speaker
and see what we can do there. And then trying to do that with each specialty of your team, there's probably something there that's better than experimentation, because it's at least coming back to what they actually do, rather than trying to invent new stuff that they don't do, which is usually what experimentation leads to: it invents new work that just makes you busier, and that's a waste of time.
00:07:47
Speaker
Yeah, it's true. And especially when it's a busy season and then it doesn't work, it just feels like a waste of time. I actually said this on our podcast. One of my hot takes was that we would actually spend more time using AI last year than we'd save, for exactly that reason. There were so many AI tools. We don't have frontrunners yet for those things. It's exactly what you just said about social media. Over time, we kind of figured out:
00:08:23
Speaker
What works? What are the plays? How do you do this? Which tools do you use? You just knew, this is the best tool; I'm going to use that best tool. Right now it's like the Wild West, where it's like, okay, I want to do this thing, but I could do that thing with 12 different options. It's not an obvious, this is the best tool to use for XYZ. And so we ended up spending more time. I do think it's getting a little bit... do you think it's getting a little bit better?
00:08:51
Speaker
Yes, absolutely. By the way, on the custom GPTs, I'm absolutely with you. I'm still shocked how many people... Some of my favorite things about AI are these custom GPTs.
00:09:08
Speaker
They literally save me so much time. I don't think our CEO would mind me saying this, but for example, I have a custom GPT in his voice. So simple. It's a simple thing.
00:09:20
Speaker
But it saves both him and me a ton of time when I utilize that. We have one for lead qualification. We want to know more about the lead; we have it built; we drop the lead in; we know everything we want to know about that lead. Again, very simple things that just took some time to do the thought work and now produce returns for me every single day. So yeah, PSA for everybody.
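To make that concrete, here is a rough sketch of what a lead-qualification helper like the one Courtney describes might look like if the same kind of instructions were wired up through the OpenAI API rather than a custom GPT built in the ChatGPT interface. The instructions, fields, and model name are illustrative assumptions, not KnownWell's actual setup.

```python
# A minimal, hypothetical sketch of a lead-qualification helper.
# Assumes the openai Python SDK (>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Illustrative "custom GPT"-style instructions; not the actual KnownWell prompt.
QUALIFICATION_INSTRUCTIONS = """
You qualify inbound B2B leads for a marketing team.
Given whatever we know about a lead, return:
- Company name, size, and industry (say "unknown" if not provided)
- Likely fit: high / medium / low, with a one-sentence reason
- Three questions sales should ask on the first call
"""

def qualify_lead(raw_lead_notes: str) -> str:
    """Send the raw lead notes to the model with the qualification instructions."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": QUALIFICATION_INSTRUCTIONS},
            {"role": "user", "content": raw_lead_notes},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(qualify_lead(
        "Jane Doe, VP Marketing at a 200-person logistics SaaS, asked about pricing."
    ))
```

The point is the same one made above: the thought work lives in the instructions, and once that is written down, "drop the lead in" becomes a one-step habit.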
00:09:48
Speaker
Yeah. And I always tell people, until you've built like a dozen custom GPTs, don't worry about Make, don't worry about n8n, don't worry about vibe coding anything. And just to make everybody feel better: if you've tried Make or n8n and failed at it or felt like you were dumb, I know developers that are extremely intelligent who say they'd rather just hand-code it than use Make.
00:10:11
Speaker
Okay. That's how confusing those tools are. That does make me feel better. Yes. Thank you for that. I will take that one to heart myself. Oh, I should just be able to drag and drop the menu. And then you get the menu options and you're like, I don't even understand what it's saying. Yeah. And it's especially hard when you're using Make with a platform like HubSpot that I feel like I'm a pretty decent user of, and I'm like, I'm completely lost here. Yeah.
00:10:40
Speaker
It makes you realize how much work platforms like HubSpot and HighLevel, the marketing automation systems, have done to dumb it down for us marketers, to make it yes or no, left or right. I don't know. They made it simple enough for us to go in there and actually make choices, versus, now do you want a Boolean search for... you're like, what the heck is it talking about? Yeah.
00:11:02
Speaker
What's an IP address? You're like, wait, there's an IP address for this? Yeah, exactly. But you know, it's going to come. I actually said this, I was with a company last week; they were working on some AI things they were doing in the organization, and this wasn't necessarily tied to my role at KnownWell.
The Future of Automation Platforms
00:11:21
Speaker
We were talking about how much they needed to add about agents in their policy. And I was like, you know, right now, it's really not necessary yet. It's like what you said: to the common person, it's challenging.
00:11:34
Speaker
I'm glad that we can all establish that transparently. It's not the easiest, but it could come tomorrow, Dan. We could have some kind of platform that does make it easy enough for me to just be pulling the things in the way it seems like it should be, and the whole ballgame changes.
00:11:55
Speaker
This is my prediction, but platforms like n8n and Make will just go away and we'll just vibe code them, because we'll be like, hey, here's this application, here's the password for it or whatever it needs. Every time this happens, I want these things to happen. And it'll go and be like, okay, and then maybe it'll even make some recommendations and be like, well, when you get to this point, do you want it to go left or right? And you're like, well, left.
00:12:17
Speaker
I think we'll just vibe code the automations. I do love doing that. Listen, with our team, obviously, again, we have data scientists and our whole engineering team. I like to joke that I am a data scientist, because really it's just conditional logic, and marketers do that, so I'm basically a data scientist. But I do love when you can code something like a calculator. Has my stuff ever actually seen the light of day? No, but I've been able to hand those things off to developers, and they're much further down the road than they would have been before I was able to do that.
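For readers who want a picture of what "every time this happens, I want these things to happen" can look like when it's hand-coded rather than built in Make or n8n, here is a minimal sketch. Everything in it is hypothetical: the /new-lead endpoint, the field names, and the routing thresholds are illustrative placeholders, not anything discussed in the episode.

```python
# A minimal, hypothetical "when X happens, do Y" automation: a webhook that receives a
# new lead, applies simple conditional logic, and logs the routing decision.
# Assumes Flask is installed (pip install flask).
import csv
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/new-lead", methods=["POST"])
def new_lead():
    # Expected JSON shape is a placeholder, e.g. {"name": ..., "company_size": ..., "budget": ...}
    lead = request.get_json(force=True)

    # Simple conditional logic: route bigger deals to sales, the rest to nurture.
    if lead.get("company_size", 0) >= 100 and lead.get("budget", 0) >= 10000:
        route = "sales"
    else:
        route = "nurture"

    # Log every lead and its route so the team can review the automation's decisions.
    with open("leads.csv", "a", newline="") as f:
        csv.writer(f).writerow(
            [lead.get("name"), lead.get("company_size"), lead.get("budget"), route]
        )

    return jsonify({"route": route})

if __name__ == "__main__":
    app.run(port=5000)
```

This is the kind of thing that, as Courtney notes, may never see the light of day in production, but handing a working sketch like this to developers gets them much further down the road.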
00:13:01
Speaker
So I have a question. My next question, I'm going to tweak a little bit and ask a different way. I think this will be interesting, but you tell me; there are two different ways you can answer this, but I'm going to twist it. I originally had: if a marketing leader has 90 days, what's the plan? What should they do in the first two weeks, the middle, and the final stretch, as if you were the marketing leader? But I'd like to ask you, since you're a CMO: if you hired a marketing director
00:13:27
Speaker
who was pretty good with AI, what would your actual expectations be, as a marketing leader, for where an AI-driven marketing director could be in 90 days?
Encouraging AI Experimentation
00:13:37
Speaker
Like, what would your hopes be at the beginning, middle, and end?
00:13:41
Speaker
Yeah, it's really interesting. I'm going to go with the second one. Okay. I may intertwine the two, but I'm going to answer the second one, because what I really care about is outcomes.
00:13:55
Speaker
Period. The end. I want a marketing director that can produce the results that we've outlined as our goals for the company, for their role, for the department.
00:14:08
Speaker
And at the end of the day, the reality is I actually don't care. And I would say this is true for a lot of executives. I actually don't care if they use AI.
00:14:22
Speaker
Now, do I think that they could produce better results and outcomes utilizing AI? I do think that. But at the end of the day, what I really care about is the outcomes that we've set out to achieve.
00:14:37
Speaker
And so, one, I think that's always helpful framing. Now, if we're not hitting our outcomes, maybe one of the questions I'm asking is, well, how are you utilizing the resources, the tools, the assets at your disposal?
00:14:53
Speaker
Oh, you haven't used AI at all? Okay, well, why haven't we pulled that lever? And so I think that is the series of questions I'm dissecting against. But at the top level, ultimately,
00:15:08
Speaker
If you can produce stellar results without using AI because you don't need to, it does not matter to me. Again, I don't think that's the world we're necessarily living in today. Yeah.
00:15:20
Speaker
And then I will actually answer your other question too. What would you... With the directors, yeah. With my team, I want them to really set an environment that allows experimentation.
00:15:38
Speaker
And what I really mean by that, and I actually think this is true for marketing teams at large, is that we have to lower the fear.
00:15:49
Speaker
If marketing teams, I believe, are in a state of fear, you are not going to produce great marketing. That's my personal belief. And so you have to create an environment where marketing teams can take risks.
00:16:06
Speaker
Those risks sometimes will pay out tremendously for you. There will be times your marketing team fails. Matter of fact, Dan, we just did something on our team. We ran a new ad campaign with a newsletter.
00:16:22
Speaker
We took a small bet on it. It didn't work. We failed at it. But I am in an environment where we can take those kinds of experiments and everything is great. So I would say, first thing out of the gate when you're rolling out AI, you've got to set the tone that this is not...
00:16:44
Speaker
This is, we are doing this together. There's no fear in this. We're creating an environment where we can take risks. We can try things without judgment. And then you move into, and I'm a broken record here, Dan: you've got to start with where the problems are. You've got to use real work. We're moving to outcomes, not playing around with new tools.
00:17:11
Speaker
I remember two years ago, I was talking to one of my guests named Bart Kaler. And I think we both had this epiphany that we were using AI, but it really wasn't helping any of our core stuff move faster or better. It was really adding on new stuff that we could do, where we're like, oh, we've always wanted to add on these extra things.
00:17:30
Speaker
You know what I'm saying? It just added extra stuff to do. It was faster, and it was stuff we'd wanted to do. But now I think it is getting good enough that it's actually starting to chip into the things that we do regularly. And I think you're right. Always align it back to: does this create value for the customer? Is this actually helping us do better marketing? Is this actually helping us do the core thing, or helping us move toward the objective? Otherwise it's easy to get lost in that. And so you keep bringing it back to that point, but it needs to be brought back to that, because we can all have a little bit of FOMO, a little bit of ooh-shiny going on, and just keep chasing the next cool thing. Yeah. Kind of like, yeah, I know we can make them, but do you really need them? Yeah.
00:18:07
Speaker
I know we were joking about me coding something for our development team. I actually think in marketing teams, that's one of the best use cases, even with design, being able to collaborate so much quicker. I always say marketing is a team
AI's Role in Team Collaboration
00:18:23
Speaker
sport. Well, if it is a team sport and I can hand off the football faster, knowing what play needs to be run, that's incredibly helpful for us producing an outcome faster and scoring a goal.
00:18:39
Speaker
I don't know why I'm making this football analogy other than I've been watching too much football, Dan. Yeah. It's that season; we're getting close. So have you run into a situation where you've hit pushback from team members over AI, or maybe from other vendors or people that you're working with, around job loss, quality worries, or just extra work?
Understanding AI's Limitations
00:19:01
Speaker
Yeah, I would say the place that I see it the most is... People want AI intelligence or signaling to be 100% accurate.
00:19:16
Speaker
And listen, we all want accuracy, but the reality is they don't really understand what AI is at its core, or what's different about it than, say, SaaS: what the output of my CRM is with manual-entry, deterministic data, versus what AI produces, and how those things are different. In some ways, I think the pushback is understandable, but if you understand the core of what your platforms or your tools are based on, it helps you understand, oh, I'm not going to get a 100% signal or prediction
00:20:08
Speaker
every time. So I have to change how I think about this technology and how I utilize this technology. This is a very business-operations type of answer, but that's actually where I see a lot of pushback: not really understanding AI well. If you're not deep in this, it's easy to look at it just like deterministic software, which it is not.
00:20:40
Speaker
And you and I know that, and probably everybody listening, but that is where I see the rub a lot of the time. And I think it causes some of that pushback. And again, I understand the pushback, but I have this deep desire that if we can educate more on how AI is different from a lot of the technology that we've used previously in our roles, it helps us get past that roadblock. Do you ever see that? I'm just curious.
00:21:08
Speaker
All the time. All the time. And it's funny, because I'm like, if we actually made it more deterministic, it would suck. It would lose all of its actual creative ability. Yeah. And now with the reasoning models, they're so freaking good. I'm doing some market research for a company to refresh, even making a standard voice-of-the-customer document.
00:21:26
Speaker
I'm just trying to find out, well, how do customers talk about this brand? It's actually research I did a year and a half ago, and I was using AI then. But that was before reasoning models. That was before deep research.
00:21:39
Speaker
That was before it was able to actually run through tons of information and summarize it well. So now it's totally different. Now I'm comparing it to market research I did a year and a half ago, and it's better, because it can go scour the whole internet for every single testimonial and find the exact language. And I know, because I'm reading this person's books, that this is the voice of the customer; I can hear the founder's voice leading in it, but I can still hear the individual ones. And of course, I always write my deep research prompts in such a way that I'd be like, show me, with links. Yeah. Yeah.
00:22:15
Speaker
I don't want it to... it can make stuff up. But if you always write your deep research or your thinking prompts with, hey, summarize your work: show me the actual quotes from the customers with the link to where they said it, so I can go check to see if that's what they actually said and spot-check it, then it's moving along at a much higher quality. But you do have to know what it can and can't do.
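As an illustration of that "show me the quotes with links" pattern, here is one way a deep-research prompt could be templated. The brand name and the exact wording are hypothetical, not the prompts used in the research Dan describes.

```python
# A hypothetical deep-research prompt template following the "quote plus link" pattern
# discussed above. BRAND is a placeholder, not a brand from the episode.
BRAND = "Acme Consulting"

prompt = f"""
Research how customers describe {BRAND} in their own words.
For every claim you make:
1. Quote the customer verbatim.
2. Include a link to the page where the quote appears.
3. Note the date and source type (review site, testimonial page, podcast, etc.).
Finish with a summary of recurring phrases, but keep the full quote-plus-link list
so I can spot-check whether the quotes are real.
"""

print(prompt)
```

The design choice is simply to make hallucinations easy to catch: every summarized claim carries a verbatim quote and a link you can open and verify.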
Balancing Data and Intuition
00:22:41
Speaker
You have to know, like, hey, it still... I asked it to give me an acrostic for the word "pray," and it's like, oh yeah, but "pray" has five letters in it and you're trying to get it down to four. And I'm like, no, 5.2 Thinking, you are wrong. Like, how did you get that?
00:23:01
Speaker
I am curious; I have a question for you. So I do think when we're using ChatGPT at an individual level, it's easy in a big research project like that to get lost in, oh my gosh, is this actually right or wrong, and evaluating. But I think in general, when we see the responses, we know that it could be wrong. But when we take the leap from Gemini or ChatGPT into AI-native technology,
00:23:38
Speaker
business platforms or our tools, we kind of lose that lens. In a sense, the expectation of being 100% accurate is something we don't have for ChatGPT.
00:23:54
Speaker
But once we move to enterprise, we put on this extra layer of, it needs to be 100% right all the time. Dan, I'm not kidding, I have been in many executive team meetings where the CFO brings in the data from the last round of financials, and it just becomes a debate about whether the numbers are right or wrong.
00:24:22
Speaker
And so I wonder if we just like to not actually deal with issues and just debate whether data is right or wrong. So maybe it's just a business thing, but it's interesting: with ChatGPT, we don't have such a stringent expectation; we know it could be wrong. But when we move into enterprise, we kind of lose that lens. I'm just curious if you have an idea why we don't hold onto that as we move into other AI tools.
00:24:51
Speaker
I mean, I think marketers have been trained to be more data-driven, to the point where it just became stupid. Yeah. I don't know what to say other than we became so data-driven that we couldn't actually have this thing called discernment anymore.
00:25:08
Speaker
It's like the numbers needed to prove everything. I'm like, if the numbers could prove everything, there would be no art in this. And there's tons of art and guessing and discernment and gut feeling when it comes to all kinds of business things. Do we want to look at the data? Of course, to inform our gut, to inform our discernment and decision-making.
00:25:26
Speaker
Isn't that why you hire someone who's senior level? Because they didn't come with the data; they came with the discernment. And that's still what drives things with AI. In fact, that's still what makes someone better at AI. It's usually that taste and that discernment, because I can look at something and know if it's a good marketing plan or not for a specific company.
00:25:43
Speaker
I can discern that with ChatGPT. Do you ever get a response from ChatGPT and you're like, nope, that wasn't it? And you realize, I'm going to refine that prompt. I'm going to hit that edit button and refine it, because that wasn't it. But sometimes you enter it and you're like, whoa, that was an A-plus-plus. I wasn't expecting it to be that good. ChatGPT, digital high five. And it's like,
00:26:05
Speaker
You're welcome. You know, but that's the discernment, and that's why it's so important. And I think that plays a bigger role in AI, because AI can handle the data better, honestly, than we can a lot of the time.
Innovation with Minimal Guardrails
00:26:15
Speaker
But I think that's so good. That's so wise. Yeah. We have become so data-driven, but how much data do we need before,
00:26:25
Speaker
no, we've got to go fix a problem, or there's something that we've got to go take action on? Do we have to get it to 100%? No, probably not. Yeah. I mean, even look at the,
00:26:37
Speaker
I want to say, I can't remember which book this is, but some of the basic financial books, like The Millionaire Next Door. I don't know if it was that one, but some of those really basic, entry-level finance books. I remember even hearing these guys talk about day trading and stuff, and they're like, really? Sometimes you just have to know a lot about the industry and kind of be in it. But then just hear normal people talk about it.
00:26:57
Speaker
If you're hearing normal people talk about it, it's probably a bigger deal than you think. And maybe you're catching it before all the data-driven people day trading over in Manhattan are actually picking up on it, because you're picking up on a level of discernment that the numbers aren't showing yet. Yeah. Because by the time the numbers show it and have proof, well, the stock's already up high.
00:27:17
Speaker
Yeah. So that's the most data-driven industry of all time, right? But still, the people who know how to play the game are using discernment and experience in order to make smart decisions. I think it's the same thing with marketing. But we've been so data-heavy because the marketing tech companies have pushed it so hard and done a good job marketing it, and we'll probably swing back the other way now. That's my guess.
00:27:39
Speaker
Yeah. So good. So with all teams, everybody's got AI as an agenda item this year. We all have it. It's going to be a thing. Every single team. What are some of the basic rules that you see teams need to apply early on? What kind of guardrails around brand, privacy, approvals do you see helping teams, but also slowing them down?
00:28:06
Speaker
Yeah, it's a really good question. I mean, I think slowing them down is what I talked about earlier. It is that FOMO. There are all these new tools, new gadgets, new trinkets, and it's really about focusing on outcomes, real problems, things you've got to solve, or things where you're just like, we could do this a lot faster,
00:28:31
Speaker
let's utilize AI to do it. I would recommend keeping the guardrails as low as you can, especially to produce, let's say, a V1 of something. You may open the guardrails completely for, hey, to get to a V1, go to town.
00:28:53
Speaker
Before we publish something, we may need to approve it. But really allow the team to have some room to move. Of course, we've got to be careful; if you're in certain regulatory environments, the bar is much higher. I would say run it as much like a startup as you can. And as someone in a startup, I will tell you the guardrails are low. And I would say make sure that you practice what you preach. So if you're wanting to have your team bubble things up, find areas that they can utilize AI for, you should be doing that as well. If you want them in that kind of environment, not afraid to bring a new tool or try a new custom GPT, you should be doing that and bringing it back to your team.
00:29:51
Speaker
I would also say, when it comes to guardrails, hopefully you have some kind of corporate structure that you can just live within; then lower the guardrails for your team to work within that corporate structure and move quickly. Again, lower the guardrails, especially for things that you're producing but not publishing externally. Yeah.
00:30:21
Speaker
I remember getting a word of advice from a senior executive once, and he's like, never apply big-company rules to small companies. No. Sometimes you think it's wisdom. You're like, we should all have password managers and password protections and really strong encrypted passwords. You're like, we're a team of five. We don't need that. Yeah.
00:30:40
Speaker
I mean, I've worked in much larger companies where I've had much more extensive brand guides, and even those brand guides are not appropriate for our startup stage. It's too much. It's too high of a guardrail. And I would say that is a good analogy for what we're talking about with AI: put appropriately sized guardrails in place for the size of the company and the regulations that you're working within. And I would default to the lowest guardrails that you can.
00:31:16
Speaker
I also would say, if your company has not gotten enterprise ChatGPT, or, if you're using Google Suite, the enterprise Gemini, that is a huge unlock and allows those guardrails to be lowered, because you're working within your own protected ecosystem. Yeah.
00:31:40
Speaker
Dan, I'm sure you're a fan of that, but I'm telling you, that was such an unlock for our team. And we're an AI company. If you're using Gmail and you don't trust Gemini, something's wrong, right? Like, guys, all your data is already in Google Drive. At least give your employees access to use it in Gemini. Because if you don't trust Google with Gemini, then why are you using Google Drive?
00:32:03
Speaker
I mean, it essentially becomes your knowledge management. Everywhere I've ever been, their Google Drive is a mess. Now it's like a secret weapon for actually finding the knowledge your team needs. Gemini is now baked into Drive. Have you tried it yet? It's amazing. Oh yeah. It is.
00:32:24
Speaker
It's like what search should have been a long time ago, but it can do stuff now. Yeah. This is one of those things where you're like, oh, finally. Yeah. I'm still waiting for that on my Alexa, just so you know. It still hasn't happened yet. So just putting it out there, Alexa. I'm watching.
00:32:45
Speaker
Well, I did have a chance to meet with some enterprise companies at a conference a couple of months ago, and I learned something: there was upside, but a lot of downsides, to working at the really big enterprise companies. They're not even allowed to use Gemini. They're not in Google. They're not in Outlook.
00:33:01
Speaker
They're not using any AI except for some super-secure, privacy-focused, in-house enterprise AI that's a year and a half behind current models. Yeah. A lot of the enterprise companies are stuck with that.
00:33:15
Speaker
But they also have bigger budgets to be able to afford things that can do some really impressive AI work at scale. So there's that too. Yeah.
AI for Client Health Tracking
00:33:24
Speaker
As someone that sells an enterprise AI-native platform, I would be glad to talk to them about bringing in intelligence on their clients. Yeah.
00:33:33
Speaker
That's it. That's the shameless plug. If you've got budget like that, you can afford some of the coolest AI tools out there that are out of reach for all the small companies. We're all using ChatGPT, which is fantastic and is an advantage in itself, because we try things faster. But there are some really cool tools kind of like what you guys are working on. Yeah, we're using ChatGPT because we literally don't have the people you have to do the thing.
00:33:56
Speaker
So tell us a little bit more about KnownWell and even your podcast over at AI Knowhow. I think a lot of people from this show would love to take a listen there too. Yeah. KnownWell we created specifically for B2B services firms. We talked to so many firms, Dan.
00:34:13
Speaker
It didn't matter if they were small or an enterprise company; they were all tracking the health of their clients on a spreadsheet: red, yellow, green, and your team would come in.
00:34:26
Speaker
And I was always shocked. I would think, no, this company, they're so successful, they've figured this out. And we realized we are in a new era with AI, where we could actually track this utilizing AI and have a
00:34:41
Speaker
platform where, on Monday morning, you're not having to come in and call all the people or check the spreadsheet. You just know immediately, in real time, the objective health of your clients, your entire portfolio. And so that's what we've built at KnownWell. We're extremely excited and proud of it, and we'd love to give anybody listening a look at it. I know for all the marketers listening, the top of the funnel is always the emphasis. But if you've got a leaky bucket, and we always hate that leaky bucket on the marketing side, it's just hard. It's just more pressure for sales and marketing, because the answer is always just pour more water in. There's nothing more painful than crushing it in marketing, and sales is doing well, and you're like, yes, we crushed it. And then the churn's so high that you need to crush it double now, because we can't hold on to customers. You're like,
00:35:36
Speaker
Exactly. And I hate myself. Exactly. Well, Dan, it kind of comes back to what we talked about earlier. We have been drilled on data, data, data. Take sales; these salespeople,
00:35:51
Speaker
15 years ago, they just went out and did their thing. They were like lone wolves. They would come back having sold something that your company doesn't even make. It was all about relationships, getting on the golf course. That all changed.
00:36:06
Speaker
We changed that system from being about relationships and vibing to being about data. Now, for professional service firms and B2B SaaS companies, that same transition has to happen on the post-sale side of things. It can't just be about vibes. It can't just be about how we feel the relationship is going. It has to be driven by data, by intelligence, to know where we actually stand. And so I'm really excited about that
Enhancing Customer Lifecycle with AI
00:36:39
Speaker
transition. That can pull all the way through the entire customer lifecycle and hopefully take a little pressure off us sales and marketing people.
00:36:50
Speaker
I think there's probably a whole conversation just around how AI can take the qualitative and turn it into the quantitative. And not so that we become so data-driven that we can't make a decision without numbers, but so we can actually be properly informed and hold the right systems or the right people accountable. I think that's what it's all about. Yeah, absolutely.
00:37:11
Speaker
Thank you so much for joining me again on the AI-Driven Marketer. Thanks for having me.