
The Future of Content Intelligence With Adology | James Donner

S1 E35 · The Efficient Spend Podcast
22 plays · 11 days ago

SUBSCRIBE TO LEARN FROM PAID MARKETING EXPERTS 🔔

The Efficient Spend Podcast helps start-ups turn media spend into revenue. Learn how the world's top marketers manage their media mix to drive growth!

In this episode of The Efficient Spend Podcast, James Donner, founder of Adology, shares how AI is transforming creative analytics and competitive intelligence in marketing. James explains why unlocking insights from ad creatives is the next frontier, how to balance automation with strategy, and where AI still breaks down. He also highlights opportunities in white space detection, consumer sentiment analysis, and building smarter workflows for the future of marketing.

About the Host: Paul is a paid marketing leader with 7+ years of experience optimizing marketing spend at venture-backed startups. He's driven $250M+ in revenue through paid media and is passionate about helping startups deploy marketing dollars to drive growth.

About the Guest: James Donner is the founder and CEO of Adology, an AI-driven creative analytics and marketing intelligence company. With over a decade leading media investment at Media.Monks and helping build Decoded Advertising into a 200-person agency, he specializes in unlocking growth by combining performance marketing expertise with cutting-edge AI insights.

VISIT OUR WEBSITE: https://www.efficientspend.com/

CONNECT WITH PAUL: https://www.linkedin.com/in/paulkovalski/

CONNECT WITH JAMES: https://www.linkedin.com/in/james-donner-8430b555/

EPISODE LINKS:

https://www.getadology.ai/about
https://www.monks.com/what-we-do
https://mailchimp.com/resources/ai-predictions/
https://squareholes.com/blog/2021/04/29/how-the-creative-and-creativity-can-help-drive-your-advertising-effectiveness/

Transcript

LLMs revolutionizing video and image analysis

00:00:00
Speaker
What happened about a year ago was that LLMs became as good as or better than humans at watching videos and imagery and seeing what's in them, describing the attributes, but also the context. So really rich descriptions, not just labels, because computer vision in advertising prior had been heavily based around labels. And so seeing what that could do and kind of just realizing that in advertising all of the data is locked up in imagery. I mean, there's tons of media data. I myself, my teams, would analyze, you know, that till the sun came up, but there's so much more data in creative.

Founding of Adology and creative analytics

00:00:42
Speaker
James, welcome to the show. Thanks for having me, Paul. I'm excited to chat all things creative and AI with you today. I was hoping we could kick this conversation off just by giving some context into the Adology founding story.
00:00:56
Speaker
I think it's pretty fascinating how quickly so many startups have popped up in this space. And I know that this is a recent thing for you. So can you kind of take me back to the point of, you know, you identifying the need in the market to build something like this, your kind of motivation behind doing so, and how it all got started? Yeah, yeah, for sure. So yeah, I'm not a tech person by trade, but somehow got the idea in my head to start a tech company, which probably isn't so uncommon. So prior to this, I was helping to run an agency called Decoded Advertising. I was one of the founding partners, in charge of media optimization, media buying. So the teams that would spend, you know, hundreds of millions of dollars a year on marketing for clients like Estée Lauder, Visa, QuickBooks, brands like those. So yeah, the genesis of the business was kind of twofold. One was what we were doing at Decoded around creative analytics and seeing how powerful it was.
00:01:54
Speaker
And then what I was seeing in LLMs and what they were able to accomplish. And so while everyone's been very focused on using multimodal LLMs for creating content, automating content, first blogs and emails, then copywriting, then imagery, and now videos,
00:02:14
Speaker
I've always been very fascinated with the computer vision side of LLMs and the fact that they

Shift to LLMs for deeper insights

00:02:19
Speaker
can see. And so we were always kind of looking at, you know, when can we automate some creative analytics? And, you know, Google had their Google Vision API where it could be like, there's a red car and there's a stop sign, there's a bicycle.
00:02:32
Speaker
It's all very literal. What happened about a year ago was that LLMs became as good as or better than humans at watching videos and imagery and seeing what's in them, describing the attributes, but also the context. So really rich descriptions, not just labels, because computer vision in advertising prior had been heavily based around labels.
00:02:54
Speaker
And so seeing what that can do and kind of just realizing that in advertising all of the data is locked up in imagery. I mean, there's tons of media data. I myself, my teams, would analyze, you know, that till the sun came up, but there's so much more data in creative. And so, to create more data, we used to add labels to content manually.
00:03:15
Speaker
So we'd have teams of people labeling every single ad that a client ran, and we'd produce dashboards that had these deeper creative analytics. But even then, even when it was only, you know, a few minutes of people's time amongst the creative process, and that still took tens of hours per asset, if not more, people didn't want to do it. It was inconsistent, all

Adology's focus on creative analysis

00:03:33
Speaker
these issues.
00:03:33
Speaker
And so when LLMs about a year ago were able to surpass what people could do, faster and at a higher scale, that was the moment of like, oh, there's millions of pieces of content out there that can now be analyzed and described. And you can build these huge data sets of what everyone is marketing, what everyone is saying, what consumers are doing. So as an analyst by trade, that was the moment of, okay, I'm going to build a business that automates some of our best processes that we used to do at our agency.
00:04:02
Speaker
And one that unlocks this whole new world of data that prior wasn't really possible. It's really cool that you're taking a little bit of a different approach to what a lot of other folks are concentrating on, which is the creative production side. And you want to operate more as a creative analyst, a co-pilot, as you state.
00:04:24
Speaker
Why is that? Do you think that there's more kind of green space there? Yeah. I mean, there's more green space. It's also unattended, and it's kind of necessary.
00:04:37
Speaker
I think, you know, making things automatically is cool, but it's also not perfect yet. People still aren't really running that many AI ads. Like the AI avatar space is pretty hot if you want to run influencer ads.
00:04:51
Speaker
But it still requires a lot of human input on top of it. But yeah, it's mostly, I think I just saw that people aren't really helping on the strategy side yet. There's a lot of competitive intelligence companies for media, which will report on your competitors' ad spend data, their web traffic.
00:05:06
Speaker
Yeah. Ad spend data is so incredibly inaccurate, by the way. I mean, I used to audit that at my last agency. We would pull what they thought all our clients' ad spend was, and it would be off by 10%, 500%, up or down. So I don't put a lot of stake in that. But competitive intelligence was always really robust for media and didn't really exist for creative. Maybe your agency would put together a little bit of a swipe file and you'd look at it.
00:05:30
Speaker
So it was just seeing the need, you know, the gap in the marketplace for that information, knowing that creative is the most important thing going forward. You know, media, you kind of have to get your settings right, but once your settings are right, you're kind of in a good place in terms of tactical media buying.
00:05:45
Speaker
But creative is just an endless game of optimizing, optimizing. And so it's pretty clear the intelligence there is, I think, the key lever

Importance of market trends in ad performance

00:05:53
Speaker
in the future, knowing, you know, where the white space is, what the consumer pain points are, what the ad trends are to borrow.
00:06:01
Speaker
When we were, you know, at the agency, we weren't good at creative performance for a while, until we did this project where we started aggregating what was working in the market. Everyone started sharing into a Facebook group all the ads that they saw with all the likes, comments, and shares, and we started breaking down the tactics in them. And this was back in like 2017, 2018.
00:06:22
Speaker
And when we started breaking down those trending, high-performing tactics and using them in our ads, it wasn't guesswork anymore. We were adopting high-performing trends, and our performance marketing, performance creative, just started getting really, really good.
00:06:34
Speaker
And that was another formative moment, like eight years ago, for this business: wow, the power of learning from other people is actually greater than even your ability to test and learn yourself.
00:06:46
Speaker
There's so little creative data in your own ad account, but there's so much when you start looking at the entire marketplace. Sure. I mean, I've gone through the exercise several times of auditing competitor ad libraries, competitor creative.
00:07:00
Speaker
And a lot of times it's more of an affirming process for me of, okay, we're doing a lot of the same things and this is working. One of the unique things about creative, and specifically direct response ad creative, is there are some fundamental components of what makes it work: a strong hook that captures attention, a good explainer, a good kind of call to action.

Elements of a strong creative ad

00:07:24
Speaker
There's elements that I think you can find in good-performing ads, common elements, right?
00:07:29
Speaker
But then what really differentiates creative, there's a lot of kind of constant change in what's hot and trending that you need to apply on top of those foundational elements.
00:07:44
Speaker
Meaning that, you know, if something's happening in the news, for example, the Coldplay concert, CEO, founder, whatever, being able to quickly adapt your foundational framework for what works for your brand and the elements of a strong creative on top of that viral moment is like a recipe for a really good ad. And so that's kind of what I find interesting here: there's the foundational element, sure, and you can feed that data into a system and produce good ads, but you also need the market data to kind of overlay on top of that.
00:08:22
Speaker
Yeah. Yeah. There's the fast-moving data, then the not-so-fast, and then the slow-moving data, all of which are important. You know, the data in your account is almost a little bit more slow-moving because you're slowly building those learnings over time. But yeah, when it gets to organic content on TikTok, it's going to have to be based on super recent
00:08:38
Speaker
activity. And if you want to automate content and you want to actually automate the strategy behind content, you need data sets, data sources, knowledge sets that actually contain what's going viral, what's trending right now.
00:08:52
Speaker
For news, that's a bit easier to get. For content advertising trends, it's a lot harder. That's where we like to help

Comprehensive knowledge bases for AI marketing

00:08:59
Speaker
folks. But to your earlier question about, you know, why focus on the analytics or insight side: it's because, yeah, if you want to automate the whole process of creative, which isn't the end goal but is a sort of intermediary goal, automate big parts of it, save time, be able to scale it, leave the people to do the strategy and the decision making.
00:09:17
Speaker
You need automation for the creation, but you also need automation for helping drive the creation, especially if your aspirations are to do some seriously scaled AI driven work. You need a brain that's going to instruct the production company.
00:09:30
Speaker
Otherwise, it's kind of like you're hiring just the AI production company, but not the AI, you know, strategist. Let's talk about the Adology brain. So I'm going to kind of walk through my notes on my understanding of your technical process.
00:09:47
Speaker
And then I would love your kind of commentary on maybe some of the interesting things you're doing, what you might want to share. So essentially, and please correct me if I'm wrong, but you are ingesting ads at scale by, you know, scraping websites, looking at public ads, brand ad libraries, running multimodal AI tagging on visual, audio, and text, building a data set, then training trend detection models, and then feeding that into a promptable co-pilot. Is that a generally accurate description?
00:10:21
Speaker
Yes. When we build our data sets, it's a lot more than just training the trend detection. It's really building up knowledge bases for AI to then be able to either mine for insights such as trends or leverage for automations.
00:10:36
Speaker
So if you want to automate a new post every day, that's based on recent news, plus using the top performing hooks in your category, plus using the best performing ad concepts from your historical ad performance.
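The workflow described here, grounding one automated post in three knowledge sources, might be sketched roughly as follows. All names (`build_post_prompt`, the field names, the wording) are illustrative assumptions, not Adology's actual API:

```python
# Hypothetical sketch only: assembling a daily-post prompt from three
# knowledge sources (recent news, top category hooks, historical winners).
def build_post_prompt(recent_news, top_hooks, best_concepts):
    """Combine the three sources into one generation prompt for an LLM."""
    sections = [
        "Draft one social post for today.",
        "Ground it in this recent news:",
        *(f"- {n}" for n in recent_news),
        "Open with one of the top-performing hooks in the category:",
        *(f"- {h}" for h in top_hooks),
        "Reuse an ad concept that has worked historically for this brand:",
        *(f"- {c}" for c in best_concepts),
    ]
    return "\n".join(sections)

prompt = build_post_prompt(
    recent_news=["Competitor X cut prices 20%"],
    top_hooks=["Stop scrolling if you rent"],
    best_concepts=["Before/after credit score story"],
)
# The assembled prompt would then be sent to a model for drafting.
```

The point of the sketch is only that the knowledge base supplies the three ingredients, so the user doesn't have to hand-prompt each one.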
00:10:50
Speaker
You can do that with our knowledge bases because it gives AI intelligence across all three of those sources. Do you see more value in the kind of like automated organic content side or the maybe slower paid creative productions?
00:11:07
Speaker
I think they all need support. I think that's what all of this needs. Yeah. You know, we're essentially a knowledge base company. We believe we're building knowledge bases for kind of the future of marketing, or AI-driven marketing.
00:11:19
Speaker
And so whether you're doing rapid social content, you need that knowledge base, otherwise you're having to prompt every single thing yourself that you want made. Or if you're doing ad content, you may use it a little bit less frequently, but you still need it if you want to quickly know where there's white space or quickly know what's trending in your category in terms of content approaches.
00:11:40
Speaker
What do you think are some of the more non-obvious areas of white space that a common marketer might not detect or maybe not identify that adology can help solve?

AI identifying market gaps

00:11:55
Speaker
I think there's, I mean, there's so many. So yeah, when you look at ad libraries for trends and you're doing it yourself, you know, you look at one brand and it's like, okay, you can kind of see what they're doing. You look at another brand, you kind of see what they're doing, but now you're only taking in 70% of it. Then you're on brand three and you're down to like 40% of what's going on, and you've forgotten 50% of the first brand. So I think for people to really understand the scope of what everyone's saying and doing, it's just too much.
00:12:20
Speaker
It's okay when you're trying to pick up a few trends, but that's where AI really shines. There are so many weaknesses to AI, by the way. As an AI company, to me, its number one strength is its ability to summarize and synthesize and to see patterns, but it really is lacking in knowledge in a lot of ways, which might sound crazy to people.
00:12:38
Speaker
So yeah, I actually just lost my original train of thought. Sorry, I forgot where I started with the question. Yeah, I think what I'm kind of interested in is identifying white space.
00:12:53
Speaker
Yeah, but also, like, we can leverage AI to track competitors at a much more efficient and automated rate.
00:13:06
Speaker
And then we can see, okay, this competitor is tending to use this type of concept or this type of value prop. And we have the information at our fingertips. And then the question is, how do you respond to that too, right? Like in my case, working for a credit building company, it's helpful for us to know that competitor A is kind of going at a 45 point credit score increase and that's where they're pushing. So we know we have to kind of respond to that. I'm kind of wondering your thoughts there too, like
00:13:38
Speaker
how are we using a company like Adology to assess not only what our competitors are doing, but how we can respond to it? Yeah. I mean, some people want to copy more. Other people want to, you know, go towards the white space.
00:13:51
Speaker
I think we don't necessarily have an opinion on which one is better or worse. I can say things started getting more powerful when we started incorporating a lot more user sentiment and social listening data.
00:14:02
Speaker
So product reviews, questions on Quora, even Twitter comments, Reddit comments, and all of that. And I think the more sources you start to ingest and put in your knowledge base, the more likely you are to catch the things, or have AI catch the things, that you may have missed, which is: consumers are complaining about this.
00:14:21
Speaker
And it's what they're complaining about the most. It's what they have the highest passion and sentiment about, because we're extracting all the sentiment and the depth of how much they care about these things.
00:14:33
Speaker
And then they're complaining about this. Here's where everyone is addressing these pain points or not across their retail sites, across their ads. Here's the gaps. Here's what's left. Here's the unattended things for you to go after.
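As a rough illustration of that gap-finding logic, one could score pain points by how often consumers mention them, weighted by sentiment intensity, and keep only the themes no competitor is addressing. Everything here, the scoring formula included, is an assumption made for illustration, not Adology's actual method:

```python
from collections import Counter

# Illustrative sketch: rank consumer pain points by frequency x average
# sentiment intensity, then flag the ones nobody's ads address ("gaps").
def find_white_space(comments, addressed_themes):
    # comments: list of (theme, sentiment_intensity in 0..1)
    freq = Counter(theme for theme, _ in comments)
    intensity = {}
    for theme, s in comments:
        intensity.setdefault(theme, []).append(s)
    scored = {
        t: freq[t] * (sum(v) / len(v))  # frequency weighted by avg intensity
        for t, v in intensity.items()
    }
    gaps = {t: s for t, s in scored.items() if t not in addressed_themes}
    return sorted(gaps, key=gaps.get, reverse=True)

comments = [
    ("hidden fees", 0.9), ("hidden fees", 0.8),
    ("slow support", 0.4), ("app crashes", 0.7),
]
print(find_white_space(comments, addressed_themes={"app crashes"}))
# ['hidden fees', 'slow support']
```

The output order reflects what the speaker describes: the most frequent, most passionately mentioned unaddressed complaint surfaces first.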
00:14:46
Speaker
And so we're a relatively new business. You know, we're watching our clients start to take advantage of these insights and build campaigns around them. And we're just starting now to even collect some of that data on how much, you know, a campaign built around these insights helps.
00:14:58
Speaker
But I think, you know, there's a lot of little things that will fall through the cracks, things you never realized your business wasn't addressing, and either consumers are talking about them or your competitors are,
00:15:09
Speaker
or areas where you realize your competitors are leaving kind of empty space. But in turn, the applications can have almost nothing to do with the competition and can really just be about listening to consumers and then looking to make your content more effective by using better hooks or better, you know, techniques.
00:15:26
Speaker
Is there anything in the building of your product that prioritizes certain data sets over others, or values pieces of data over others?

Selective data inputs for quality AI outputs

00:15:37
Speaker
The thing that I'm kind of thinking about is, like, you can kind of ingest the visual, text, and audio of a competitor's ads and ad sets.
00:15:48
Speaker
There's the social listening, which you mentioned. There's reviews. There's, you know, answers on Reddit. There's all these different data sets. Like, how do we figure out how to orchestrate all of that, and then what is more valuable and what's not?
00:16:02
Speaker
Yep. So there's kind of two ways to have AI produce higher quality predictions or recommendations. So right now, if you just go to ChatGPT and you ask it, you know, what ads should I make, it'll give you a bunch of random ideas. Those are not performance-informed. They're probably not grounded in your category, all of that.
00:16:19
Speaker
If you feed in our knowledge base, without any of the enrichments that we do, it will now know what everyone's doing, and it'll be able to give you recommendations that are grounded in the category and basically at least, you know, relevant to reality and what people are actually doing in their ads today.
00:16:33
Speaker
Then level three is, okay, now we want to actually have it produce high quality recommendations. And so there's two ways that we do that. One is feeding in the right input data. So, okay, we're not going to give it every ad that people are running. We're only going to give it people's most effective ads, or our most effective ads historically.
00:16:50
Speaker
We're not going to give it every consumer comment. We're going to give it the ones that are the most frequently mentioned and have the highest sentiment and, you know, depth of feeling around them. And so giving better inputs to get better outputs.
00:17:02
Speaker
And then the second thing is on the outputs: having those outputs re-ranked based on a more probabilistic model that will actually look at all the recommendations, analyze them against all the data it has, and then actually rank them on which ones we think are most likely to be effective for your business.
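The two levers described, better inputs and re-ranked outputs, can be sketched roughly like this. The top-fraction cutoff and the stand-in scoring function are invented for illustration, not Adology's actual ranker:

```python
# Minimal sketch of the two-stage approach described: (1) feed only
# top-performing inputs as grounding context, (2) re-rank the generated
# candidates with a scoring model.
def filter_top_inputs(ads, top_fraction=0.2):
    """Keep only the best-performing ads as grounding context."""
    ranked = sorted(ads, key=lambda a: a["performance"], reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    return ranked[:k]

def rerank(candidates, score_fn):
    """Order generated concepts by predicted effectiveness."""
    return sorted(candidates, key=score_fn, reverse=True)

ads = [{"id": i, "performance": p} for i, p in enumerate([0.1, 0.9, 0.4, 0.7, 0.2])]
context = filter_top_inputs(ads)  # only the strongest ad survives as context
concepts = ["UGC testimonial", "price-drop hook", "founder story"]
ranked = rerank(concepts, score_fn=lambda c: len(c))  # toy stand-in score
```

In practice the `score_fn` stand-in would be the probabilistic model the speaker mentions, trained on the performance signals in the knowledge base.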
00:17:17
Speaker
So that's actually where we're at right now in the company in terms of building that. So right now our product will actually have all these signals on what's working, and then we'll feed in the higher performing creatives and insights and let it then produce concepts based on that. So you just talk to our AI agent and you say, you know, I want new ideas based on my competitors' top performers, or based on the top trends.
00:17:39
Speaker
Then the question is how do you actually identify what's, you know, a top performer. In some places it's really easy: what has a lot of likes, what has a lot of shares on the organic content and the paid content.
00:17:50
Speaker
We have our own methodologies. We're essentially reverse engineering what marketers are doing. So, you know, a marketer who's running Facebook campaigns, they're going to do a few things. One, they're going to pause the ads that don't work very well, and they're going to keep running the ads that do work well.
00:18:04
Speaker
So how long an ad is running is a signal of success. The second thing they're going to do is make more iterations of ads that work well, and they're not going to make iterations of ads that don't work. And so we watch that as well.
00:18:16
Speaker
And that's a huge signal for us. So we're watching all of these signals and then we're enriching essentially the knowledge base so that AI that's accessing our knowledge base is able to go and select high performing ads or organic posts, et cetera.
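Those two signals, how long an ad has been running and how many iterations it has spawned, could be combined into a crude success score like the sketch below. The weights and field names are assumptions for illustration, not the actual enrichment logic:

```python
from datetime import date

# Illustrative sketch of the "reverse engineering" signals described:
# marketers keep winners running and iterate on them, so longevity and
# variant count act as proxies for ad performance.
def success_score(ad, today=date(2025, 1, 1)):
    days_running = (today - ad["first_seen"]).days  # signal 1: longevity
    n_variants = len(ad["variants"])                # signal 2: iterations
    return days_running * 0.5 + n_variants * 10     # weights are arbitrary

ads = [
    {"id": "a", "first_seen": date(2024, 12, 25), "variants": ["v1"]},
    {"id": "b", "first_seen": date(2024, 10, 1), "variants": ["v1", "v2", "v3"]},
]
best = max(ads, key=success_score)
print(best["id"])  # "b": 92 days * 0.5 + 3 variants * 10 = 76.0, vs 13.5 for "a"
```

A score like this could then be used to enrich each ad in the knowledge base, so an agent querying it can select the likely high performers.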
00:18:29
Speaker
That's really great. I think, you know, I struggle with this too.

Creative testing and marketing strategy balance

00:18:33
Speaker
My philosophy on kind of like creative testing has shifted and changed over the years because sometimes I think to myself, I don't just want to continue to create a variation of the same winning ad, or I don't just want to say exactly what my competitors are saying in a different way because it's too vanilla and it's not different enough.
00:18:54
Speaker
And I want to be new and unique and sexy and whatever. But then it's like, well, it's working. So like, who cares, right? Like, if you have a winning ad that's been spending a lot for the last year, who cares? Like, you don't have to pause it, keep it going, create a simple variation.
00:19:12
Speaker
You know, where do you kind of stand on that? I mean, I agree. I remember we'd have clients that were running ads for like two or three years and couldn't beat them. I think a lot of this relates to performance marketing, the flaws of performance marketing.
00:19:24
Speaker
So yeah, I love performance marketing. I'm a performance marketer. I now also love brand marketing and believe in it. They're just totally different things. Yeah, I subscribe to most of Byron Sharp's theories. A lot of performance marketing I now view as a form of distribution, or what Byron Sharp calls physical availability, which is making the product easy to buy.
00:19:46
Speaker
So Byron Sharp's framework is mental availability, the product is easy to think of in purchasing moments, and physical availability, it's easy to buy. There's a concept now of digital physical availability, which means it's just easy to buy online.
00:19:59
Speaker
So Google Shopping ads, Facebook newsfeed ads, I see those as no different from walking down the aisle in Walmart. If you search for something in Google, you're in, whatever, you know, the pharmacy aisle in Walmart or the outdoor aisle in Walmart.
00:20:15
Speaker
In Meta, if you start engaging on something, it basically puts you in that aisle. You start looking at mattresses, you start seeing all these mattress ads. You start looking at jet skis, you see these jet ski ads. So to me, it's the equivalent of being on shelf, and you're paying to be on shelf.
00:20:31
Speaker
And when you're optimizing performance marketing ads, you're largely optimizing for attention. And this might be a little bit controversial, but only 20 to 30% of ads in the newsfeed are ever actually looked at.
00:20:46
Speaker
So immediately, 70% of people are not looking at the ads. When you do something, you have a shocking hook, a pattern interruption hook, et cetera, it's capturing more attention. And so I think so many of the things that we think are, oh, this is working, it's a more effective message for my business.
00:21:02
Speaker
Maybe not. It may just be working because it's taking it from 20% of people looking at your ad to 40. And now all of a sudden your ad is performing twice as well. And so through that framework, I distinguish between creative that is effective at driving that physical availability, making it stand out in Google Shopping, making it stand out in the Facebook newsfeed.
00:21:23
Speaker
It can stand out in brand search. These are all just forms of retail distribution. Versus ads that are good at driving mental availability, which are very different and basically impossible to measure digitally.
00:21:36
Speaker
And so I think, is it okay to, you know, run the same ad for years? If the ad is just a sign saying, hey, we're Casper, we exist, come buy this mattress, then I think it's fine. If it's truly, you know, intended to drive mental associations and shift how people associate your brand, then maybe it's a bigger issue or problem.
00:21:59
Speaker
As that relates to kind of what you're building, you obviously are able to analyze a number of different ads within competitor ad libraries.

Developing a flexible data platform

00:22:12
Speaker
Do you think about kind of sharing or reporting on the brand versus performance mix, or developing different levels of analysis or reporting for direct response performance ads versus more branded creative? Yeah, so what we're aiming for is to be a knowledge base building and management company. So for us, that means knowledge of your competitors' brand ads and performance ads, both, and it means that you can flexibly query how they're doing different strategies. There may not be a clean line between the two, but you can ask your knowledge base, via ChatGPT, via Claude, whatever.
00:22:57
Speaker
What are their, you know, bottom funnel, sales-driven strategies? What are their associative strategies? How are their bottom funnel ads driving mental associations and brand associations? Are their upper funnel ads mentioning sales, if at all? So for us, we're focusing on being a more multi-purpose platform, essentially a data platform, or what we're calling the Bloomberg terminal of marketing knowledge, so that people can either data mine it for these insights or they can build automations on these data sets.
00:23:25
Speaker
And so automations if your competitor drops their pricing, automations if there is a news event you want to react to, or just automating insights that go to your boss; you can automate email reports and things like that. So that's a little bit of a dodge, to say that, you know, we're not pushing very defined workflows through our platform.
00:23:45
Speaker
we're building the platform that people can then customize their workflows on top of. And so they can do that. And this is a big bet of ours is that people are going to want to orchestrate their own AI workflows.
00:23:56
Speaker
Many people will want to buy an all-in-one AI tool like icon.me or, you know, maybe if Motion heads there. But I think even more people, agencies especially, are going to want to build their own workflows and pipelines because it's their own special sauce. I think anyone who's worked with these AI workflow-based solutions, in marketing or out, knows that they can be limited, and then you want to do certain other things. It's like, why do we all still use Excel, you know, instead of QuickBooks for certain things? People will always want to customize and DIY. So our strategy is to build the data platform that they can build on: using Claude to build automations on it, using n8n, having their developers build automations on it, et cetera, as well. So we're a little bit out of that workflow game on our end.
00:24:41
Speaker
Understood. I want to go back to something that you mentioned before, which I thought was really interesting.

Challenges in measuring brand impact and AI future

00:24:45
Speaker
on the performance side, ads are measurable in that you can measure the response, the click-through rate.
00:24:55
Speaker
You can measure view-through conversions. You can see deterministically what ads are resulting in conversions, whether they are causal or not.
00:25:06
Speaker
And so it's very easy to say, and even looking at like media mix modeling, to say these ads drove this immediate result. So that's more on the direct response end, but then there's incrementality question.
00:25:22
Speaker
For more of the changing mindset, changing perception of a brand, that is much harder to measure, but it's still super important. Seeing an ad for Coke that makes me think nostalgia, that changes my perception and emotion in that moment.
00:25:43
Speaker
But I don't visit the website right away, so it doesn't show up in last click, and it doesn't show up in MMM because it didn't result in revenue. How do you measure that? What do you think about that? It's a big problem. One of the benefits I think of AI automating everything is it's going to automate the performance world.
00:26:02
Speaker
Like performance marketing will probably be solved, which will be really nice for everyone who wants to discuss other things like strategy and brand, because once that's solved for, we'll all stop trying to chase it so hard and we will spend more time on strategy and brand.
00:26:19
Speaker
Those things I don't think are measurable, not in an easy way, not in a big data way, not in a way that AI is there to solve. We want to help so that you are as intelligent as possible for that strategic planning.
00:26:32
Speaker
So you know what nostalgia emotions everyone around you is tapping into, and where there may be gaps, or what consumers are talking about so you can address them. So our product can be used for performance marketing, but I do expect performance marketing to be solved, and not necessarily by us just showing the trends. It'll be by Google and Meta just running their black box Performance Max, ASC+, and they won't just change out the colors and the products. They'll start changing everything about the ad,
00:26:59
Speaker
Which is great. It'll leave everything that those don't solve for, which is so much. So, my two favorite ways of measuring those things. There are brand lift surveys, where you show people ads and collect the brand survey responses, and you'd see which creative shifted the needle on brand perceptions.
00:27:15
Speaker
We used to do those at the agency. You can do them for like $30,000 to $40,000 per creative now, which for small brands is like, okay, that's insane, but for larger brands is possible. So that's a good way. And then I think neuro testing is going to be a big thing in the future, because when we're no longer able to optimize performance creative, people will still want to optimize stuff, and they will want to optimize brands. And I think it'll become more focused on how do we measure those mental associations and tweak and optimize things for that.
00:27:42
Speaker
Elaborate on the neuro testing piece. Literally measuring people's brainwaves and their emotional responses to content. I mean, there are the facial recognition companies that will watch as you watch an ad and will try to infer your emotional response to it based on your face. And there's a lot of questioning of how effective those are.
00:27:59
Speaker
But then there's the neuroscience stuff, which will actually measure the brainwaves. And that's not great either, because those people are not in real settings; they're not at home watching the ad. So there are flaws in everything. And then you can run real-world experiments. The other issue is that people are unwilling to run experiments for as long as they need to to measure them.
00:28:20
Speaker
So if you really want to measure, okay, is nostalgia going to drive more Coke sales? What I would do is set up a matched market: in a small region of the US, pump up our nostalgia advertising and run sales lift studies.
00:28:33
Speaker
And so those are kind of the methods that would be at people's fingertips. But I think there'll still be a lot that you just can't big-data your way out of, and good old-fashioned human strategy, hopefully informed by AI research, will be the solution.
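As a rough illustration of the matched-market idea described here, a minimal lift readout compares the test region's sales growth against a matched control region's growth over the same window. All numbers and the function below are hypothetical, not from the episode:

```python
# Hypothetical sketch of a matched-market sales-lift readout.
# The test region gets the extra (e.g. nostalgia) advertising; the
# control region is a demographically similar market that doesn't.
# Excess growth in the test region is the estimated incremental lift.

def lift(test_pre, test_post, control_pre, control_post):
    """Relative lift: test growth divided by control growth, minus 1."""
    test_growth = test_post / test_pre
    control_growth = control_post / control_pre
    return test_growth / control_growth - 1.0

# Made-up example: test market sales grew 15% over the campaign
# window while the matched control market grew 5%.
incremental = lift(test_pre=100_000, test_post=115_000,
                   control_pre=98_000, control_post=102_900)
```

The control region's growth absorbs seasonality and market-wide trends, so what remains is a (rough) estimate of what the campaign itself added.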
00:28:49
Speaker
Yeah, and I think there's an element of this where

Optimizing ad targeting with AI

00:28:51
Speaker
marketers have conviction about a strategy or a philosophy, and they need to deploy that and then have trust and faith that it will work out, and maybe some of it is not measurable.
00:29:03
Speaker
I mean, even for example, I'm in the process of structuring a retargeting experiment right now, where we are looking at one-day email submits and looking to drive incrementality there through paid retargeting.
00:29:21
Speaker
And the common best-practice knowledge would be to optimize those campaigns to a purchase or an action. But because of the small audience size and the small amount of event volume, I know that running those purchase-optimized campaigns in retargeting is less incremental. It's actually not about the tracked conversions; the incremental conversions are the ones that aren't necessarily going to be last click. So
00:29:53
Speaker
I might run a retargeting campaign optimized towards reach or impressions just to flood that one-day email submit audience, and that might be more incremental.
00:30:03
Speaker
Yeah, I mean, I'll nerd out on this with you for a few minutes. Well, I can say two contradictory things. One: there would be no reason to run purchase retargeting or purchase optimization if the audience is already so qualified, because the whole point of the optimization algorithm is to qualify the audience. You've already done that. Those people are already in your retargeting pool because they've been qualified, and probably reached on a purchase-optimized campaign anyway. So they're qualified enough to bid for reach.
00:30:34
Speaker
The reason not to bid for reach is, from all my testing and understanding, that it's all an economic marketplace and different people command different prices. So if you're buying a lot of stuff through Instagram, Meta is going to know that.
00:30:48
Speaker
And you're going to command very high CPMs, or high revenues on you. If you buy a lot, you're only going to be hit by purchase-targeted ads, and people are going to pay a lot to reach you.
00:30:59
Speaker
And if you bid for reach, you're telling Meta, I want to maximize my reach for every dollar. So Meta is trying to minimize your CPM. At that point, Meta isn't going to bid on someone who's super expensive; they're only going to bid on the cheap people. And when we've run reach campaigns and purchase campaigns against the same audience, we see like 5% overlap, if that.
00:31:19
Speaker
And my read has always been that it's because the purchasers command higher prices, and you'll never reach the purchasers when you're running reach campaigns. So I actually used to think of reach campaigns as offline-purchaser campaigns.
00:31:31
Speaker
They're targeting people who either don't have a history of buying online or aren't buying a lot, because they're not necessarily affluent or in that category. So that's how we would think of it: purchase optimization as for online buyers, and reach optimization as for offline buyers. And we often would see that our reach optimization worked better in the offline sales lift studies that we used to run with Oracle and Datalogix back in the day.
00:31:55
Speaker
Yeah, you get what you bid for, essentially. Yeah, exactly.

AI's capabilities and limitations

00:32:00
Speaker
100%. If you bid for clicks, you get those. I want to talk a little bit about where AI breaks down, because I and many marketers are thinking about a future where everything will be automated.
00:32:17
Speaker
But I'm going through a process now where I'm listing out all the tasks that our broader marketing team does on a recurring basis: everything from financial forecasting to OKR development to concepting, ad launching, organic posting, email, audience creation, et cetera.
00:32:39
Speaker
And then going through this whole prioritization framework of where can we automate and where can we be efficient. One of the scoring criteria is the importance of getting that data right. I can't think of the exact term, but it's the idea that you wouldn't want to automate something that, if gotten wrong, could really screw something up.
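A toy version of this kind of prioritization scoring might weigh hours saved against the cost of getting a task wrong. All task names and weights below are invented for illustration, not from the episode:

```python
# Hypothetical automation-prioritization sketch: rank recurring tasks
# by hours saved per month, discounted by how costly a wrong output
# would be. Tasks and weights are made up.

tasks = {
    # task: (hours_saved_per_month, error_cost on a 1-5 scale)
    "ad launching":          (20, 2),
    "financial forecasting": (8, 5),   # getting this wrong is costly
    "organic posting":       (12, 1),
}

def score(hours_saved, error_cost):
    # A high error cost heavily discounts the automation payoff.
    return hours_saved / error_cost

ranked = sorted(tasks, key=lambda t: score(*tasks[t]), reverse=True)
# Safe, high-payoff tasks float to the top of the automation queue;
# error-sensitive work like forecasting sinks to the bottom.
```

The dividing-by-error-cost choice is just one way to encode "don't automate what you can't afford to get wrong"; a real framework might use more criteria or a weighted sum.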
00:33:02
Speaker
And so it's fun to think about: I won't have to do anything, AI will automate all of these processes. It will just know the campaign to create, create the campaign, launch the ads, optimize the ads, whatever.
00:33:15
Speaker
But there are components where it breaks down. So from your perspective, where does that actually happen? Yeah, I mean, I think AI has so many weaknesses, so I'm excited to discuss this. The best analogy for AI is that they're like compression engines: they compress the entire world down into a small model, and when you're putting information in,
00:33:40
Speaker
it's essentially uncompressing it to give you the answer. So the way I like to think about it is that it has the whole world boiled down into a series of patterns. I ask it, what are some ideas for dinner tonight? It has all these patterns about food and recipes, et cetera, so it can take my query, understand it, expand out from that, and produce the final recommendations. But what happens a lot is that it mixes and matches without you even knowing; it'll freely mix and match different patterns. And that's why, in the tests they've done with Moby Dick, the most an AI can actually reproduce of a piece of text
00:34:22
Speaker
is 41% of Moby Dick. So while we think AI knows everything, it actually could never play back for you what's in a book or a specific thing perfectly, because it knows the patterns that predict Moby Dick, and it knows the patterns that predict this book or this video, but it doesn't actually know the entire thing in detail.
00:34:44
Speaker
So those have been informative analogies and frameworks for me to be grounded in the limitations of AI. I think of the LLMs themselves as thinking machines, even more than knowledge machines. And I'm super biased because we're building a knowledge company, but it's this idea that they know how to think; but if you want them to actually react to facts, it can't be stuff coming out of their own training data. It has to be facts that are in their context window. And that's why context engineering is the new prompt engineering, and any good AI company largely is all about the way it's managing the context window.
00:35:21
Speaker
Even ChatGPT: when ChatGPT has a conversation with you, that's a lot of human ingenuity and design that makes it feel like a conversation.
00:35:31
Speaker
Every call to OpenAI's LLM is completely independent of all the others. It's stateless, so it retains nothing between one query and another.
00:35:41
Speaker
The reason that ChatGPT actually retains what's going on is because, after you send the first message and then your second message, OpenAI, the company, has decided to take all of the information from your first message and paste it in ahead of your second.
00:35:56
Speaker
And then it responds. And on your third message, it's taking all the information from the first three and pasting it ahead of your query. So every time it's getting a new query, it doesn't know what the hell went on before, but OpenAI has decided to feed it everything from before, and now it can have what feels like an intelligent conversation.
00:36:11
Speaker
And then when it starts to get really wacky, that's because your conversation has gone on so long that they're now summarizing the middle of the conversation. They're taking the first message, summarizing the middle, taking the last one, and sending it through. And so when you're using chats and the experience degrades, you have to restart the chat.
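The pattern described here (stateless API calls, with the client re-sending history and summarizing the middle of long conversations) can be sketched roughly as follows. `llm` and `summarize` are stubs standing in for real model calls, and the details are illustrative, not OpenAI's actual implementation:

```python
# Sketch of why a stateless LLM API can feel conversational: the
# client re-sends prior turns on every call, and compresses the
# middle of long conversations. llm() and summarize() are stubs.

MAX_TURNS = 6  # beyond this, compress the middle of the history

def llm(prompt: str) -> str:
    """Stub model call: stateless, it sees only what's in the prompt."""
    return f"[reply, given {prompt.count('user:')} user turn(s)]"

def summarize(turns):
    """Stub summarizer; a real system would make another LLM call."""
    return f"(summary of {len(turns)} earlier turns)"

def build_prompt(history):
    # Keep the first turn and the latest turns verbatim; summarize
    # everything in between once the conversation gets long.
    if len(history) <= MAX_TURNS:
        kept = history
    else:
        kept = [history[0], summarize(history[1:-2])] + history[-2:]
    return "\n".join(kept)

history = []

def chat(user_msg):
    history.append(f"user: {user_msg}")
    prompt = build_prompt(history)  # the whole context, every time
    reply = llm(prompt)
    history.append(f"assistant: {reply}")
    return reply
```

When a long chat "gets wacky", it is exactly this compression step: detail from the middle turns has been replaced by a short summary, so the model literally no longer sees it.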
00:36:27
Speaker
It's because of that memory system. When you're using an AI that does automated email outreach, again, it's how people are informing that context window. So AI really is a tool. Its limitations are that it doesn't know Moby Dick word for word, and all the implications of that: it struggles with actual facts. It has the patterns of facts, which means it's likely to mix and match and recombine. That can make it very creative.
00:36:55
Speaker
But it also is what has led to hallucinations in the past. It's just like, oh, these are kind of the patterns, I feel like this is right, I'm just going to answer that. So for AI to be reliable, it's, okay, what is it good for?
00:37:08
Speaker
As a tool, it can summarize and synthesize incredibly well, which means you put information into the context window and ask it to bring that information together, to cross-check it.
00:37:18
Speaker
It'll do that incredibly well. It can watch and describe things incredibly well; it can record the attributes and the labels, all of that. It can create images. Everything else built around it I view as the product of human intelligence and engineering, where people have decided to string things together: LLM calls; memory, which sends all the old messages in; another LLM call that summarizes the messages in the middle; another one that produces an image based on what you're asking for. So, all that to say, I talked about a bunch of different components of AI, but those all lead to my worldview of what is it good at and what is it not.
00:38:02
Speaker
You have to look at the thing you're trying to solve and ask: is AI's ability to summarize information useful in automating this? Are we going to be able to feed it the right information so that it has the facts it needs to automate it?
00:38:15
Speaker
And then if you can do all that, and you can engineer the system around it, and you feel like that's going to work, then it's like, okay, this is a good use case for AI. We're comfortable using AI to watch all these videos because it's very good at that.
00:38:26
Speaker
We're comfortable using it to synthesize what's in our knowledge base because it's very good at that. But if you ask it to do math, it'll confidently do math for you and then completely blow it. It'll make up statistics with no issue.
00:38:40
Speaker
So that's kind of it, I guess, if that answers the question. No, that's helpful. I'm realizing that I have a meeting to hop to that's on my calendar.
00:38:51
Speaker
So I'll ask one more rapid-fire hot-take question, and then we can hop.

Learning AI marketing on LinkedIn

00:39:00
Speaker
I guess my question would be, what is the best resource for marketers looking to operationalize AI tools?
00:39:07
Speaker
Operationalize them? Oh man. Following smart people on LinkedIn is like everything. Who do you recommend following? I'll give a shout-out to Jake Abrams at Odyssey agency. Then there are the influencers; Motion seems to have like half of the AI marketing influencers on their payroll.
00:39:24
Speaker
Sarah Levenger does great creative strategy work. Jimmy Slagle, I think, is one of the guys who's putting out AI work. There are a few others. And then, for non-AI: Byron Sharp, Mark Ritson. People who aren't just promoting themselves, but are truly sharing what they're doing. I would say LinkedIn is probably the number one resource.
00:39:42
Speaker
Cool. James, thank you so much for joining the call today. Yeah, thanks for having me, Paul. See you on the other calls.