
Why Startups Struggle With Marketing Data (and How to Fix It) | Barbara Galiza

S1 E30 · The Efficient Spend Podcast

SUBSCRIBE TO LEARN FROM PAID MARKETING EXPERTS 🔔

The Efficient Spend Podcast helps start-ups turn media spend into revenue. Learn how the world's top marketers manage their media mix to drive growth!

In this episode of The Efficient Spend Podcast, growth and marketing analytics consultant Barbara Galiza shares insights on marketing data audits, triangulation, and optimizing paid versus organic strategies. Barbara discusses key challenges that startups face, such as data discrepancies, budget allocation, and balancing short-term performance with long-term brand-building goals.

About the Host: Paul is a paid marketing leader with 7+ years of experience optimizing marketing spend at venture-backed startups. He's driven over $100 million in revenue through paid media and is passionate about helping startups deploy marketing dollars to drive growth.

About the Guest: Barbara Galiza is a growth and marketing analytics consultant with over 10 years of experience helping startups optimize their media spend and measurement strategies. Passionate about simplifying complex marketing data, she has worked with diverse clients to improve performance through methodologies like marketing mix modeling and attribution.

VISIT OUR WEBSITE: https://www.efficientspend.com/

CONNECT WITH PAUL: https://www.linkedin.com/in/paulkovalski/

CONNECT WITH BARBARA: https://www.linkedin.com/in/barbara-galiza/

EPISODE LINKS:

https://www.barbaragaliza.com/#about-me
https://021newsletter.com
https://blog.adroll.com/marketing-incrementality-testing/
https://www.marketingevolution.com/knowledge-center
https://paramark.com/blog
https://www.getdbt.com/blog

Transcript

Impact of Data Discrepancies on Business Decisions

00:00:00
Speaker
There are multiple consequences, or reasons why discrepancies can affect marketing teams. I think one is, it comes back a bit to those three buckets that we've talked about.
00:00:12
Speaker
You know, certain data sources can be very valuable for campaign insights and others can be very valuable for budget distribution. If a marketer doesn't know which bucket a data source falls in, they could be making, you know, the wrong budget decisions.
00:00:28
Speaker
So that can really affect, you know, the overall growth of the company. And then you also have situations where, you know, internal teams are reporting on different sources, and then there can also be, you know, friction between different marketing teams, between organic, between paid, on who grabs the conversion.
00:00:49
Speaker
So that can also be a very negative consequence, because instead of everyone working together to grow the business, you have this friction on what data source are we looking at. My strategy is not being represented fairly here.
00:01:04
Speaker
And I need to make a point that, you know, my team is doing a good job. And how do I do that?

Introduction to Startup Growth Strategies with Barbara

00:01:17
Speaker
Barbara, thank you so much for being on the show today. Well, thank you so much for inviting me, Paul. It's a pleasure to be here. I'm really excited to chat and learn about your experiences. The way that I kind of want to frame this conversation is how can we help startups grow through data?
00:01:33
Speaker
You have a lot of experience in that regard and obviously a lot of focus on different measurement methodologies. Where I want to start is just kind of foundationally. So when you work with a new startup and, you know, you're kind of auditing their account, auditing their business,
00:01:51
Speaker
As you think about measurement, what are some foundational components of things that you think startups need, or data that they need, just to kind of get started? Yeah.

Auditing Marketing Analytics: Barbara’s Approach

00:02:02
Speaker
Yeah, whenever I'm starting a project, so one of the types of projects that I do most often is marketing analytics audits, I always like to start with talking to the marketing team and other stakeholders that need marketing data to make decisions.
00:02:18
Speaker
So I try to, in this first stage, understand what campaigns they are running and what channels, what goals, what data sources they are looking at, and also look ahead
00:02:31
Speaker
to what other types of campaigns they want to start. Are they looking into investing in SEO? Are they looking into doing affiliates or partners?
00:02:43
Speaker
So I use this as a way to map what I expect the marketing data needs to be and what data sources need to be in place. And then after I have completed this first step, I usually start to look into the actual data and audit that.

Three Buckets of Marketing Data

00:03:01
Speaker
But I would say, you know, when I'm looking at marketing data, I normally like to put it into one of three buckets. I see, you know, one of the buckets is budget distribution and allocation.
00:03:15
Speaker
How do we get the data that we need, the insights that we need, to understand how much budget should each channel receive? Then I look into campaign insights, you know, within a channel, within an ad platform.
00:03:32
Speaker
How does the agency or how does the campaign manager in the company identify which audiences, which campaigns and which creatives are performing the best? And then I look into data activation, or, you know, data signals or platform optimization.
00:03:50
Speaker
You know, what can we send from the data that we're capturing to the ad platforms so they can understand which customers to target?

Challenges in Marketing Data Quality and Accessibility

00:03:59
Speaker
And I always like to divide, you know, measurement into these three buckets, because I find it important to communicate that, you know, data that is for reporting is not necessarily the same data they're going to use for optimization. Right. Yeah, for sure. I talk about that a lot, the difference between measuring something and actually optimizing or bidding towards that event.
00:04:24
Speaker
And that's obviously a big topic with brands that have longer consideration cycles, determining what to bid for and optimize for. And we can definitely get into that. Within those three buckets, though, you know, spend...
00:04:40
Speaker
determining where to put the dollars is obviously the most important thing. From a data perspective, for the most part a data accessibility perspective, that spend data is pretty accessible, pretty simple to get.
00:04:53
Speaker
I think in terms of the challenges, probably in the beginning it would be more around determining the data quality for conversions and

Importance of Data Literacy in Marketing

00:05:03
Speaker
signals. Is that accurate? Or when you go through this audit, like with newer brands, what are the typical hiccups? What are the typical challenges you see where you're like, what's going on here?
00:05:14
Speaker
Is it the data quality? Is it crazy campaign naming conventions? What do you see most often? Yeah, I think a combination of data quality and data accessibility issues.
00:05:27
Speaker
So usually, you know, companies bring me in when they're experiencing one of these situations. And it's either, like, we don't trust the data that we have, there is a major discrepancy between the different tools that are measuring results, and we don't know what to trust or what to use.
00:05:46
Speaker
So those are usually put either in data quality or data education, because in a lot of cases, you know, the data is working fine, the discrepancies are fine, but they don't understand why they are happening.
00:05:58
Speaker
And then you have, in the other camp, entirely different companies, and they're usually like engineering companies, developer products, where they have the best data stack.
00:06:13
Speaker
They have fantastic data quality. They're using dbt. They have automated testing in place, but then the marketers can't access any of it, because everything is, you know, very robust, like a dashboard they can't model.
00:06:29
Speaker
So in this search for a perfect engineering workflow or perfect data quality, they end up excluding the people who need the data the most. And those are usually projects where I also come in, which goes

Measuring Campaign Effectiveness and Discrepancies

00:06:42
Speaker
okay.
00:06:42
Speaker
And then a lot of the time it's even talked about, like, building a separate stack for marketing. And I think that is not a great idea. But I would say usually these are the two common situations that I see.
00:06:55
Speaker
Those are really unique and differentiated situations, and I think they're both worth diving more into. If we look at the first issue, you know, not understanding the discrepancies between data sources, is this at its core a data literacy problem, and is this something that you see more often in organizations that don't have enough resources from a data analytics or engineering or data science perspective?
00:07:32
Speaker
I wouldn't say so, because... I think marketing data in a lot of ways doesn't make a lot of sense. And I think if you don't understand, like, the core of marketing, if you don't understand, like, user journeys, cross-device, time to conversion, the difference between server side and client side, then it's difficult to understand if a discrepancy means that things are working as they should, because they're measuring, you know, different metrics in unique ways, or if it means that the tracking is not working.
00:08:12
Speaker
So, you know, in a lot of these analytics audits that I do, a big part of the project is debugging and troubleshooting, because I can't tell for certain if a discrepancy is within the realm of acceptable or if something is wrong.
00:08:31
Speaker
So my recommendation, you know, usually for these kinds of issues is, you know, step one, validate the events are firing as they should,
00:08:42
Speaker
then monitor your discrepancies. Understand what a discrepancy between Meta and Google Analytics should look like. And then if that increases suddenly or decreases suddenly, then you can go on to investigate.
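To make the "monitor your discrepancies" step concrete, here is a minimal sketch with made-up daily numbers and hypothetical function names (not from the episode): it flags a sudden change in the Meta-to-Google Analytics conversion gap instead of chasing an exact match.

```python
from statistics import mean, stdev

def discrepancy_ratio(platform_conversions: int, ga_conversions: int) -> float:
    """Relative gap between an ad platform and GA for one day."""
    if ga_conversions == 0:
        return float("inf")
    return (platform_conversions - ga_conversions) / ga_conversions

def flag_anomalies(history: list[float], threshold_sd: float = 2.0) -> list[int]:
    """Return the day indexes where the gap drifts far from its usual level."""
    baseline, spread = mean(history), stdev(history)
    return [i for i, r in enumerate(history) if abs(r - baseline) > threshold_sd * spread]

# A stable ~25% gap is treated as normal; the last day suddenly jumps and gets flagged.
daily_pairs = [(125, 100), (130, 104), (122, 99), (128, 101), (119, 95), (126, 100), (210, 98)]
daily_gaps = [discrepancy_ratio(p, g) for p, g in daily_pairs]
print(flag_anomalies(daily_gaps))  # -> [6]
```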
00:08:58
Speaker
Because it's very easy for data and marketing teams to be stuck in this rabbit hole of always trying to get these numbers to match.
00:09:09
Speaker
Right. And it also takes the focus off of just being a marketer and growing the business. I wonder, you know, we're humans making decisions with this data, and trust is an important component.
00:09:25
Speaker
The anxiety of looking at these large numbers and not trusting them is super stressful. When you're managing millions of dollars and you're expected to drive revenue and drive growth with that,
00:09:39
Speaker
and you don't trust the data. Holy shit. Like, that is a big problem. And it's one that, you know, sometimes I even think about, like, we're looking at these numbers, like, can we actually trust that this thing spent this much money or this thing had this many conversions?
00:09:54
Speaker
However, I also think that to some extent, some discrepancies can almost be, maybe vanity metrics is not the right term, but, you know, it's causing anxiety where, if the business is growing, who cares to a certain extent? Like, do you see that when folks come to you that are dealing with discrepancies or dealing with bad data?
00:10:17
Speaker
Is it a lot of times that it is negatively impacting the business in, like, a very tangible way, or is it more kind of like an internal anxiety that these marketers and founders are kind of facing?

Organic vs. Paid Marketing Effectiveness

00:10:29
Speaker
Yeah. I think there are multiple consequences, or reasons why discrepancies can affect marketing teams. I think one is it comes back a bit to those three buckets that we've talked about. You know, certain data sources can be very valuable for campaign insights and others can be very valuable for budget distribution.
00:10:52
Speaker
If a marketer doesn't know which bucket a data source falls in, they could be making, you know, the wrong budget decisions. So that can really affect, you know, the overall growth of the company. And then you also have situations where, you know, internal teams are reporting on different sources.
00:11:12
Speaker
And then there can also be, you know, friction between different marketing teams, between organic, between paid, on who grabs the conversion. So that can also be a very negative
00:11:24
Speaker
consequence, because instead of everyone working together to grow the business, you have this friction on what data source are we looking at. My strategy is not being represented fairly here.
00:11:35
Speaker
And I need to make a point that, you know, my team is doing a good job, and how do I do that? Right. Yeah, I think about that internally as well. Like, you know, at my full-time role, there's certain teams that look closer at GA4 or they look at last touch.
00:11:56
Speaker
And then, you know, like the paid team, we're looking at incrementality. We're looking at data. And then there's other teams like the social team, which is very concerned about brand awareness, impressions, and engagement.
00:12:11
Speaker
And there's definitely these silos that exist to a certain extent. And maybe that's okay. Like, the social team should be thinking about how do we build brand awareness and focusing on that.
00:12:22
Speaker
And this is still something that I'm in the process of adopting, but I've kind of seen MMM as a really strong way to aggregate a lot of this stuff up, at least from a spend perspective, because if you have enough spend behind something, you can get a read in MMM. Now, there's obviously challenges there as well.
00:12:42
Speaker
But I guess as you think about the kind of like siloed approaches to measurement across different teams, do you think that is a large problem that requires solving? Is it okay? Like what's your kind of read on that?
00:12:58
Speaker
I think it's perfectly fine because I think that falls within, you know, campaign insights. If you're looking into, let's say, email, you should be looking at data sources that make sense for email.
00:13:12
Speaker
And that's not necessarily going to be the same data source that makes sense for paid media or makes sense for organic social media. So I think, you know, within a team, they should be using...
00:13:25
Speaker
whatever metrics, data sources, tools that make the most sense for the activities that they do. And I do think, you know, using an overarching data point like MMM or a "how did you hear about us" survey, something, you know, along those lines, is best fit for the budget distribution part of it.
00:13:50
Speaker
So let's talk about that because I agree with you that different teams should be held to different metrics and should measure their activities unique to them.
00:14:02
Speaker
However, there's also, I think, different capabilities within different teams and different propensities to want to measure stuff in a sophisticated manner.
00:14:13
Speaker
From a paid marketing perspective, we're probably the most advanced, where we're looking at different measurement sources, we're having conversations about triangulation, right?
00:14:26
Speaker
The PR team might not necessarily be thinking about that or care as much, right? Like, there's something about paid marketing where we care a little bit more that everything needs to be measured.
00:14:40
Speaker
I see on LinkedIn a lot of times, more in the content marketer space, like, hey, we don't have to measure everything. Sometimes you just need to do it because it's good marketing. And I get that, you know, and some of the top marketers that I respect talk about that as well.
00:15:00
Speaker
But I do wonder for some of these, I guess specifically if we can talk for a second about non-paid marketing, right? Organic content, influencer marketing, content marketing, things that are building up that audience, building that brand over time.
00:15:18
Speaker
Do you think that we should be measuring that more stringently, or is it okay to not approach it through that same lens that we look at paid media? I think, like you touched on, paid media is naturally more data-led.
00:15:35
Speaker
It's the nature of the initiative, right? From selecting a dollar amount that a click is worth to optimizing funnels and conversion rates.
00:15:47
Speaker
It is very much a numbers-heavy activity. And I think that's why it's important that teams can also own which way they want to go.
00:15:59
Speaker
I think it's important that, you know, content teams or organic teams have a way of measuring to understand, within their realm of expertise, what content pieces they should produce more of.
00:16:16
Speaker
What content pieces are customers finding valuable, or leads finding valuable? Which topics of social media posts are reaching their ICP?
00:16:28
Speaker
Right, they're measuring different things, but they should still be measuring this so they can understand what to do more of and what to do less of. Does it mean that they need to get the ROAS, or, you know, not the ROAS because there's no ad spend, but does it mean that they need to have a precise ROI for each social media post? Like, no, obviously, you know, that's not feasible.
00:16:53
Speaker
Even with, like, MMM, right? It can be difficult to really understand the impact of organic if there's not enough volume or variation in the data sets.
00:17:05
Speaker
But they should still have some signals. And those signals can even be qualitative signals. It can be, you know, asking leads what articles they read. Now, I've just said something top of my mind.
00:17:17
Speaker
But, you know, it can be other types of signals. But they should still have something that helps them understand what is working and what's not working.

Budget Allocation: Paid vs. Organic Marketing

00:17:27
Speaker
Sure. As you look at kind of, you know, early-stage startups that are growing right now, I see more folks thinking about organic as a little bit more foundational and even investing in that.
00:17:43
Speaker
And I wonder, just to give a hypothetical, right, if I'm an early-stage B2B software company and I have a $100,000 monthly
00:17:54
Speaker
budget, and I say, you know what, Barbara, I want to allocate $50K of this to paid. I want to do some paid search and some Facebook. I want to allocate $50K of this to organic.
00:18:05
Speaker
How should I think about measuring that? How would you kind of approach something like that, where you have these two different things? Is it that we look at the highest level of, like, okay, are we getting traffic? Are we getting, you know, email submits? Is revenue growing? And then we measure each of these different areas completely separately?
00:18:29
Speaker
Is it something different? I just wonder, like, tactically what you would think about that type of situation, because I think there's more and more folks that... and like even our team, which is more at scale, want to invest more in organic, but want to have a way to understand, like, what is the trade-off if we're spending less on paid and more on this harder-to-measure thing.
00:18:51
Speaker
Yeah. My general take is that the role of paid is not necessarily a whole separate initiative, but it's more about pouring gasoline on the fire.
00:19:05
Speaker
That's what I always tell clients. So if you're investing in organic, what I'm imagining you're saying is you're investing in content creation. You're investing in producing valuable assets.
00:19:19
Speaker
These are all assets that paid can help distribute. So, like, in the end, there isn't that much of a distinction
00:19:29
Speaker
between the two activities. You know, you can't really succeed on LinkedIn without a variety of content assets. So you can't do that in silos.
00:19:42
Speaker
Again, search: there's only so far you can go if the only thing you have is a homepage, right? You need to have use case pages. You need to have industry pages.
00:19:53
Speaker
You need to have, you know, like, SEO tools that you're offering. So all of these things are also necessary to make paid search work. So that's, and I'm not really answering your question.
00:20:05
Speaker
But I guess, you know, my answer would be those things are intertwined, and they need to work together for paid to even be successful in the first place. 100%. I think that's kind of the point. And the thing that I'm getting at is that the paid versus organic mix is a really

Triangulation in Marketing Measurement

00:20:25
Speaker
interesting one. And I think it's one that's worth optimizing and thinking through.
00:20:31
Speaker
I gave the example of a brand spending $100K, but, you know, let's take the example of a brand spending $50 million a year, seven-figure monthly budgets. Those folks could be over-invested in one area or the other.
00:20:48
Speaker
They might have, you know, all the website pages that they need. They have a little bit of a social presence and they're doing paid. And then it becomes a point of, like, what is that balance, right? Like, when you're spending seven figures a month, you're serving billions of impressions on TV.
00:21:06
Speaker
How do you think about trading off: okay, I'm going to serve 10 million fewer impressions on TV and trade it off to get a million impressions of high-quality influencer organic content?
00:21:19
Speaker
Like, that's a really interesting thought exercise that I'm having right now and trying to figure out. Yeah, yeah, that's a very difficult question to answer.
00:21:30
Speaker
But I do think the advantage of having, you know, bigger budgets is being able to run incrementality tests and have a more accurate read in the end. And then I would say I wouldn't be thinking of organic from the point of content production, but I would be looking more at the performance of individual content channels.
00:21:51
Speaker
So, you know, looking at how organic search is performing, looking into, you know, setting up control groups in email to understand the impact there, and looking at it on a channel-by-channel basis.
00:22:04
Speaker
Right. Let's talk about triangulation for a little bit. So I think triangulation is one of those words that is now kind of like a hot topic and highly debated, at least on my LinkedIn feed.
00:22:18
Speaker
Just thinking, you know, I think there's one perspective and one approach that says it's not actually about triangulation, it's about different measurement methodologies for different things, for different use cases.
00:22:31
Speaker
Maor Sadra of INCRMNTAL kind of talks about that a little bit. And then there's another approach that says, okay, for any given marketing activity, and specifically paid marketing activity, paid marketing spend change, we can look at multiple methodologies to evaluate its effectiveness and determine what to do next.
00:22:53
Speaker
And then there's just folks that, you know, aren't doing any triangulation or are relying on last click. So as you think about that world, you have a lot of experience with this. You have so much measurement methodology and tools at your fingertips.
00:23:07
Speaker
How do you approach that, and how do you take advantage of these different methodologies and tools? I think the most important thing first is what channels are you running.
00:23:21
Speaker
Because I've seen brands that were pretty much only running search, and they had $300,000 monthly spend and it was all search. If that is your situation, then you're fine with using a click-based model, because search drives clicks.
00:23:37
Speaker
And usually they have conversions within the first session. So in most cases, you're good on measuring that part. The big question of triangulation or, you know, other forms of attribution is when you start to do strategies that don't generate clicks.
00:23:55
Speaker
That's when things get much fuzzier, when you're doing YouTube or even, you know, okay, Meta is driving clicks, but they're driving, you know, cross-device clicks.
00:24:08
Speaker
So again, you know, using a click-based attribution model won't necessarily reflect the results there. So I think in this situation, you must look at other forms of attribution.
00:24:22
Speaker
There isn't... there isn't a cop-out, because if you're only looking at click-based attribution, and I don't think it matters that much whether it's first click or last click, right?
00:24:33
Speaker
The reality is that measuring anonymous users nowadays is so difficult that, yeah, most user journeys in MTAs are going to have one, two touchpoints, right?
00:24:48
Speaker
So we also have this belief that, you know, an MTA is measuring 18 marketing touchpoints, but most likely those 18 touchpoints are going to be split between, like, six anonymous users identified by whatever platform you're tracking in.
00:25:05
Speaker
So I think the issue becomes, you know, if you're only using click-based attribution and you do YouTube and you do Meta, I can tell you that click-based attribution is going to tell you that the campaign didn't work.
00:25:17
Speaker
So you need to have other forms of attribution, or you are stuck only doing one type of marketing, because that's the type of marketing that attributes fairly on an MTA model.

Attribution Models and Marketing Impact Measurement

00:25:31
Speaker
There's a bit of absurdity to this, because, you know, I've focused a lot on this. I've built out this really complex attribution capabilities chart for our team internally, looking at each of our channels, what the attribution window is, what's being measured. And then, you know, you get into web versus app and you get into probabilistic data and modeled data, and it gets very confusing. And then things change and then new attribution models
00:25:57
Speaker
come out. And so there's this sad reality that no one really understands this stuff to a perfect level, perhaps, and taking the time to do so just doesn't make sense for many businesses.
00:26:14
Speaker
I want to ask about a specific example of triangulation, which I'm seeing right now. So: a paid social channel that we spent a lot on over the years.
00:26:27
Speaker
We are driving to app installs, right? But we have a web and app-based conversion funnel, so folks can convert on web as well.
00:26:38
Speaker
SKAN data showing, you know, high cost per purchase; according to iOS SKAN, pretty high CPAs, blended CPAs that look kind of subpar.
00:26:53
Speaker
Geo-lift incrementality test showing very, very strong lift, which counteracts the results from that, and obviously SKAN has all of its issues.
00:27:05
Speaker
So, and this is kind of looking at, you know, the channel over the same, similar time period. So not a lot of change happening there.
00:27:17
Speaker
When you think about a discrepancy like that, right, and you're just seeing different reads, how do you know what to trust, and how do you think about approaching that problem of determining where to go next?
00:27:29
Speaker
Yeah, that is a difficult problem, because the discrepancy between incrementality and MMM shouldn't be that significant, right? That's on paper the whole point of MMM, to be able to spot the incrementality.
00:27:44
Speaker
So you need to validate which one is correct here. Like, the SKAN data, that is faulty in many, many ways. So it is what it is, I would say.
00:27:57
Speaker
But I would try to investigate the MMM and incrementality and why they report different numbers. Have you done only one incrementality test? Because it could be something wrong with the data there, on the DMAs that you've selected.
00:28:10
Speaker
So it can be good to do another incrementality test. And I think if the other test shows the same result, my recommendation would be to follow those results and see where it takes you.
00:28:24
Speaker
So let's say, you know, you can add a multiplier to your bid where you're running this campaign, where it takes into consideration the incremental difference that the test is reporting.
00:28:35
Speaker
And if the budget on this strategy is significant, then if the incrementality test is right, on paper you should see more growth, right, at an overall cheaper CAC. So that's the same way that, you know, when people ask me, like, oh, how do I validate my MMM model, how do I know if it's working, like, there isn't a way to really do this, right?
00:29:04
Speaker
There isn't a simple way of checking if the model does reflect the results that you're seeing. But if you follow the model, on paper you should see a lower CAC and more growth, right? Because that's the whole point of it.
00:29:21
Speaker
And if you follow, you know, what it says, okay, the ROI here is four, so if you spend twice as much, you know, you're going to get more installs, then that's what you need to test and see where it takes you.
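As a rough illustration of folding an incrementality read into platform-reported numbers, here is a minimal sketch with made-up figures and a hypothetical function name (a sketch of the idea, not Barbara's exact method):

```python
def incremental_cac(spend: float, reported_conversions: float, lift_factor: float) -> float:
    """CAC after discounting platform-reported conversions by an incrementality factor (0-1)."""
    return spend / (reported_conversions * lift_factor)

# The platform reports 1,000 installs on $50,000 of spend; a geo-lift test
# suggests only ~80% of those installs were truly incremental.
reported_cac = 50_000 / 1_000                    # $50 per reported install
true_cac = incremental_cac(50_000, 1_000, 0.8)   # $62.50 per incremental install
print(reported_cac, true_cac)

# If the test implies an ROI of ~4, then on paper doubling spend should roughly
# double incremental installs; if blended CAC balloons instead, revisit the test.
```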
00:29:35
Speaker
Yeah, that's a great answer. I think that's exactly the way that I'm thinking about it and approaching this problem. And, you know, in terms of coming up with a decisioning framework on these different methodologies, I think MMM is great as a high-level kind of hypothesis creator, a channel budget planner, like, great at high-level stuff.
00:30:00
Speaker
An incrementality test is a way to validate the results of that. And I think, like, last touch or deterministic attribution data, click-based, view-based data, is basically like the...
00:30:14
Speaker
most straightforward and frequent data source for making the subtle nuances and changes. I wonder what your thoughts are on that. And I'll just give another example, right? Like, let's say you have a multi-channel media mix and you run the MMM, and it says, okay, you spent 20% on Facebook last quarter, you should be spending 30% of your mix here.
00:30:40
Speaker
So you make a plan to increase the budget allocation to 30%. You do that over the quarter. Maybe you run an incrementality test. And then I think you use the attribution data as this kind of guide throughout the quarter, because you're not going to get the read from that, or you might not get the read from the incrementality test right away.
00:31:00
Speaker
How do you think about approaching that based on, like, the availability of data as you're kind of operating? Yeah, I think, like, one of the things that I should have said to your original question is your MMM model should be adapted based on your incrementality results.
00:31:20
Speaker
You know, those two sources should be working together, because MMM becomes the most accurate when there's more variation in the data.
00:31:31
Speaker
So on paper, you know, if your incrementality test is saying, like, bad ROI, your MMM model should be adapting to that. So it should be, you know, yeah, like I said, maybe creating more incrementality tests for that.
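One way those two sources can "work together" in practice, shown here as a minimal sketch with invented numbers and a hypothetical function (one common calibration approach, not necessarily the one Barbara uses), is to blend the MMM's channel ROI with the experiment's ROI, weighting each by how certain it is:

```python
def calibrate_roi(mmm_roi: float, mmm_se: float, test_roi: float, test_se: float) -> float:
    """Inverse-variance blend of an MMM ROI estimate and a geo-test ROI estimate."""
    w_mmm, w_test = 1 / mmm_se ** 2, 1 / test_se ** 2
    return (mmm_roi * w_mmm + test_roi * w_test) / (w_mmm + w_test)

# The MMM thinks the channel returns ~3.0x but with wide uncertainty; the geo test
# measured ~1.5x fairly tightly, so the blended estimate is pulled toward the test.
print(round(calibrate_roi(mmm_roi=3.0, mmm_se=1.0, test_roi=1.5, test_se=0.3), 2))  # ~1.62
```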
00:31:46
Speaker
And then regarding the question of, you know, what to do as you're waiting for results, I think, in the end, it also all depends on how nimble you are with budget allocation.
00:32:03
Speaker
I think that's the first question that I would ask, because, you know, usually what I see from my clients is that they're thinking about budget distribution once a quarter.
00:32:14
Speaker
And then within that quarter, they're looking into, within the different channels, where should they be allocating that budget. So when you're thinking about where to allocate budget within a channel or within an ad platform, that's where the ad platform data is the most useful.
00:32:35
Speaker
And that can be click-based, it can be server-side, it can be white paper downloads, it can be VCR. But that's the kind of data that you'll be relying on for your in-flight optimizations.
00:32:48
Speaker
Sure. The cadence is really important here, for sure. And although a lot of MMMs now are selling more frequent calibration, more frequent updates, you know, a lot of the folks that I've talked to, from Recast, LiftLab to Paramark, are
00:33:08
Speaker
talking about how they're building this Cassandra, building this modern MMM to calibrate more frequently. However, even so, I still believe that, like, on a campaign level, on an ad level, there's not much that you can do to take advantage of MMM there.
00:33:26
Speaker
Curious what you think about that. Yeah, I've seen brands be able to distinguish in MMM, like, brand and generic search. So perhaps you can have a distinction like that, depending on your spend levels.
00:33:39
Speaker
Maybe also, like, prospecting and retargeting for something like Meta. But you're not going to be able to get creative-level insight. Sure.
00:33:50
Speaker
I wonder, so, you know, we talked about some challenges that folks come to you with, and I've highlighted a couple of challenges that I'm kind of looking

Balancing Performance and Brand-Building in Marketing Strategy

00:34:01
Speaker
at.
00:34:01
Speaker
I think an overarching theme and an overarching challenge that I have when looking at kind of mid, I would say mid-stage startups that have grown to a certain scale, are operating at scale, and have to balance the short-term performance goals of hitting their CAC targets, hitting their ROAS, with kind of long-term brand initiatives.
00:34:30
Speaker
That challenge is where measurement kind of fits squarely to solve. Is that something that a lot of your clients come to you with?
00:34:41
Speaker
Like, can you talk a little bit more about maybe some of the marketing challenges in terms of, like, objectives, brand-building goals that maybe clients come to you with?
00:34:52
Speaker
Yeah, I would say that that's a very big challenge. You know, how much should a company spend on brand campaigns? That is something that, yeah, people really struggle to identify and answer.
00:35:09
Speaker
Something that I've seen being used, and I've used for clients, is things like Search Console. You know, you're running a brand campaign; how does that impact organic brand search?
00:35:23
Speaker
How does that impact direct traffic? You can also do, you know, brand lifts on the ad platform, and you can do geo tests too. So I think there are all these small solutions, but I wouldn't say this is a challenge that I have successfully cracked.
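As a rough sketch of the Search Console idea, with invented click counts and a hypothetical helper function (the real data would come from a Search Console export filtered to branded queries):

```python
def pct_lift(pre: list[int], during: list[int]) -> float:
    """Percent change in average daily branded clicks, pre-flight vs. in-flight."""
    pre_avg = sum(pre) / len(pre)
    during_avg = sum(during) / len(during)
    return (during_avg - pre_avg) / pre_avg * 100

branded_clicks_pre = [410, 395, 420, 405, 398, 415, 402]     # week before the brand campaign
branded_clicks_during = [480, 510, 495, 520, 505, 498, 515]  # week the campaign is live
print(f"Branded organic search lift: {pct_lift(branded_clicks_pre, branded_clicks_during):.1f}%")
```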
00:35:43
Speaker
It's a difficult balance to... yeah, to find. And some of the time, indeed, it's more intuition. It's more, where's the market going?
00:35:57
Speaker
What competitors are coming? What is the motivation for our customers to choose us? There are also all these other strategies and momentum that also affect how you think about brand marketing.
00:36:15
Speaker
So I don't think there is, at least I don't have, a definite answer for that. Yeah. As you're speaking, I think about the idea that this challenge of how do we grow the brand and establish the brand maybe falls on the minds of the marketing team and the CMO.
00:36:38
Speaker
They have to think about that. And so they think, how do I solve this problem? Well, it's more brand marketing, where a lot of times what it might be is a product shift or a change to what retention might look like.
00:36:58
Speaker
And so I think a lot of times we as marketers, because we are in control of the budget, and in control of the CAC and the ROI, we face this pressure of: this number is moving.
00:37:13
Speaker
What can you guys do to fix it? Where sometimes it's not a marketing problem as much. Yeah. And sometimes, you know, with brands, it's not about being reactive.
00:37:27
Speaker
It's not about seeing what the numbers tell you and then making a reaction based on it. It's about understanding where the product wants to be and then using brand to get the company to that place.
00:37:40
Speaker
But the ways that I've, yeah, I think the ways that I've seen brand campaigns being measured, because, you know, obviously we do attempt that: geo tests, right?
00:37:50
Speaker
So DMA testing for YouTube or even for billboards. Also doing, you know, certain DMAs and then analyzing the...
00:38:01
Speaker
overall impact that that has on sessions and then trying to estimate what the ROI was. But it's still, and I mean, you can even control for the long-term impact of things with adstock effects.
00:38:17
Speaker
There's still, you know, the nature of some of this stuff where it just is very long-term and it's just really hard to measure. And there are risks both ways, right? They're both risks.
00:38:30
Speaker
You have the risk of not spending on brand. And that's something, you know, marketers talk a lot about. You know, you're only focusing on bottom funnel and you're not building the demand.
00:38:41
Speaker
And then, you know, your growth stagnates. But you also have the risk of companies thinking they need brand marketing when it's too soon, when they should be focusing on other types of initiatives instead.
00:38:54
Speaker
So it goes both ways. But it is worth, like, understanding and actually knowing that that is a risk.

Long-term Benefits of Brand Marketing Investments

00:39:02
Speaker
I think about how Alex Hormozi kind of talks about, you know, instead of investing in the S&P 500, invest in the S&Me 500, right?
00:39:10
Speaker
And it's almost like, okay, well, I should put all my money in the S&P 500 because I'm going to get 10% to 15% returns over the long term. And that's going to be fine. And that's going to compound.
00:39:21
Speaker
So I should put the, you know, couple hundred dollars or thousand dollars or whatever that I can invest every month into that thing and just be patient about it. But then there's a trade-off of, well, I could take that couple hundred or thousand dollars, invest it into courses, invest it into knowledge to grow myself, to increase my income, this type of thing. So I think it's a similar kind of conversation with performance versus brand, like,
00:39:45
Speaker
you know, I'm investing in this performance marketing because it's going to get me this compound, it's going to get me this initial result. But I don't really know what the exponential effect of investing in brand or something else could be. Yeah.
00:40:01
Speaker
Yeah. That could be the case. Yes. Cool. I know we have a few minutes left. I want to get a few hot takes from you. So one final thing: determining the correct metric to optimize at a high level.
00:40:17
Speaker
You wrote an article about this where you were comparing optimizing for a CAC, a target CAC, target CPA, to optimizing for ROAS.
00:40:29
Speaker
This is something that we're transitioning to, kind of optimizing towards predicted ROAS and LTV and looking at that more

Selecting the Right Metrics Beyond CAC

00:40:39
Speaker
holistically. As you look across your client mix and clients that you've worked with in the past, is this a big challenge, that folks are maybe too focused on CAC or too focused on, like, a vanity metric that doesn't actually move the needle? Because sometimes I see, like, there's certain changes that you can make in your mix that have just an immediate impact.
00:41:02
Speaker
And one of those is changing the metric that you actually optimize to, bid to, look at, care about. Yeah. I think it's just because it's easier to start with CAC. You know, when you're optimizing towards CAC, you're basically optimizing towards one event, right?
00:41:20
Speaker
You're counting the number of times that event occurs, and then you take all the budget and you divide by that. That is much simpler than trying to calculate the LTV or, you know, the ARPU or whatever you're looking at for a single user.
00:41:35
Speaker
But especially for, like, B2B, you have very different user types, right? Depending on what your monetization strategy is, you can have very different plans.
00:41:48
Speaker
You can have a very different number of seats. And obviously, you know, any subscription product, you have the question of churn. You have certain users that, you know, retain for 24 months and a user that, you know, subscribes and then cancels after one month.
00:42:03
Speaker
And if you're only looking at CAC, you're treating all those users as if they are the same. So you could very well be leaving money on the table if you're not considering these user differences.
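To put rough numbers on the point, here is a small sketch with invented figures (a hypothetical illustration, not from Barbara's actual case study): blended CAC treats every conversion the same, while an LTV-aware view shows where the money is.

```python
campaigns = {
    # campaign: (spend, conversions, avg_ltv_per_customer)
    "one-off feature keywords": (10_000, 500, 60),    # cheap CAC, cancels after a month
    "sticky feature keywords":  (10_000, 200, 900),   # pricier CAC, retains for years
}

for name, (spend, conversions, ltv) in campaigns.items():
    cac = spend / conversions
    print(f"{name}: CAC ${cac:.0f}, LTV:CAC {ltv / cac:.1f}x")

# CAC alone favors the first campaign ($20 vs. $50); an LTV-aware view flips it (3x vs. 18x).
```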
00:42:17
Speaker
So I think companies start on CAC because it's easier. And then they're starting to think about, how can I scale my budget while maintaining, you

Customer Value Variability in Marketing Optimization

00:42:28
Speaker
know, efficiency? And I wrote a case study for this, maybe that's what you read,
00:42:32
Speaker
for a client recently. And that's exactly where they were at. You know, they were optimizing for CAC because that was the first thing that they set up. But then I audited their paid media and I said, look, you have very different use cases that your product is offering.
00:42:50
Speaker
And then I went to look at some website data to understand what was the value of each use case. And I said, like, certain features of your product, people stay using for two or three years, whereas for certain features, they use it once, they pay to use that, and then they cancel. So if you're only optimizing for CAC, okay, maybe you have a lower CAC, but you maybe have a much lower ROAS, because the people that are worth more, you're not bidding for them, because you have a cheap CAC.
00:43:26
Speaker
Right. Yeah. It really is: the more variability in the individual customer value, the more you see those changes magnified, which makes sense specifically for something like B2B, where you could have an SMB customer who's going to be worth $1,000, $5,000 to you versus enterprise, which could be worth a million bucks. Like, are you going to pay the same amount for both of those? Yeah.
00:43:56
Speaker
Yeah, indeed. And yeah, with this company specifically, they have a B2B offering too, and that part I didn't look at, but that was just within the B2C space.
00:44:10
Speaker
You had, you know, certain keywords where the LTV was double digits, and then others with an LTV of, you know, almost four digits. And that is just a massive difference when you're talking about, you know, big budgets.
00:44:25
Speaker
For sure. Last couple of questions. This is the Efficient Spend Podcast.

Efficient vs. Inefficient Marketing Spend Examples

00:44:30
Speaker
Throughout all the clients that you've managed and all the marketing spend that you've seen, I'll start with the most efficient spend.
00:44:37
Speaker
What's the most efficient spend that you've seen? And it can be very granular or it can be high level, however you want to take it. The campaign or the channel or the thing that you saw that just consistently drove amazing revenue, amazing results. Yes, like, throughout your career. Throughout your career, what have you... was there any campaign or company where, you know, you just saw, hey, this campaign just absolutely crushed? Yeah, I one time had this client, this was like almost 10 years ago, but they were selling content,
00:45:13
Speaker
like types of content for games. So I don't even know if they still exist in this, like, AI world of ours. But their entire growth was Google Search. And that was the only thing they did.
00:45:24
Speaker
The founders spent a couple hours a week on it. And they were making like $50,000, $60,000 a week on it. And that was just from paid search.
00:45:36
Speaker
Turns out that they found a use case that was very high in need and there was no competition, and then, yeah, they really smashed it. Most inefficient spend, the biggest waste, the holy-shit-I-can't-believe-this campaign or channel that you discovered? Ah, it's a big list. I mean, obviously, you know, you have brand search, paid brand search. Who hasn't heard of a client that was spending millions a year and then turned it off and then nothing changed?
00:46:09
Speaker
Like, I feel like every marketer has a story like that. I've seen some, yeah, abysmal results for Spotify, like, really, yeah, like six-digit spend and nothing. Yeah, programmatic.
00:46:23
Speaker
You can also have the open marketplace with fraud, the weirdest websites it serves on. But yeah, it serves. I think usually the problem

Conclusion and Future Plans

00:46:34
Speaker
is it will be more within, like, paid search or retargeting, where you think the numbers look good but you just realize that they're just reaching the user right before they convert. I mean, Performance Max, you know, this could be a whole podcast on Performance Max. I can go on about that for a good hour.
00:46:55
Speaker
Yeah, it's so funny. Like, most of these wasted-spend kind of examples come with brands that are operating at scale. Like, you don't see it with someone that's spending $10K a month, because they would figure out it's not working pretty quickly. But when you're spending millions a month and you have this thing that spends $200K, you're like, oh, well, we got revenue, we're good. And that's where it comes from.
00:47:19
Speaker
I've also seen, obviously, brand search. We ran a streaming audio test with Spotify and it completely shit the bed. It was, like, really, really bad. And, like, by the way, their reps sucked. Sorry, Spotify, but the reps did not know what they were doing.
00:47:33
Speaker
Yeah. And programmatic too. So yeah, aligned there. Awesome. Barbara, thank you so much for coming on the show. I'll link your website, your newsletter, in the description.
00:47:44
Speaker
Yeah, it's been an awesome conversation. Perfect. Yeah, no, thank you so much for having me. I'll also just do a little shout-out to the attribution course I organized. I had the first cohort that I did, me and Timo Dechau, and we finished it this month, and then we're going to have another cohort coming soon.
00:48:02
Speaker
So for people that want to learn more about attribution, we cover things like MMM, MTA, how did you hear about us, vouchers, and all the different ways you can measure marketing results.
00:48:15
Speaker
Cool, we will link that in the show description. Awesome, thank you. Thanks for the great questions.