
Lessons from Building LiftLab Media Mix Modeling and Experimentation Platform | John Wallace

S1 E33 · The Efficient Spend Podcast

SUBSCRIBE TO LEARN FROM PAID MARKETING EXPERTS 🔔

The Efficient Spend Podcast helps start-ups turn media spend into revenue. Learn how the world's top marketers manage their media mix to drive growth!

In this episode of The Efficient Spend Podcast, John Wallace, founder of LiftLab, explores how combining media mix modeling with experimentation leads to more reliable marketing decisions. John shares lessons from building attribution systems, why marginal ROAS should be every marketer’s north star, and how to balance model outputs with human instinct. He also unpacks the role of AI in media planning and what marketers often get wrong when testing new channels.

About the Host: Paul is a paid marketing leader with 7+ years of experience optimizing marketing spend at venture-backed startups. He's driven $250M+ in revenue through paid media and is passionate about helping startups deploy marketing dollars to drive growth.

About the Guest: John Wallace is a marketing measurement expert and founder with 24 years of experience in econometrics, attribution, and experimentation. He previously founded DataSong and now leads LiftLab, where he helps brands like Sephora, Thrive, and Skims optimize media spend with science-driven insights. John is passionate about building tools that bring clarity, trust, and economic discipline to marketing decisions.

VISIT OUR WEBSITE: https://www.efficientspend.com/

CONNECT WITH PAUL: https://www.linkedin.com/in/paulkovalski/

CONNECT WITH JOHN: https://www.linkedin.com/in/johnwallace-

EPISODE LINKS:

https://liftlab.com/about/
https://www.amazon.com/Market-Response-Models-International-Monographs/dp/0792381585
https://liftlab.com/blog/maximizing-marketing-with-the-ana/
https://www.invoca.com/blog/media-mix-modeling

Transcript

Introduction to LiftLab's Profitability Focus

00:00:00
Speaker
The inputs that we want to provide are things like marginal ROAS. You can derive profitability directly from that ROAS. I like to say that LiftLab does not own the P&L.
00:00:10
Speaker
And what we've observed across a lot of marketers is that quite often you may start the fiscal period focused on profitability, but everyone throws that out the window if you're toward the end of the period and haven't hit the growth number.
00:00:23
Speaker
And so the ability to tool teams up to walk into a conference room or a boardroom and say, I can deliver X if you give me Y, and I can deliver Z more if you give me more Y. That is what we're excited about helping marketers to do.
00:00:47
Speaker
John, welcome to the podcast. Hey, Paul, thanks for having

John's Sabbatical and Inspiration for LiftLab

00:00:50
Speaker
me. You ready to talk about mix models? Because I wake up every day and think about it. This is something that is completely unrelated to media mix modeling and LiftLab, but I think you can maybe find a way to tie it in. When I was looking at your LinkedIn profile doing research for this podcast, I was excited to see that you took a two-year sabbatical, a mini retirement break from work, before starting LiftLab.
00:01:17
Speaker
And I wonder if you could just share a little bit about what you did with that time, how you spent it, and how that ultimately led to building LiftLab. Thanks for the curveball.
00:01:29
Speaker
The very first words out of my mouth to talk about that time are that I was a stay-at-home dad. So I spent a couple of years, not with babies, they're teenage sons, really spending a lot of time with them.
00:01:43
Speaker
We set up some projects that we did together. We actually restored a 1952 Ford pickup truck from my grandfather's farm, so that was a total nostalgia-and-bonding kind of moment.
00:01:54
Speaker
And in my case, it was on the heels of a long run as an entrepreneur. Those entrepreneur days, when my former company was called DataSong, really added up in terms of my presence, even my physical shape.
00:02:10
Speaker
I did spend a lot of time exercising and working on projects with my sons. We're going to talk about remodeling during this call, and I remodeled a home. These were all ways that I wanted to spend some time and get a reset.
00:02:23
Speaker
I did pursue some non-marketing data science passion projects. I worked a little bit on trying to accumulate data sets of agricultural yields,
00:02:34
Speaker
to see, could we find ways to increase yields in the field? Could we feed more people with the same resources? That was a passion project. I did a little other work that some of my network knows about; I hit people up when I was playing with, could we train better models for a taste recommender?
00:02:52
Speaker
So most recommendation engines are about, say, your music tastes, right? And that problem's pretty elegantly solved by the algorithms in Spotify and Apple Music. And this was about the taste for your mouth.
00:03:05
Speaker
So, Spotify for the mouth, not the ears. And we were using data sets on literally the chemistry and gas chromatography of what's going in your mouth, and deciding, would that help us predict what else you would like to eat from a menu? But this all led me back to what I know really well:
00:03:25
Speaker
the trials and tribulations of getting the most out of an ad budget. The data sets that were going into the multi-touch attribution that was part of DataSong, the company that I sold, were essentially disintegrating slowly over time.
00:03:41
Speaker
And I felt like, what if we were to kind of just clean the decks and think about this same problem of allocating paid media budgets efficiently using a completely new data set?

Commitment to Building LiftLab

00:03:52
Speaker
And that was the nexus of LiftLab. That's an awesome story. And it sounds like you were experimenting, tinkering with a number of intellectual passion projects to kind of feed that.
00:04:05
Speaker
And then committing to one with LiftLab. What was that thought process like? The reason why I ask is you go from running a startup, working all the time, intense focus on one thing, to focusing on a number of different things.
00:04:26
Speaker
And then, I'm assuming, you did well financially, you're in a good position. Why go all in on LiftLab, right? Where you're like, okay, now I have to commit to this. You've done it before.
00:04:37
Speaker
This is not something I commit to for three months, six months. I'm starting a new company; that's a commitment. It's a personal decision. One thing that happens when you've spent your career building companies is that there's a label that gets placed on you, and that is that you're not hireable.
00:04:54
Speaker
So how well would a person who's forcing themselves to make a certain type of risk-reward decision on a daily basis sit inside of a large organization, right? Part of that's a little bit self-fulfilling, in that the decisions that I've made shape where I thrive and where I succeed, and where some of my next decisions will come from.
00:05:20
Speaker
So I did a little bit of interviewing in corporate America, and I came to the same conclusion that I'm sharing: maybe I'm not that hireable. As far as the decision to work again on paid media allocation,
00:05:33
Speaker
a lot of that came from the fact that some of my former team had already started on the problem and I was advising them from the sidelines. The more that I looked at what they were doing and the more that we started to ideate, the more excited I got about the opportunity, and it didn't take long to throw my hat back in the ring.
00:05:51
Speaker
That's awesome.

Media Mix Modeling and Industry Skepticism

00:05:52
Speaker
I have been working with and using LiftLab for almost a year now and went through a pretty comprehensive evaluation process looking at a number of different vendors.
00:06:03
Speaker
One of the things that sold me on LiftLab was the hybrid approach of media mix modeling and experimentation. Can you speak to why that was such an important part of the way that you communicated the value of LiftLab?
00:06:22
Speaker
And I ask that also, interestingly, because I think that some other companies are starting to mimic that approach. Well, thanks for recognizing that. I think what we like to say is, imitation is the ultimate form of flattery.
00:06:37
Speaker
And when we went this direction, there was a lot of posturing and defensiveness in the vendor world. If you came from the experiments background, you'd try to shoot down the mix model as not necessary. And if you came from the mix model background, you'd try to shoot down the experiments as not necessary, or problematic, or not the be-all and end-all.
00:07:05
Speaker
And I think that the vision we had over five years ago, to combine these two analytic methods in a very fundamentally sound way, has now become the industry standard.
00:07:17
Speaker
Everyone has fallen on their sword and mimicked or copycatted some of the innovation, and it's a form of innovation that we welcome. What we'd like to think is that, with that many years of head start, we've been able to look at what works well and what doesn't. And we really do have what we call a trust engine.
00:07:43
Speaker
These decisions that marketers are making with models require an awful lot of trust. And if you want to take the questions in this direction, I can talk a little bit about the unified econometric framework that does this, right?
00:07:58
Speaker
What I find frustrating sometimes for marketers or buyers: I can point at lots of Medium posts about how you should triangulate across the experiments, the multi-touch or last-click attribution, and what you're seeing in your mix model.
00:08:14
Speaker
And no one gives a recipe for how to do that. Or if they do, it's kind of spreadsheet math or something like that. It turns out they're not sharing it because it's a hard problem, and there's quite a bit of IP wrapped up in it.
00:08:26
Speaker
And that's the part that I feel like we really do have nailed, and everyone will aspire to it. And look, there are a lot of innovators out there. Given enough time, I'm sure they can put something together that's solid.
00:08:40
Speaker
Yeah. I want to get into how we've started to operationalize media mix modeling in conjunction with experimentation, but I also want to hit on another important thesis, or kind of perspective, that you have that is a little bit different than other folks'.
00:09:00
Speaker
When you are a software company or an agency and you're trying to get clients, it's very easy to go in with: we will personalize for you, we will make an approach that works for your business. You want to optimize to this KPI?
00:09:14
Speaker
Sure, we can do that. We'll make it work. Just sign on the dotted line. LiftLab has been very specific that one KPI is the primary one to be looking at, and that is marginal ROAS.
00:09:28
Speaker
And that was something that sold me as well after talking to a number of vendors: just the idea, fundamentally, that the most important thing is to think about maximizing that next dollar in your mix.
00:09:40
Speaker
I wonder if you could speak a little bit about how you came to that perspective. Yeah. I'd love to say that this is just something I came up with walking down the block, but my nature was to find as many brilliant, smart people as I could
00:09:55
Speaker
if we were going to take another swing at a problem that we know pretty intimately. And this advice came from one of our advisors, Dr. Dominique Hanssens from the Anderson School of Management at UCLA.
00:10:08
Speaker
And we had a number of conversations about what we should rally around, what would be our North Star, our mantra, if you will. And the literature covers this pretty clearly.
00:10:22
Speaker
And where we innovated, and where he got excited about LiftLab as a platform, or a nascent platform at the time, was: how do we bake that into our experiments?
00:10:36
Speaker
How do we rethink experimentation for the purposes of paid media? And that I hadn't seen happen to this date, actually. I've seen that most people have taken off-the-shelf experiment designs and applied them to paid media. It's pretty convenient: okay, I can use an A/B design, A is going to be getting the ads, B is going to go dark, and I'm going to calculate the lift, that type of thing.
00:11:03
Speaker
And those tests are valid science, so I'm not saying otherwise. But where I like to redirect the thinking on this is: the question that I want to answer is, how saturated is the media?
00:11:16
Speaker
And to your question, the KPI to do that is marginal ROAS, and I'm not getting marginal ROAS out of an A/B test. So the first grown-up moment was to say, look, brands can run A/B tests for free in Meta, right? They were called conversion lift tests. They're not used as frequently now because of changes in mobile tracking and things like that.
00:11:39
Speaker
That was our competitor, if you will: free. So we felt like we needed to design an experiment approach that was made for paid media, that thought solely about paid media. And that turns out to have paid off.
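To make the distinction being drawn here concrete, below is a minimal sketch of marginal ROAS as the slope of a diminishing-returns curve, versus the average ROAS that a simple on/off lift test tends to approximate. The curve shape, parameter values, and spend levels are illustrative assumptions, not LiftLab's actual model.

```python
def channel_revenue(spend, beta=4.0, k=50_000.0):
    """Hypothetical saturating response curve for one channel:
    revenue = beta * k * spend / (spend + k).
    beta is the ROAS at very low spend; k controls how quickly it saturates."""
    return beta * k * spend / (spend + k)

def marginal_roas(spend, eps=1.0):
    """Marginal ROAS: incremental revenue from the next dollar,
    i.e. the slope of the response curve at the current spend level."""
    return (channel_revenue(spend + eps) - channel_revenue(spend)) / eps

for s in (10_000, 50_000, 200_000):
    avg = channel_revenue(s) / s   # average ROAS (roughly what an on/off lift test reads)
    mrg = marginal_roas(s)         # marginal ROAS (what the next dollar actually earns)
    print(f"spend=${s:>7,}  average ROAS={avg:.2f}  marginal ROAS={mrg:.2f}")
```

At higher spend levels the average ROAS can still look comfortably above one while the marginal ROAS has already fallen below it, which is why the saturation question calls for a purpose-built design rather than a plain A/B lift read.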
00:11:54
Speaker
I want to challenge you a little bit on that, because one of the things that we've struggled with, and I think many businesses struggle with, is that to equate a change in spend to a change in revenue,
00:12:13
Speaker
you need to make predictions around what that customer will be worth over the long term. And what we've had to do is incorporate predicted LTV models for our different product categories in order to back into a marginal ROAS, and those models inherently change and adjust over time.
00:12:37
Speaker
So part of the reason that we're doing a remodel, for example, is to update our predicted lifetime value models to be more accurate, to more accurately reflect: this person got a credit card, and we think that they're worth X over the next two years.
00:12:55
Speaker
And I wonder, when you're making a model on top of a model, if there are challenges around that, and how you think about that. What we're looking at is, hey, this marginal ROAS says it's five, but it might be more like seven, because our pLTV model doesn't have these new assumptions baked into it, right?
00:13:22
Speaker
Yeah. Well, by the way, just as a data scientist, anytime we have a model on a model, you compound the errors. It is the world we live in. And to me, I actually divide the marketing problem into two types of businesses that marketing is trying to support.
00:13:44
Speaker
There might be industry names that are better than the labels I give here, so apologies, but I think of these as transactional marketers or lifetime value marketers. And they actually have quite different economics inside of their businesses.
00:14:05
Speaker
In theory, for a lifetime value marketer, and I don't think I'm stating anything controversial or something you don't know, you're modeling those two events, acquisition and churn. I've done a lot of churn modeling in my career, and done it with what are called survival models.
00:14:20
Speaker
Actually, in your case, you could get multiple products, so you have something even more complex, which is called an intensity function, where you're trying to predict the coming and going of products. It just makes your mind bend a little bit when you think about it.
00:14:35
Speaker
Versus the transactional marketers, where we just never know if they're going to come back or not. There, in the LiftLab framework, you're modeling the first transaction separately from the subsequent transactions, because the incrementality is not equal between them.
00:14:50
Speaker
Just a parking lot; maybe we'll come back to it for the transactional marketers. The reason I distinguish between them is I see marketers conflate them quite often. And I've seen this, sorry to simplify, but someone worked at a subscription business, they walked around talking about CAC and LTV, and they bring that over to a transactional marketer when they get a new job.
00:15:14
Speaker
Right. And then they use those metrics, but those metrics actually can hurt a transactional marketer. It's really hard for a transactional marketer; quite a lot of them have a large one-and-done problem.
00:15:30
Speaker
So to say, my CAC is $80, and then on average I'm going to get, whatever it is, $200 over a lifetime. What they're missing in that assumption is that it's going to require paid media to get them from 80 to 200, right? There's no reason I have to buy another sweater from you ever, because I can buy it from an almost infinite number of sources.
00:15:53
Speaker
And so paid media is going to have to help with that. Over in your case, where you are more of a lifetime value marketer, it is appropriate to calculate your predicted lifetime value. And now let's bring this way back to your question.
00:16:07
Speaker
You can, by the way, change your predicted lifetime value model without saying, I've changed the consumer response functions that are part of my model. So specific to your question, and this is in the weeds, but maybe there's someone listening that gets into this level of weeds as well: refreshing your revenue data and rescoring it is, to me, an independent function from needing to actually revisit the consumer response functions.
00:16:32
Speaker
So maybe it's not quite as much overhead there; that's the first-blush answer to what you're saying. What I do want to recognize is that if you start to land markedly different consumer behaviors, that will have an impact on the consumer response functions. I think that's kind of the nature of your question.
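A minimal sketch of the separation being described: the pLTV scoring of conversions can be refreshed on its own cadence while the fitted consumer response curve stays put. The function names, parameter values, and segment mix are illustrative assumptions, not LiftLab's implementation.

```python
# Assume this response curve was estimated earlier from media plan data;
# it maps spend to incremental conversions and is left untouched here.
def incremental_conversions(spend, beta=0.02, k=50_000.0):
    return beta * k * spend / (spend + k)

# A separate pLTV model turns a conversion into expected long-run revenue.
# Refreshing these scores (e.g. after a remodel) changes the revenue side
# without refitting the response curve above.
pltv_by_segment = {"credit_card": 420.0, "standard": 180.0}   # made-up scores
segment_mix = {"credit_card": 0.3, "standard": 0.7}           # made-up mix

def value_per_conversion():
    return sum(pltv_by_segment[s] * w for s, w in segment_mix.items())

def marginal_roas(spend, eps=1.0):
    """Marginal ROAS on pLTV-weighted revenue:
    slope of the conversions curve times expected value per conversion."""
    d_conv = incremental_conversions(spend + eps) - incremental_conversions(spend)
    return d_conv * value_per_conversion() / eps

print(round(marginal_roas(100_000), 2))
```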
00:16:51
Speaker
And we have a fairly active and aggressive cadence of remodeling; some of the remodeling that happens is automated.
00:17:01
Speaker
The remodeling where we're looking at consumer response functions is pretty frequent. And in a business like yours, we look at the business case of doing it even more frequently.
00:17:13
Speaker
I'll be excited to see what you see as actual change here in the coming days.

Transitioning from CAC to ROI Focus

00:17:19
Speaker
And let's go look at the data and let it help. I'm excited about it as well. And I'm excited about the shift that we've made from a CAC-focused organization to an ROI, revenue-focused organization.
00:17:35
Speaker
It hasn't come without challenges, of course. There's still a lot of education required, even at the C-suite level, to talk about that. Especially as goals get set around CAC but then move toward ROI, it is a little bit challenging.
00:17:56
Speaker
And one of the things that I would love to get your thoughts on as well is: we, and probably many advertisers that work with LiftLab, get the results from your model.
00:18:10
Speaker
And then there are things that they like and things that they don't like. What I mean by that is: we have a conviction that Facebook is probably driving this marginal ROAS and, yeah, we want to increase it; but you're telling me that TV is driving this marginal ROAS and, you know what, we just don't buy it.
00:18:33
Speaker
And if you were acting like a completely objective decision maker and you got the results from LiftLab, you would optimize your mix in one way. You would do what the model told you.
00:18:48
Speaker
But then there is the human opinion, perspective, philosophy, whatever, that goes in another direction. And one of the things that I've been challenged by is still believing that brand spend, for example, and certain executions need to be done, even though the model does not indicate that they should be done.
00:19:13
Speaker
And I'm just wondering how you think about that balance, right? That constant fight that we're playing between here's what the data is telling me to

Balancing Data Models and Human Intuition

00:19:24
Speaker
do. And here's kind of what I believe to be right.
00:19:27
Speaker
Wow. So there are a couple of recurring themes, I think, in the question. Let's get the tactical one out of the way. You can have low lift on something like branded search or affiliate at a high price point for your product.
00:19:42
Speaker
And the answer can come back: buy those clicks every time you get the chance. So the model may revise the lift down considerably, but it may still be in your economic interest to be a buyer of all that traffic.
00:19:55
Speaker
So I just wanted you to know, the shape of these answers is pretty dependent on the economics of the particular advertiser. I used to tell the same story you're asking about with traditional mix models, right?
00:20:08
Speaker
Where you've done all the work, you've tamed the data to the extent that you can, you've looked under every rock, and you're taking a marketing team through the results for the first time.
00:20:21
Speaker
And they're like, channel A, that looks cool. Channel B, that's a little higher than I thought. Channel C, whoa, wait a minute, I can't get there, right? Like, what's wrong with your model?
00:20:33
Speaker
And if channel C is wrong, maybe I want to revisit my opinion about channel A and channel B. Yeah. So I've seen that dynamic play out in boardrooms before.
00:20:43
Speaker
And so it's always been our mantra to actually lead with: these are the areas where the model struggled with the data that you provided to train the model.
00:20:57
Speaker
So let's have a really strong dose of honesty. Let's raise our hands and say: we could have told the model to start with a ROAS of two, and with the data set that you provided, it would always come back with the answer two; and if I told it to start with a three, it would come back with a three. There's no signal in this data.
00:21:17
Speaker
Right. And being honest and transparent about that has helped LiftLab; we're beating the team to the concern by airing it ourselves. It's kind of like the little kid who knocks over a lamp. It's a lot better to come and say, I knocked the lamp down. The punishment is a lot different than if you'd tried to hide it.
00:21:37
Speaker
And then we flag those channels as the ones to prioritize, where we're going to run our first incrementality and diminishing returns experiments, right? So if the evidence isn't in the media plan data, then we need to go generate the evidence.
00:21:52
Speaker
And that's the synonym that we use for experimentation: it's really setting up marketers to go get testimony, if you want to call that evidence, right? Expert testimony from their Meta account, or their Google account, or TikTok account.
00:22:06
Speaker
Right. What they're really doing is graduating from: here's the data I got, how far can you take it? And if we answer, we tortured the data, and this is actually a data science expression, I didn't coin it:
00:22:21
Speaker
I've tortured the data all that I can and I've made it speak, but I don't know if I believe what it said. And so being honest about those conditions and saying, we're actually going to help you remediate the data by running experiments, has really been the sweet spot between these two pieces of analytics. Let me ask you another question on the measurement side that I'm thinking about now.
00:22:43
Speaker
Obviously, media mix models generally like signal: spend, impressions, clicks, volatility, variability over a long time period. And so if you are an advertiser running a multi-channel mix and you are trying to think about how to diversify your upper funnel, for example, and I'm talking about the upper funnel because that one's a little bit more challenging.
00:23:08
Speaker
There might be a CMO listening to this who is thinking about their 2026 budget. And they're saying, we're getting a lot more search interest from AI, and I think we need to invest more in organic content on Reddit.
00:23:26
Speaker
Hey, LiftLab, how can we actually model that, right? And that's going to be an investment where you're going to spend the dollars to maybe hire a content writer or hire an agency, right? They're going to produce content over a time period, and then that's going to show up.
00:23:44
Speaker
And that trade-off might be made between that thing, which has a dollar amount, but maybe unclear impressions, unclear translation to revenue, and something like TV, which is, this is how much we invested, this is how many impressions we got.
00:24:00
Speaker
We're clearly able to model it, right? And I feel like more marketers and brands are coming to that conclusion of, hey, maybe we need to invest in some of these non-paid channels, but they're still looking to a media mix model to try to answer that question, and I wonder how you think about that. And that's very timely as of today. Well, I do think, just like I said, there are limitations to the data presented to the mix model. It doesn't matter who estimated the model, right?
00:24:32
Speaker
There are limitations to the data. There are also limitations to the actual model itself, its assumptions, the data that it's built on. And so I want to be really transparent and say, there are strategic questions that you shouldn't answer with your

Limitations of Mixed Models and Role of Intuition

00:24:47
Speaker
mix model. You're going to have to do it through your intuition, your gut, best practices, word of mouth.
00:24:53
Speaker
That doesn't go away. There's not a mix model that's replacing smart marketers. I really don't think of it that way. If anything, the mix model should be viewed as an assistant to a smart marketer, helping them make better, economically fundamental decisions in their media planning.
00:25:11
Speaker
But to me, take your example of, should we invest in content to try to increase traffic from a new source? You don't even have any history to train that model on.
00:25:23
Speaker
So we can rule the mix model out, right? Or maybe, and this one's even more fun in your example, because the data is very thin, it's highly non-stationary, right? It's ramping like crazy and doubling and tripling the amount of traffic that's coming in.
00:25:38
Speaker
So yeah, I don't think you look to a mix model to answer that. It's great that you gave me a juicy question like that, because the answer is a little more clear. But as you work through a continuum of questions: should I double my spend on creative to try to get much better creative than I have today?
00:25:55
Speaker
Right? Should I spend a lot of money on audiences because the ones that I have are stale, or the audiences I rent from Meta or Google are kind of saturated? These are questions that just traditionally don't lend themselves that well to looking at a time series of media plan data.
00:26:15
Speaker
And some of them can lend themselves to being answered through experimentation. But your hard question, I'm actually going to say you get to answer that with the best AI on the planet. It's really the marketer spinning that up and looking at it as a strategic decision. The model there is a lot more of, we'll call it an economic model, than a statistical or data science model.
00:26:41
Speaker
I love that. One of my favorite parts of my job is having this kind of lens, looking at my work, looking at life as an experiment. And I've taken that approach not only to my work, but also to my personal life.
00:26:54
Speaker
I think that a lot of marketers, or even you yourself taking the sabbatical, want to experiment with different things. Do I like this? Am I interested in this? What's this problem? And it's fun and it's engaging.
00:27:05
Speaker
You know, when we adopted LiftLab, we started to incorporate it into our media mix optimization process. To your point, we have an experimentation framework where we have data from the media mix model on our existing channels, and we can make bets about what to do with that.
00:27:28
Speaker
We can measure that in LiftLab. We've already spent enough on TV to know this is something we want to test; we're spending enough on TV to run an experiment. So that is something we can execute: we can see the data in the media mix model, we can run a geo-lift experiment, and that's fine. And then there are these smaller experiments for new channels, new ideas, new things that need to be encompassed in our larger experimentation process, but maybe they're not in the media mix model
00:28:00
Speaker
yet. So we have medium and large bets that we're making that we're looking to the media mix model for, but then if we're going to test a new channel like Reddit, well, we're not going to spend a million dollars on Reddit out of the gate.
00:28:12
Speaker
We're not going to measure it with an MMM. So let's launch it, let's use deterministic measurement, and let's go from there. What I like to say when you're going to test a new channel is that it's not appropriate to count on a mix model.
00:28:25
Speaker
I don't think the mix model is the right place to think about measuring a new channel, because that talk track goes like, well, what's the minimum number of days of data that you want? No. For me, the checklist for new channels is: go understand the channel, go run a bunch of stuff, get your creative figured out.
00:28:43
Speaker
You should fall in love with the in-platform metrics first. If those are not working for you, then there's probably something wrong. No one wants the diminishing returns curve and the economics of a poorly optimized channel.
00:28:58
Speaker
Right. So just learn how to play on that field, and you don't need a mix model to do that. You don't even need an incrementality test to do that. Once you are in love with the in-platform metrics, then I would say, do an incrementality test.
00:29:13
Speaker
And it becomes your decision whether you go dark in a small portion of the country, if these are nationwide campaigns that are already in your plan, or you flip that and say, I'm only going to light this media up in a handful of markets and see what it does.
00:29:26
Speaker
So those are complements of the same test design. And then what you learn from the incrementality test can, in our case, be placed directly into the model. So now I can take what I learned from the experiment and ask,
00:29:37
Speaker
how does that play out in a whole multi-channel media plan? And so that's a great example of where these two pieces of analytics really do scratch one another's back.
00:29:49
Speaker
What you learn from that experiment on the new channel, those answers are going to vary a month in the future if the price of your product changes, right? Or they're going to vary three months in the future if you're also running that media during Cyber Week. So the model handles that really well, whereas the experiment, which was a point in time during summer, just isn't really conducive to understanding some of the follow-on questions about how do I spend on this new channel.
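As a rough illustration of the "light up a handful of markets" design mentioned above, here is the simplest version of the arithmetic: a difference-in-differences read on matched markets. The market figures, the single spend level, and the naive math are illustrative assumptions; a real geo test would use more careful market matching and inference.

```python
# Hypothetical geo test: the new channel is turned on in a few treatment markets
# and left dark in comparable control markets. Figures are average weekly revenue.
treatment = {"pre": 120_000.0, "post": 139_000.0}
control   = {"pre": 118_000.0, "post": 121_000.0}
weekly_test_spend = 10_000.0   # media lit up in the treatment markets

# Difference-in-differences: remove each group's baseline and the common drift,
# leaving the incremental revenue attributable to the new channel.
lift = (treatment["post"] - treatment["pre"]) - (control["post"] - control["pre"])
incremental_roas = lift / weekly_test_spend

print(f"incremental weekly revenue: ${lift:,.0f}")
print(f"incremental ROAS at this spend level: {incremental_roas:.2f}")
```

The point estimate is tied to one spend level and one point in time; feeding it into the mix model is what lets it be extrapolated across seasons, price changes, and the rest of the multi-channel plan.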
00:30:18
Speaker
So for us, the answer is definitely an "and." You obviously talk to a lot of marketers, founders, CMOs. It probably varies, it definitely varies, by industry and by company stage. But I wonder, do you think that generally folks are too trigger-happy to test a new channel and don't focus enough on optimizing existing ones, or that they focus a little too much on existing channels and don't spend enough time testing the new thing?
00:30:49
Speaker
Hot take. Well, in my perfect world, they could first answer the question, how saturated are all of my existing channels? Because if they are all saturated, or have marginal ROASs less than one, then you'd better be looking for somewhere else to run your media. You should have already started looking, actually.
00:31:06
Speaker
Right? And so then if you do run that experiment on the new channel, you get a chance to see not only should I stay in this channel, but how far can I scale it?
00:31:17
Speaker
So that's my ideal world. In reality, most of these decisions are made on gut, right? Or herd mentality. Like, everyone's ramping on TikTok, we need to have our presence, right?
00:31:29
Speaker
And it's okay if in this case not everything gets laid out neatly by economic decisions like I'm laying out. It's fine for folks to go get their sea legs on TikTok, maybe get some quick wins, maybe lick some wounds, and then turn their focus to, what are the economics of this?
00:31:47
Speaker
But to your general question, brands typically do this without a model. They grew up on Instagram, let's say. Or maybe they're very bottom-of-funnel, very transactional, and they grew up on Google, right?
00:32:00
Speaker
At some point, they feel the saturation of that channel. So they go to the next channel. And then they see another round of growth and that starts to saturate and they go to the next channel. And then when they've gotten to four or five channels, they start to feel that there's a lot of overlap between these channels.
00:32:13
Speaker
And that's where the question comes: how do I treat this now as a portfolio? How do I balance the spend and mark my portfolio to market, if you want to call it that, so that I can rebalance with all the new information that I have?
00:32:28
Speaker
And so that's the crawl, walk, run that we see. And once people are in five, six, or more channels and something new comes along, I salute them, unless they're dramatically underspending on the media plan they already have.
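One way to picture the "treat it as a portfolio and rebalance" step is the heuristic sketched below: keep moving small slices of budget from the channel with the lowest marginal ROAS to the one with the highest until they roughly equalize. The channel curves, starting budgets, and step size are illustrative assumptions, not output from any particular model.

```python
# Hypothetical saturating response curves per channel (revenue as a function of spend).
curves = {
    "search": lambda s: 3.0 * 80_000 * s / (s + 80_000),
    "social": lambda s: 4.0 * 40_000 * s / (s + 40_000),
    "tv":     lambda s: 2.5 * 200_000 * s / (s + 200_000),
}
budget = {"search": 90_000.0, "social": 70_000.0, "tv": 60_000.0}

def marginal_roas(channel, spend, eps=1.0):
    f = curves[channel]
    return (f(spend + eps) - f(spend)) / eps

# Greedy rebalance: move a small slice from the weakest marginal channel to the
# strongest, holding the total fixed, until marginal ROAS roughly equalizes.
step = 1_000.0
for _ in range(300):
    m = {c: marginal_roas(c, budget[c]) for c in budget}
    worst, best = min(m, key=m.get), max(m, key=m.get)
    if m[best] - m[worst] < 0.01 or budget[worst] < step:
        break
    budget[worst] -= step
    budget[best] += step

print({c: round(b) for c, b in budget.items()})
print({c: round(marginal_roas(c, budget[c]), 2) for c in budget})
```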
00:32:42
Speaker
I salute them going and tinkering and then putting it under the scrutiny of an incrementality and diminishing returns experiment. You know, there's an element of what we do, there's the human element of what we do.
00:32:54
Speaker
And like you said, a lot of these decisions come down to there's a smart marketer and there's a tool that we leverage.

AI in Media Planning and Human Judgment

00:33:02
Speaker
There are also a lot of these thought processes and decisions which are fundamental, repeatable, right?
00:33:12
Speaker
And now, with AI becoming so much more prevalent in our space and companies now focusing on it,
00:33:23
Speaker
I have been thinking a lot more about what are the elements of the tasks and things that I do that can be automated without me thinking about it.
00:33:34
Speaker
And the budget-making decisions are definitely an aspect of that. It toes the line of: you need to have some conviction, there needs to be a human there, but there are also some very simple, if-this-then-that, rule-based things. I wonder how you're thinking about the absolute craziness that is happening in our industry right now, this pivotal point that we're in, and how LiftLab is leveraging it too, right? Because I know that other MMMs are starting to incorporate LLMs into their software, into their tooling, and I'm really curious what your perspective is there.
00:34:10
Speaker
Thanks for asking. For me, I've never felt that the models or the experiments or the combination of the two replaces a marketer. I think that they give you sound evidence that you can buy about the economics of your partners, right?
00:34:27
Speaker
The Metas, TikToks, Googles of the world. And those become an input into a decision-making process. But ultimately the inputs that we want to provide, we talked about it earlier on the call, are things like marginal ROAS.
00:34:39
Speaker
You can derive profitability directly from that ROAS. And I like to say that LiftLab does not own the P&L. What we've observed across a lot of marketers is that quite often you may start the fiscal period focused on profitability, but everyone throws that out the window if you're toward the end of the period and haven't hit the growth number.
00:35:00
Speaker
And so the ability to tool teams up to walk into a conference room or a boardroom and say, I can deliver X if you give me Y, and I can deliver Z more if you give me more Y, that is what we're excited about helping marketers to do.
00:35:18
Speaker
The role of AI in this is to further automate any of the decisions that are happening about the economics of the media. And I think of that as an AI assistant for media planning.
00:35:32
Speaker
That's our manifestation of it. So any way we can make a marketer's job easier, by having AI agents that are running on your behalf, that are taking advantage of more volatile signals.
00:35:44
Speaker
So in a use case, you could say: I had a forecast, but the price of, I don't know, non-brand search clicks just changed. How would that impact my forecast? Should I reallocate my budget? And here's the suggested reallocation.
00:35:59
Speaker
That to me is absolutely in everyone's interest. The vendors should be pushing themselves to create that kind of software. Marketers should be open to putting it into their workflow and depending on it.
00:36:12
Speaker
But let's contrast that with a much more strategic variable. Like, we're being conquested by Amazon on these keywords. How do we want to react to that?
00:36:24
Speaker
How did you get that information? Maybe an agent found it, maybe not. But that variable is not in your model, at least in the example that I'm making up. And we are still going to need the best AI on the planet to go react to that. So to me, it's a hybrid.
00:36:40
Speaker
I mean, whatever generation of it, I've been working with AI all my career. We've had lots of discussions about man versus machine. I'm as excited as anybody about this round of AI, the generative AI round, and its impacts are definitely accelerated.
00:36:57
Speaker
So for me, it's the design of how do we put that in and augment a person, make them superhuman, but still not let them let their guard down on making the kind of tough calls that an LLM can't be trained on, right? That hasn't seen the data for that type of thing.
00:37:16
Speaker
Sure. Yeah, I'm realizing that it's going to be easier to answer questions, but marketers are going to need to get better and better at asking them. And so a big part of what we're doing is developing a very comprehensive prompt library to be able to identify those questions and also have them documented in a way where people across our team can leverage them, because the most senior marketer might know how to ask a question about paid search competition, but maybe the most junior marketer doesn't. So just having that backlog is super helpful.
00:37:54
Speaker
I have a few rapid-fire questions, if you would not mind indulging me. And it's been a great conversation, by the way, John. First one: favorite book all marketers should read. I'd say Market Response Models by Dominique Hanssens. He is an advisor of ours, and he wrote the textbook on media mix models.
00:38:14
Speaker
It is a textbook and it's a little thick, so I'm saying this with a smile on my face. But if you want to know where all of the theory came from for mix models, that's the book that does it. So I'd have to say, will I make a lot of friends with my answer? There are probably some people that start that book and put it down.
00:38:33
Speaker
And there are other people that will read it cover to cover, but it should be on your bookshelf. Okay. Definitely not Friday evening reading, but something to dig into. Awesome. Second one: biggest MMM myth you'd like to debunk?
00:38:49
Speaker
That we can download all of our spend data out of ad platforms, stick it in a regression model, and magic will happen. You actually asked that question earlier: hey, I'm walking through a series of results and one of these just doesn't pass the sniff test.
00:39:14
Speaker
The most naive models will fail that test, and it doesn't take much to debunk one, right? If you only spent on TV in your lowest part of the year, and your model concludes that it can increase its accuracy by saying TV made sales go down, that model should never be shared, right? So to me, the myth is that you can do everything with data science or AI and not have a fundamental understanding of behavioral marketing and behavioral economics.
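The failure mode described here is easy to reproduce on synthetic data: if TV only runs in the seasonal trough and the regression has no seasonal control, the fit "improves" by giving TV a negative effect. The data-generating process below is a made-up illustration, not any advertiser's data.

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(52)
seasonality = 100_000 + 40_000 * np.sin(2 * np.pi * weeks / 52)    # demand cycle
tv_spend = np.where(seasonality < 80_000, 20_000.0, 0.0)           # TV only in the trough
sales = seasonality + 0.5 * tv_spend + rng.normal(0, 5_000, weeks.size)  # TV truly helps a bit

# Naive model: sales ~ intercept + tv_spend, with no seasonal control.
X = np.column_stack([np.ones(weeks.size), tv_spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"naive TV coefficient: {coef[1]:.2f}")            # comes out strongly negative

# Controlling for the seasonal driver recovers a sensible, positive TV effect.
X2 = np.column_stack([np.ones(weeks.size), tv_spend, seasonality])
coef2, *_ = np.linalg.lstsq(X2, sales, rcond=None)
print(f"TV coefficient with seasonality controlled: {coef2[1]:.2f}")
```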
00:39:49
Speaker
And so when we say the word model, we need an umbrella over all of that. And it's kind of fun, because as long as I've been in this field of data science and AI, we've always said that the very last mile is estimating the model, right? Presenting the data to the model and estimating the

Reputation as a Key Asset

00:40:09
Speaker
model. So much of this comes from what I'd like to refer to as problem formulation.
00:40:16
Speaker
And so, yeah, for me, the myth to debunk is people thinking they can skip problem formulation, because they're going to be sorely surprised. Awesome. Last question, and this is probably a deep one as well: the best career advice that you've personally received. Can you think of some of the best career advice that you've received?
00:40:39
Speaker
And if you want to share who that was from, that would be interesting as well. Well, it's going to be hard for me to remember who said it, because I think it was whispered to me in different shapes and fashions, but I'm going to answer this. I think it applies to anybody's career, but most definitely as an entrepreneur, and that is: your reputation is your only asset.
00:40:59
Speaker
You need to be treating everyone the way that you would want to be treated, or the way that they want to be treated, and really deliver, deliver, deliver. And if you do that, there's a compounding effect that comes along with it.
00:41:17
Speaker
And if you don't do that, there's a negative compounding effect that comes along with it. So I don't think you can go wrong with that advice: your reputation is your only asset. Cool. John, thank you so much for being on the show.
00:41:30
Speaker
All right. Hey, Paul, thanks for all the questions.