Introduction to Marketing Mix Modeling
00:00:00
Speaker
Imagine having the same thing you have with fintech solutions that let you allocate and distribute your portfolio across each stock you hold and predict the annual yield you're going to receive. Imagine having the exact same thing for marketing, for your marketing mix. That's what marketing mix modeling is. It's a guide that tells you exactly what you should do, and it lets you delegate the risk of a decision to the tool and the analysis instead of carrying that risk yourself. And generally, generally, it's more accurate than a human, because these models handle complexity better than a human brain.
Gabrielle's Background and the Birth of Cassandra
00:00:41
Speaker
Gabrielle, welcome to the show. Thank you, Paul. Thank you for inviting me. I'm really excited to chat all things MMM today. I'd love it if you could give the audience a brief background on your experience as a marketer and a founder. Definitely. So I'm Gabrielle, CEO and co-founder of Cassandra. My journey
00:00:58
Speaker
in the marketing mix modeling world started in 2021. I had previously founded a marketing agency that helps e-commerce businesses grow through performance marketing, and in 2021 I managed 12 million euros in advertising investments for my European clients. At the time, my current CTO and co-founder and I were working on the agency, and our role was to systematize decision-making inside it. We had to have procedures and tools to systematically improve clients' performance. And we could optimize most of it: CRO, experimentation, A/B tests, they're easy to systematize. But there was one thing that actually blocked us. The biggest obstacle we saw was media allocation. At the time, in 2021, iOS 14 came out, making Google Analytics unreliable
Challenges Post-iOS 14 and the Need for Cassandra
00:01:50
Speaker
anymore. So we had a lot of money under management and a lot of responsibility.
00:01:53
Speaker
We had to maximize revenue for our clients, but we suddenly had no clue how to do that. We took a look online. We saw Nielsen and Kantar selling services called marketing mix modeling. They were charging between 100K and 500K and required six months to produce an output. And when I saw that, I thought, right, this is outrageous. My client is going to churn on me if I wait six months to change something in their marketing mix. So I decided to copy exactly what they were doing:
00:02:21
Speaker
I started studying econometrics and statistical modeling and applied it to my clients, and the results were really positive. And then I thought, all right, this is something really valuable for clients, but only I, in the agency, can run this analysis. What about other marketers who don't know anything about statistics and coding but want to use these methodologies? How can they do that? At the time, there was no solution. So that's why we created Cassandra, a platform that enables non-technical marketers
00:02:51
Speaker
to analyze their marketing mix through the best practices of marketing science: designing and analyzing incrementality tests, building and running the marketing mix model, and triangulating multi-touch attribution data from the advertising platforms with marketing mix modeling, validating it all through incrementality tests, in one platform.
Evolution of Marketing Mix Modeling
00:03:12
Speaker
Amazing. MMM as a tool has become much more relevant in the past couple of years. It started in the 1950s, the Mad Men-style days, right? Funny to think about that. There are some really old folks out there who were using this to run print ads and TV on a completely different product set.
00:03:36
Speaker
And now the use cases are more for digital brands and things like that. If you had to give an overview of the landscape today, in terms of where the different categories of media mix modeling tools lie, how would you present that and think about it?
Categories of MMM Solutions
00:03:55
Speaker
I believe there are three main categories in the market right now. There's a renaissance of marketing mix modeling, because the digital tracking systems that were really easy to use are now deprecated, and we need to go back to methodologies that are more difficult but more reliable. So, three main players in the market. First, there are the big consultancy businesses, Kantar, Nielsen, and others, that run really extensive, really heavy modeling engagements and
00:04:23
Speaker
services. They cost a lot, they carry a lot of responsibility, and they can charge those prices mostly because of their authority and their branding. The second sector is open-source libraries, which have actually disrupted the market. Marketing mix modeling used to be a black box, and after the release of Robyn, and then LightweightMMM, Orbit, and others, it's not a black box anymore. Everyone can see and run the calculations behind it.
00:04:50
Speaker
Now, those are really good because they democratize everything; they show how the algorithms behind the models work. But they require statistical and technical knowledge to use. And then there's a new wave of marketing mix modeling solutions, like Recast, Mike, whom you already interviewed, Cassandra, and other players in the market, that let marketers receive these insights without a technical team running the analysis behind them. In this scenario there are two types of companies. There are companies that provide the solution directly: they build the model and then show the results through an interface, an application you can refresh over time with one click. And there are other tools, more like Cassandra, that enable the marketing team to build the models themselves.
00:05:45
Speaker
So you have your technical or marketing team connect all the data and model in-house, setting your own priors, your own calibration data, et cetera. Yeah, and it's interesting, because the outputs of this can be very valuable to marketers. However, as you mentioned, depending on your budget, your team size, and your understanding, it can be quite complex. And this varies by startup as well. What I mean is that there could be brands out there managing media mixes at scale, seven-figure monthly budgets, that only have one data analyst on their team because they haven't had to hire more.
00:06:36
Speaker
We're kind of in this boat right now, trying to understand the landscape, and you really need to have someone knowledgeable to build this stuff out for you. And one of the things I think brands might be struggling with is that when they see this new technology, or old technology being updated, they think about all the different answers it can get them, all the different problems it can help them solve. Media mix optimization is tricky, especially operating at scale, especially operating at very high levels of granularity.
00:07:18
Speaker
What I've experienced so far, in starting to onboard an MMM and think about this stuff, is that I have to be really specific and smart about what I can rely on an MMM for.
Common Misunderstandings About MMM
00:07:33
Speaker
And I don't want to over-promise to, say, an executive team on what it can do for us. When you think about use cases for a media mix model, what are some of the things it's very good for? And what are some of the things that people get wrong and mistakenly try to utilize it for? Right. We talk to
00:07:58
Speaker
an average of 30 to 60 brands per month, and there are a lot of misconceptions in the market. Some people think marketing mix modeling is a tool to understand where to allocate money first, then how to analyze creatives, understand which creatives work best, what angle of the ad works best, and look at qualitative and quantitative data simultaneously. Sadly, that's not the case. Marketing mix modeling is a quantitative decision-making tool. Its main purpose is to decide how to allocate your marketing budget to maximize revenue. Obviously, it also lets you diagnose the past, understand how ROI changed, and diagnose diminishing returns,
00:08:40
Speaker
but it's mostly used to allocate your money and see a lift in ROI from your marketing investments. Yeah, there are a lot of misconceptions out there about what it can do, how frequently you can refresh it, how frequently you can run it. You cannot use it like Google Analytics. You can use it on a monthly or weekly basis, but not with hourly granularity. And there is a lot of complexity.
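The diminishing returns mentioned here are commonly modeled in MMMs with a saturation curve such as the Hill function. A minimal sketch, purely illustrative, the parameter names and values are not Cassandra's:

```python
def hill_saturation(spend, half_saturation, shape):
    """Fraction of the maximum achievable effect reached at a given spend level.

    half_saturation: spend at which half the maximum effect is reached.
    shape: controls how sharply the curve bends (S-shape vs. concave).
    """
    return spend ** shape / (half_saturation ** shape + spend ** shape)


# Each extra unit of spend buys less effect than the last:
low = hill_saturation(50, half_saturation=50, shape=2.0)    # 0.5 of max effect
high = hill_saturation(100, half_saturation=50, shape=2.0)  # 0.8 of max effect
```

The gain from 0 to 50 units of spend (0.5) exceeds the gain from 50 to 100 (0.3), which is exactly the diminishing-returns pattern a budget allocator exploits.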
00:09:05
Speaker
And I agree with your thesis. When you're a multi-market brand investing seven figures in advertising, marketing mix modeling is really expensive to run and resource-intensive. The gap that I see, and the huge opportunity, is not in technology, not in revolutionizing things with a new algorithm; it's in solving the UX problems analysts have when scaling up these methodologies: how hard it is to adopt,
00:09:33
Speaker
how hard it is to understand, how hard it is to calibrate. That's mainly what Cassandra works on, and I believe that if we can close this gap, we can actually solve the adoption problem. Because the easier the tool is to adopt, the easier it is to educate customers, and the easier it is to get massive market adoption for something that is valuable, and everyone agrees it is.
00:09:56
Speaker
Sure. I think that's obviously something that's very important. However, I would say there is interest, at least on my end, in understanding how the sausage is made to a certain degree, meaning: what are the differentiators between models?
00:10:13
Speaker
That's something I wanted to talk to you a little bit about today. If we think about media mix modeling as a tool that takes a lot of inputs, measures an output, and then finds correlations, one of the big differentiators I've seen between tools is the utilization of spend
00:10:38
Speaker
as a primary input versus impressions, clicks, kind of more engagement based data.
Debate: Spend vs. Impressions in MMM
00:10:45
Speaker
Nielsen, for example, focuses very much on impressions. Their perspective is that impressions are more of an indicator of customer action and can find correlations that way. But other models focus more on spend. What is your perspective on this?
00:11:03
Speaker
This is really interesting. How much you spend determines how many eyeballs see your ad, and based on how many eyeballs see your ad, you have a certain conversion rate and you generate sales. Those are the main pieces. Now, when you use optimization algorithms and you want to correlate spend to impressions, and impressions to the output you're going to predict, we found by A/B testing that using spend gives better accuracy on forecasts.
00:11:32
Speaker
And we validated it in-house. Then we talked to the Meta marketing science team, because we became a partner a year and a half ago, and they supported our thesis. We're on the spend side; we only use spend to measure. Obviously there are limitations. For example, in Italy there are TV ads, and the CPM for TV ads changes a lot based on politics, seasonality, and who's buying. So
00:12:00
Speaker
what you will see is higher uncertainty on channels where politics has more influence on the CPM, versus others where it doesn't.
Budget Allocation for Hard-to-Measure Channels
00:12:09
Speaker
I think one of the promises of MMM is that it allows you to measure harder-to-measure things. It's become popular because of the loss of measurement on digital, where folks over-invested in Facebook and Google, as a result of iOS 14.5 and in response to things like cookie deprecation, which Google is now walking back on a little bit as of yesterday, when we're recording this, which is kind of ironic. But for me, as much as some of the measurement has changed on digital, it's still good enough for operating a simplistic media mix. However, what I
00:12:52
Speaker
struggle with, and I think what a lot of marketers struggle with when scaling a media mix, is thinking through how we deploy budget toward the harder-to-measure things that we know have large growth potential: influencer marketing, SEO, organic content. Some of these things, simply put, don't have easy-to-track signals, but we know they have an impact.
00:13:21
Speaker
When you think about some of these harder-to-measure things from an attribution standpoint, we can use spend as a proxy, but in a lot of these areas, and I think we should get into the nuances here, right?
Media and Organic Factors in MMM
00:13:38
Speaker
In a lot of these areas,
00:13:40
Speaker
this spend might be something that is averaged out over a time period. Like influencers: we might say we're going to spend $300,000 a quarter and then just average it out on a daily basis. Things like affiliate marketing: do you take the spend when the conversion comes in, or when you're serving the impression, right? There are all these little nuances. So how do you think about measuring and tracking some of these non-digital areas?
00:14:10
Speaker
There are, on average, three categories of factors you can include in a marketing mix model. The first is media: everything you can control in terms of how much you allocate to each media type. Affiliates are not included there, because you pay after the sale has been made. Then you have organic factors: things like emails, SMS, or WhatsApp messages sent, or affiliates, meaning how much traffic affiliates have generated for you.
00:14:40
Speaker
Then you have context factors. Context factors include internal factors like promotions, discounts, et cetera, and external factors like COVID, drastic changes in temperature, changes in rainfall, competitors' sales, and so on. The tool needs to be helpful for decision-making about media allocation and for understanding the incremental contribution of media with respect to the other types of activities you run in the market. For example, it needs to tell you the optimal discount, the one that maximizes revenue. It needs to tell you whether the seasonality you're heading into is going to have a positive or negative effect. It needs to produce a list of insights that can guide you toward making better decisions
00:15:29
Speaker
in your marketing mix activities. And the reason you need to include organic factors in the marketing mix model, not just media, is that adding them lets you isolate the incremental contribution of the organic factors. If you don't insert them, that incremental contribution will go either to the baseline or to the other media factors, and the model will show you a biased view of what's really happening in your marketing mix.
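The three factor categories described here end up as columns of the model's design matrix. A minimal sketch of how an observation might be assembled, all channel and factor names below are made up for illustration:

```python
# Three factor categories, per the discussion: media (controllable paid spend),
# organic (activity levels you generate), and context (internal/external drivers).
MEDIA = ["meta_spend", "google_search_spend", "tv_spend"]
ORGANIC = ["emails_sent", "affiliate_traffic"]
CONTEXT = ["promo_flag", "avg_temperature"]


def build_row(week, values):
    """Return one weekly observation keyed by factor name.

    Factors missing from `values` default to 0.0 so every row has the
    same columns, which the model expects.
    """
    row = {"week": week}
    for name in MEDIA + ORGANIC + CONTEXT:
        row[name] = values.get(name, 0.0)
    return row


row = build_row("2024-W01", {"meta_spend": 1000.0, "emails_sent": 5000.0})
```

Keeping organic and context factors as explicit columns is what lets the model separate their contribution from the baseline, the point made above.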
00:15:53
Speaker
One of the things I've seen with a lot of these tools is that you want to see volatility, spend changing frequently. You want to give the model a lot of changes so it can determine correlation with the KPI you're looking at. That's easy enough to do with digital, where you can pulse spend; it's a little more challenging with things like organic or influencers. How do you think about solving this problem?
00:16:29
Speaker
I'll give an example, because we're structuring this data right now. We have a brand partnership that is an annual contract. So comparing the spend there, looking at it day over day, you wouldn't see a change. Month over month, you wouldn't see a change. Quarter over quarter, you wouldn't see a change. Year over year, you would see the spend change.
00:16:54
Speaker
So is this less frequent spend change still something an MMM is good at identifying and correlating? Would it be able to compare 2023 to 2022 and find that correlation? If there is change, it's going to detect it. But the problem is, when you have a lot of variables with small variance in your inputs and small variance in your output, the model is going to have a harder time understanding which factors contributed the most and by how much. It's going to have a harder time splitting out the independent contribution of each factor, and you're going to have higher uncertainty. You can see that uncertainty in how wide the confidence interval around the ROI measurement is, and what can happen is that the ROI confidence interval becomes extremely wide when you have really, really low variance.
00:17:50
Speaker
Generally, though, there are always changes in the output variable that you can see, and based on those changes, every day, every week, et cetera, you can actually assess an incremental contribution for each variable you insert in the media mix.
00:18:04
Speaker
Do you give a recommendation on the number of inputs or granularity that a model should be given?
Impact of Inputs and Historical Data on MMM Granularity
00:18:14
Speaker
Again, I'll give a specific example. Some modeling tools can model down to the campaign level, and there are brands with thousands of different campaigns, so that level of granularity requires a lot of assumptions and probably carries a lot of uncertainty. Versus, say, Recast's perspective on this, which is max 30 inputs; anything more than that is a little too complicated to model, and there isn't enough confidence at that level of granularity. A rule of thumb, not a standard value, is:
00:18:54
Speaker
you need, on average, 10 rows, 10 time records, for each variable you insert. That's the first rule. So if you have a weekly dataset and you want to add five variables, you need at least one year of data, though that's not yet enough to model seasonality. If you want to add seasonality, you need a hundred rows at weekly granularity, and then you can model up to 10 variables.
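That rule of thumb, roughly 10 observations per variable and ~100 weekly rows before adding seasonality, can be written as a quick sanity check. A sketch; the thresholds come from the discussion above, not from any official standard:

```python
def max_variables(n_rows, rows_per_variable=10):
    """Rough cap on how many variables a dataset of n_rows can support."""
    return n_rows // rows_per_variable


def can_model_seasonality(n_rows, min_rows=100):
    """Yearly seasonality needs roughly two years of weekly data."""
    return n_rows >= min_rows


# One year of weekly data: ~5 variables, no seasonality term yet.
one_year = 52
assert max_variables(one_year) == 5
```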
00:19:15
Speaker
So based on how much historical data you have, you can include variables and add granularity. There are additional data requirements, though, that you need to meet to get better granularity. We can usually start at a campaign-type level of granularity, for example Google Search brand versus Google Search non-brand. We can split them if two requirements hold. The first is that you need low multicollinearity between the variables you insert,
00:19:44
Speaker
which means they need to move independently. If two spend series move together, it doesn't make sense to split them, because the model, unless you calibrate it through an incrementality test, will not be able to separate them and understand the individual incremental contribution of each factor. Second, each variable needs at least 10 non-zero values inside the dataset. For example,
00:20:09
Speaker
if you invest in TV for just two weeks every year, and you have only one or two years of data, the model is going to have a really hard time assessing the true incremental contribution. We need to be careful about the number of variables versus the number of rows, how much multicollinearity there is in the data, and how many non-zero values each variable has,
00:20:32
Speaker
in order to reach statistical significance on all the incremental measurements we do. Does that make sense to you? Yes, for sure. And it's challenging, because when you're operating at scale you're making micro-adjustments all the time that, as a percentage of your overall spend, might be quite low. If you're spending $10 million a month on media, you might have a Facebook account manager making $200 budget changes, right? There are all these little micro-adjustments you're doing at the isolated level
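The two split requirements just described, low multicollinearity and enough non-zero observations, can be checked directly on two candidate spend series. A sketch using plain Pearson correlation; the 0.8 correlation threshold is an illustrative assumption, not a figure from the conversation:

```python
def pearson(x, y):
    """Plain Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return 0.0 if sx == 0 or sy == 0 else cov / (sx * sy)


def can_split(a, b, max_corr=0.8, min_nonzero=10):
    """Split two spend series into separate model variables only if both
    have enough non-zero periods and they don't move together."""
    enough = (sum(1 for v in a if v) >= min_nonzero
              and sum(1 for v in b if v) >= min_nonzero)
    return enough and abs(pearson(a, b)) < max_corr
```

Two series that always scale together fail the correlation check; a channel active only a couple of weeks a year fails the non-zero check, matching the TV example above.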
00:21:10
Speaker
to improve performance based on what the CPAs are in that specific campaign. And I think marketers might think they can rely on an MMM for this, but that's simply not the case. When I think about how we're going to operationalize an MMM,
00:21:28
Speaker
I think of it much more as a higher-level channel-allocation and forecasting tool, to inform different experiments we want to run, not as a campaign-management or micro budget-adjustment
Strategic Use of MMM for Channel Allocation
00:21:49
Speaker
tool. But what's your perspective on that, as far as operationalizing an MMM? I actually agree. It's a strategic tool.
00:21:57
Speaker
Let's say you create a model with Google Ads: you have one variable that is your Google Ads spend and another that is your Facebook Ads spend. Now, in marketing mix modeling you create something called adstock: you model the effect over time of your investments in Google Ads and Facebook Ads. And there's a big thing we need to consider there. In Google Ads you have different campaign types, say Google Video and Google Search, and these two have really different adstock effects.
00:22:27
Speaker
Google Video is an upper-funnel campaign, while Google Search is a bottom-funnel campaign, which means its effect is instant: there is no delay between the moment I invest and the moment I get the maximum output. Which means, I believe, at a certain point you have to split by campaign type and by the funnel stage each spend is in, in order to model the marketing mix properly. And the quality of the marketing mix model that comes out depends only on the quality of the historical data and how independently each variable moved historically. This is why, for us, it doesn't make sense to run MMM once a year or once a semester; you need frequent refreshes over time to deliberately force variance into your data and let the model increase robustness and decrease uncertainty over time.
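Adstock as described here, the carryover of an investment's effect into later periods, is often implemented as a geometric decay. A minimal sketch; the decay values are illustrative, not figures from the conversation:

```python
def geometric_adstock(spend, decay):
    """Carry a fraction `decay` of the accumulated effect into each next period.

    decay near 0: effect is instant (bottom-funnel search).
    decay near 1: effect lingers for many periods (upper-funnel video).
    """
    adstocked, carry = [], 0.0
    for s in spend:
        carry = s + decay * carry
        adstocked.append(carry)
    return adstocked


# A single burst of spend, then nothing: the effect tails off.
video_like = geometric_adstock([100, 0, 0], decay=0.5)   # [100.0, 50.0, 25.0]
search_like = geometric_adstock([100, 0, 0], decay=0.0)  # [100.0, 0.0, 0.0]
```

The contrast between the two calls mirrors the upper-funnel vs. bottom-funnel distinction drawn above: same spend pattern, very different effect curves.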
00:23:18
Speaker
Because in our experience, the model can never be used right out of the first modeling exercise. Unless you've run incrementality tests previously, you need roughly three months of an onboarding process to achieve a robust model. Let's talk about that. That's something I think is very unique about the newer stage of MMMs, folks like yourself. Nielsen, for example, charges
00:23:47
Speaker
a large amount of money for one model, and then you pay extra for updates. But that's something that could be done on a quarterly basis, right? It's not too frequent. When you're talking to marketers and setting expectations, what's your perspective on the number of remodels and the process by which the tool gets smarter, so that you then have more confidence in the results?
00:24:15
Speaker
Our onboarding experience works this way. In the first month we set up the KPI connectors, have the data cleaned automatically, and then create the first model, which we share with the client. After we share it, we set expectations on how to interpret the insights. It's really good to assess R-squared, the accuracy of the model, but you also need to assess the level of uncertainty and the risks of following these insights in real life. There are always risks. As Michael said in a post from Recast, marketing decisions are not deterministic; they're more like bets.
00:24:48
Speaker
You make a bet to get a positive outcome, but there is always risk next to it. And I very much agree with that, because you need to understand the risk associated with making decisions on each measurement you do. And our role as a marketing mix modeling provider is to decrease that risk as much as possible.
00:25:09
Speaker
And we do that like this: by the second month, we run an incrementality test on the most uncertain channel. At the end of the second month we refresh the model and use the output of the incrementality test, we use GeoLift, but you can use conversion lift or any other type, to calibrate the new period. Clicking refresh, the model refreshes, improves in accuracy, and decreases in uncertainty. We repeat this procedure until we have
00:25:32
Speaker
confidence intervals that are acceptable for the ROI measurement, and the forecasts the model runs actually match reality with a 90% accuracy rate.
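One way to express that "forecasts match reality with 90% accuracy" stopping condition is via mean absolute percentage error. A sketch; the 90% threshold is the figure quoted above, but the choice of metric is my assumption, not a stated detail of Cassandra's process:

```python
def forecast_accuracy(actual, predicted):
    """1 - MAPE: average share of actual revenue the forecast got right."""
    errors = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 1.0 - sum(errors) / len(errors)


def is_robust(actual, predicted, threshold=0.9):
    """Stop the calibrate-and-refresh loop once accuracy clears the bar."""
    return forecast_accuracy(actual, predicted) >= threshold
```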
Case Studies: Success Stories with MMM
00:25:44
Speaker
Using a media mix model in conjunction with incrementality or geo-lift tests is something that's becoming more popular now.
00:25:53
Speaker
That's one of LiftLab's major value props as well. However, I do see some challenges with it. So think about a media mix optimization process without a media mix modeling tool available.
00:26:09
Speaker
What a marketer is going to want to do is optimize toward conversions or revenue acquisition. In the absence of MMM, the "correct" decision would be: I'm going to spend my money in the places that have the best CPAs from an attributed perspective. That's going to lead you to optimize into paid search, Facebook, direct response, right? Then, if you add matched-market testing, where you compare a control and a holdout group, it may still be that those lower-funnel channels do best in terms of driving a conversion action.
00:26:51
Speaker
However, when you're operating at scale, maybe you hit diminishing returns in those, and you do a matched-market test. Where I see a lot of opportunity with an MMM is in enabling you to move from the lower funnel and have more confidence in upper-funnel bets that have a longer adstock effect and a longer time horizon in which you see impact. How do you think about that? I love this question. I love this topic.
00:27:19
Speaker
The reasoning is that conversion lifts have this problem of using pixel data to attribute the performance of an incrementality test. With geo holdout tests you can choose which dependent variable to use, and we use overall sales. Which means the easiest way to validate the incremental contribution of upper-funnel campaigns in a marketing mix model is, weirdly, to assess the real incremental effect of bottom-funnel campaigns. What we do, for example, the first experiment we always run, is:
00:27:48
Speaker
let's do a holdout test on Google Search brand. Let's see how many incremental sales we really generate by investing there, because we often find that the incremental contribution the platform reports for brand search is way higher than reality. So we stop spending in certain regions of the market, and after, say, 30 days, we look at the overall change in sales in the regions where we stopped investing versus the regions where we continued. That gives us an analysis of the effect this change had on the marketing mix. We often see no change in overall sales over time, even though we stopped investing. And that allows us to calibrate the marketing mix model, putting the incremental contribution at zero, or at a really small number, as a calibration factor inside the MMM.
00:28:48
Speaker
And the incremental contribution that would normally go to Google Search will go to Google Video, or to other upper-funnel campaigns with a longer adstock. It's tough for marketers to realize that, too. If you haven't run this type of test before and then you get those results, it's shocking sometimes. And sometimes it's shocking to find spend that in fact was incremental and performant. I know we have a few minutes left. I want to talk a little bit about some of the clients you work with,
00:29:17
Speaker
and the results they've gotten from an MMM, from Cassandra, maybe in conjunction with an incrementality test as you described, and what that led them to do in terms of making different decisions. If you want to give some specific examples, great, but I'd also love to understand the commonalities you see within your client mix.
00:29:38
Speaker
Yeah. For example, we just presented, at a Meta event, a case study we did with Gineshiko, a really big e-commerce and retail company in Sweden. We helped them improve ROI by 52% by understanding, first of all, the effect over time of influencers: for how long an influencer investment has an effect on overall sales, and then how to capitalize on all the demand created with influencers.
00:30:06
Speaker
By understanding the time horizon of each campaign type they invest in, they were able first to allocate a lot of money to influencers during Black Friday, and then during December to capitalize on all that demand through performance marketing. And this enabled them not only to have a performance-based attribution system for day-by-day micro-optimization, but to have a strategic view of how much budget they needed to allocate at the monthly level, and then do micro-optimizations daily with attribution data.
00:30:35
Speaker
Having these, it's like a compass, right? You don't know if you need to do three meters on the left and then three meters on the right to go to your objective, but you know that the objective is north. You need to go there. So you have a path that you need to follow and some constraints around you. but Generally unlocking one other case study that we did assessing the true seasonality of ah a travel tech company that we're working with right now. They increased 80% their overall revenue year by year. What they did is they understood through the seasonality analysis how much they should have i'll allocate the marketing budget in performance marketing during the year. Understanding how the looking at the seasonality as a proxy that determines how much demand there is for your product over time, they've allocated the marketing budget in that way and then use the budget allocator of their marketing mix model to distribute the marketing budget according to that particular thesis.
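The seasonality-driven allocation described can be sketched as weighting a total budget by a demand index per period. A minimal illustration; the index values are made up, and a real MMM budget allocator would also account for saturation and adstock:

```python
def allocate_by_seasonality(total_budget, demand_index):
    """Split a budget across periods in proportion to a seasonal demand index."""
    total = sum(demand_index)
    return [total_budget * d / total for d in demand_index]


# A Q4-heavy demand pattern: the last period gets half the budget.
plan = allocate_by_seasonality(1000.0, [1.0, 1.0, 2.0])  # [250.0, 250.0, 500.0]
```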
00:31:29
Speaker
And that worked really well. Generally, it's not that difficult to get to these kinds of results when using marketing mix modeling for the first time, because for the clients we work with, the alternative is generally nothing at all. Let's look at that influencer example. Operating a media mix, I have to decide between allocating a certain amount of spend to influencers or, potentially, to an app network.
00:31:59
Speaker
In the absence of an MMM, I look at the app network and think: I'm going to spend this amount, I have really low CPMs because it's an app network, I'm going to drive a lot of impressions and a lot of clicks, and I'm going to see some installs come in from SKAN. Maybe I'm under-attributing there, but I have confidence that I'm going to see performance. I can compare that to influencers, specifically organic influencers.
00:32:25
Speaker
I'm going to allocate that same amount of budget. I can estimate the impressions I'm going to serve. Say it's a YouTube sponsored ad: I can see what that is, I can calculate CPMs, and maybe I put a link on the YouTube video with a gift code so I can attribute that way.
00:32:46
Speaker
From a CPA perspective, however, that's going to be rough. Attributed conversions are going to be much lower, and the CPMs on the influencer side are going to be higher. So I'm going to pay more for fewer impressions, and I'm going to pay more for fewer attributed conversions.
00:33:02
Speaker
That's if I'm just looking at attributed data. The media mix model output allows me to correlate that spend more effectively and see what the real, true revenue impact would be. And so I make that bet a little bit differently. It's something I've been thinking a lot about: impression quality.
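The gap between attributed CPA and MMM-derived incremental CPA described here can be made concrete with a small numerical sketch. All figures below are invented for illustration; in practice, the "incremental" conversion counts would come from an MMM's channel-contribution estimates rather than click-based attribution.

```python
# Hypothetical numbers showing how the attributed-vs-incremental CPA
# comparison can flip the decision between two channels.

channels = {
    # channel: (spend, attributed_conversions, mmm_incremental_conversions)
    "app_network": (50_000, 5_000, 1_200),  # cheap CPMs, heavy over-attribution
    "influencers": (50_000,   800, 2_500),  # few tracked clicks, large true lift
}

for name, (spend, attributed, incremental) in channels.items():
    attributed_cpa = spend / attributed      # what click attribution shows
    incremental_cpa = spend / incremental    # what the MMM estimates
    print(f"{name}: attributed CPA = {attributed_cpa:.2f}, "
          f"incremental CPA = {incremental_cpa:.2f}")
```

With these made-up figures the app network wins on attributed CPA (10.00 vs 62.50), but the influencers win on incremental CPA (20.00 vs 41.67), which is exactly the kind of reversal the speaker is describing.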
00:33:26
Speaker
And the fact that optimizing for the lowest-CPM areas is actually not the right way to think about it.
Importance of Impression Quality in MMM
00:33:35
Speaker
You want a certain impression volume overall, but what the diversity of that impression mix looks like is really interesting. If I'm operating at scale, it might make sense to trade off and get fewer, really high-quality impressions from an influencer who's going to validate my brand, and that's going to result in revenue. And now I can do that with an MMM. I agree, a hundred percent. One thing, though: you need the right North Star metric to optimize for, because most marketers, or at least a lot of them, are evaluated on CPA. When we show them incremental CPA or incremental ROI, it's a new metric they've never seen before. So there is a big shift that needs to happen in the market, in which marketers, the professionals in this space, need to start
00:34:27
Speaker
questioning the metrics they've been using so far, and start considering others like incremental ROI, incremental overall sales, overall MER, or any other metric that can mitigate the risk of attribution assigning the wrong sales to the wrong channels.
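The alternative metrics named here are simple ratios once you have the inputs. Below is a minimal sketch with invented numbers: MER (Marketing Efficiency Ratio) is blended revenue over blended spend, while incremental ROI uses an MMM's estimate of the revenue a channel actually caused rather than what attribution credits it with.

```python
# Sketch of the metrics mentioned above, with hypothetical figures.

# Blended, account-level view: total revenue over total marketing spend.
total_revenue = 500_000
total_marketing_spend = 100_000
mer = total_revenue / total_marketing_spend  # Marketing Efficiency Ratio

# Channel-level view: incremental ROI uses the MMM's estimate of the
# revenue this channel actually caused (an assumed figure here).
channel_spend = 20_000
mmm_incremental_revenue = 30_000
incremental_roi = mmm_incremental_revenue / channel_spend

print(f"MER = {mer:.1f}, incremental ROI = {incremental_roi:.1f}")
```

Because MER ignores channel attribution entirely and incremental ROI leans on the model's causal estimate, neither depends on click-based attribution assigning sales to the right channel.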
00:34:47
Speaker
Right. I think healthy skepticism is great. Awesome. Just closing out here: if you're speaking to an audience of performance marketers, what are a couple of things you really want us to take away from this conversation, things we should be thinking about as we manage our mix?
Future Adoption and Strategic Value of MMM
00:35:16
Speaker
I'm a marketer too, so I'm in that particular group. Imagine having something in your everyday workflow that can make decisions for you with higher accuracy than you can. Imagine having the same thing you have with FinTech solutions that let you allocate and distribute your portfolio across each stock you hold and predict the annual yield you're going to receive. Imagine having the exact same thing for marketing, for your marketing mix.
00:35:44
Speaker
This is what marketing mix modeling is. It's guidance that tells you exactly what you should do, and it enables you to delegate the risk of a decision to the tool and the analysis instead of taking the risk yourself. And generally, it's more accurate than a human, because these models handle complexity better than a human brain. So marketing mix modeling is something that will see massive adoption once it becomes really easy to adopt and really easy to understand. And I strongly believe there's a lot of content, between what you've created and what other YouTube channels have created, that can help marketers understand what we're talking about, implement it, and see how cool it is to have a machine learning model where you click a button and it tells you exactly how to allocate and predicts how many sales you're going to generate. Awesome. Gabrielle, thank you for being on the show. Thank you. It's been a pleasure.