
How Twigeo Finds Hidden Profits for Fortune 500 Companies and Startups Alike | Kalle and Eric

S1 E16 · The Efficient Spend Podcast

SUBSCRIBE TO LEARN FROM PAID MARKETING EXPERTS 🔔  

The Efficient Spend Podcast helps start-ups turn media spend into revenue. Learn how the world's top marketers are managing their media mix to drive growth!   

In this episode of the Efficient Spend Podcast, Kalle Mobeck and Eric Qvennerstedt of Twigeo share their insights on optimizing marketing spend through geolift testing. They discuss the challenges and benefits of this method, especially after recent privacy changes, share real-world examples of how it has helped clients improve their advertising strategies, and offer insights on the importance of client trust and the impact of accurate measurement on driving revenue growth.

About the Host: Paul is a paid marketing leader with 7+ years of experience optimizing marketing spend at venture-backed startups. He's driven over $100 million in revenue through paid media and is passionate about helping startups deploy marketing dollars to drive growth.  

About the Guests: Kalle Mobeck is a marketing expert with over 10 years of experience in optimizing marketing spend at Twigeo. He has successfully driven substantial revenue growth for clients through innovative strategies and precise measurement techniques like Geo lift testing.  

Eric Qvennerstedt is a seasoned marketing professional with over 7 years of experience in optimizing digital marketing strategies at Twigeo. He has a strong track record of enhancing client growth through data-driven approaches and advanced measurement techniques like Geo lift testing.  

VISIT OUR WEBSITE: https://www.efficientspend.com/

CONNECT WITH PAUL: https://www.linkedin.com/in/paulkovalski/

CONNECT WITH KALLE: https://www.linkedin.com/in/kallemobeck/

CONNECT WITH ERIC: https://www.linkedin.com/in/eric-qvennerstedt-93612787/

EPISODE LINKS:
https://twigeo.com/  
https://www.invoca.com/blog/media-mix-modeling
https://facebookincubator.github.io/GeoLift/
https://research.google/pubs/measuring-ad-effectiveness-using-geo-experiments/

Transcript

Introduction to Geolift and Data Quality

00:00:00
Speaker
Around these geolift studies, you usually see that they're not even close to as incremental or valuable as what they look like based on tracking. High-quality data, right? So you need to make sure that you actually have that and that you collect it in the right way. So what breakdown? Should it be ZIP codes? Should it be cities, DMAs, states? It sounds simple, but when you actually get down to it, it is a little bit trickier than you maybe first think. We need to create more demand. We cannot only be here harvesting existing demand; otherwise the pool of high-intent demand is going to get smaller and smaller, and sooner or later it's going to dry out.
00:00:39
Speaker
I think it's super interesting. I think we're entering this phase of brand performance as well. Will that actually be a term that growth marketers are going to embrace?

Meet the Guests: Kalle and Eric from Twigeo

00:01:00
Speaker
Welcome to the Efficient Spend Podcast, where we help marketers turn media spend into revenue. My guests today are the folks from Twigeo, Kalle and Eric. Guys, thanks for being here. Thanks for having us. For sure. I think it would be helpful to get started just by each of you giving a very brief introduction into your experience with optimizing marketing spend, and more specifically in geolift testing, which is going to be the topic of conversation today. Kalle, maybe you could go first. Yeah, in some ways this is full circle for me. I started my career in social media monitoring, which at the time was this new thing that brands did not want to do and tried to avoid. Then I did influencer marketing, very data-driven, so I actually focused on the data and the results.
00:01:52
Speaker
And since I joined Twigeo one and a half years ago, I've been part of the geolift team together with Eric, so I work on the commercial end when it comes to our geolift offering. I'm Eric, and I have six-plus years of experience in the digital marketing space, and more specifically the app marketing space, I would say, and have been working across a broad range of different analytics roles. I've worked with brands like Duolingo and Peloton, and today I'm leading all of Twigeo's measurement efforts, where obviously geolift is a big part of that mix.
00:02:33
Speaker
For sure. And when we look at Twigeo as a brand, if you wouldn't mind just giving a brief overview of your client mix, maybe more specifically the folks that are leveraging geolift testing as well.

Twigeo's History and Expertise

00:02:49
Speaker
Yeah, that's a great question. At Twigeo, we're celebrating 10 years this year, so we've been around for a while. We started off as a growth agency and, after a couple of years, niched quite heavily into mobile subscription apps, and also a bit of
00:03:09
Speaker
e-commerce. We started off the geo-experiment program with a couple of selected clients and realized, wow, this is a big unlocker. We now do this for almost all of our clients, and we also have geolift-only clients. Yeah, adding to that as well, as Kalle said, our client roster is quite broad, but we see that geolift is very, very popular among our app clients, especially our iOS app clients, and also of course the e-commerce clients.
00:03:47
Speaker
Yes, I think that geolift, and more broadly incrementality testing, has become very popular as a result of privacy changes and will only be more relevant. Why geolift, in comparison to some other incrementality or measurement tests that you could potentially lean into? Why did you make a big bet on geolift? Our journey into all of this incrementality kind of started around Apple's ATT.
00:04:25
Speaker
And when they enforced that, with SKAdNetwork and all of those things. So before that happened, we kind of saw that, hey, there are a lot of changes to how things are tracked or are going to be tracked. So we realized that we needed more ways to properly and accurately analyze our performance across the channels. And we looked into things like MMM, first-party lift solutions, things like post-purchase surveys, multi-touch attribution, and of course geolift. All of these different tools have a lot of pros and cons, but for us, geolift was a good match as an agency, and it solved the big pain points our clients had.

Digital Channels and Measurement Challenges

00:05:10
Speaker
In terms of the channels that you currently work with, I know that you are a marketing partner with TikTok. What channels can you run geolift on at Twigeo? So as you said, we mainly work with the big ones, the giants, right? It's Meta, it's Google, it's TikTok, Snapchat. And for all of those, we are able to run, and are running, geolift studies across all those channels. Has there been conversation around opening this up to other channels like TV or out-of-home or other things like that? Is there a strategic reason why you focused on these larger digital channels?
00:05:56
Speaker
I would say it's mainly just because that's where our clients spend most of their money. We have run connected TV, a few tests there with quite good success. There's technically nothing stopping us from running it for out-of-home, TV, or anything like that. The whole geolift approach, and we'll probably talk about it more later, actually started out measuring out-of-home, TV, and things like that. So I guess it's quite funny that now the digital space is adopting these old methods.
00:06:29
Speaker
Right. I think part of the reason for that is that historically, out-of-home, TV, and other channels have been harder to measure. Now digital is becoming harder to measure. But not only that, I think there's more skepticism around the measurement that we do get in the digital channels. A lot of marketers have this experience where they run their first incrementality test and they're like, oh wait, this is what Facebook told us, and this lift test says something completely different. Do you have that experience a lot with the clients that you work with? Yes, definitely. Especially when you run some less incremental channels, if we put it like that, something that might be heavily over-attributing based on last touch. Google brand search is the biggest one, same with Apple Search Ads.
00:07:29
Speaker
Branded terms are usually a big driver if you look at UTM-based tracking or last touch. And then when you run these geolift studies, you usually see that they're not even close to as incremental or valuable as they look based on tracking. Do you have advertisers that try to run a geolift test with your methodology and then compare that to the incrementality results of different test designs? For example, they may run a pre-post test in another channel, or they may use a conversion lift study in Facebook, and then they say, this is what Facebook said,
00:08:14
Speaker
and this is what you're saying. How do you talk through and communicate that, where maybe a marketer at a high level has an understanding of, this is an incrementality test, but it's a lot more complex than that? Yeah. If we start with the pre-post comparison, we deal with that a lot for a few of our clients. They launch a channel and they see an uplift across total conversions, which is very easy to show to your CMO, CFO, whoever: hey,
00:08:50
Speaker
sales increased by this much when we launched this channel. The problem with that approach is that it's missing so many important ingredients. What if you went viral on TikTok during that period, or something like that? Then it will not hold true; you will just overestimate the value of that channel. And that is part of it. There's seasonality, and a lot of things under the hood that aren't really captured by these pre-post tests. Yes, I agree with you. And I've also seen some advertisers think about running these incrementality tests one time, once a quarter, or even a few times a year, and then using those results as the hard truth of what the incrementality is in that channel, and then applying that as a multiplier to the attributed data,
00:09:49
Speaker
which I actually think has some challenges. But I wonder, from your perspective, how do you see geolift testing, and the results from the geolift testing that you conduct, fitting into a larger media mix optimization process, whether that be informing media mix modeling or adding multipliers on what you're seeing in your last-touch or attributed data?

Integrating Geolift into Media Mix Optimization

00:10:22
Speaker
Yeah, a very good and relevant question for a lot of our clients. To be completely transparent, some of them just do the multiplier approach that you mentioned. That is good enough for them, and they are happy with that. If they run a quarterly or bi-monthly test, that is a short enough time period for them to take that assumption and just put a multiplier on the last-touch numbers. Our more savvy clients, or I guess clients with more resources who maybe have an internal MMM or something like that, can leverage these geolift results and add them, usually as priors or some other way to calibrate their MMM. That makes each test more useful, because the MMM is also running monthly, biweekly, whatever cadence they have set there.
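To make the multiplier approach concrete, here is a minimal sketch in Python, assuming hypothetical test results and channel numbers; it is not Twigeo's tooling, just the arithmetic the approach implies.

```python
# Minimal sketch of the multiplier approach (not Twigeo's tooling).
# All numbers are hypothetical.

def incrementality_multiplier(geolift_incremental: float,
                              platform_reported: float) -> float:
    """Conversions the geolift test credits to the channel, divided by
    what the platform's last-touch reporting claimed for the same period."""
    return geolift_incremental / platform_reported

# Say a quarterly test measured 600 incremental conversions while the
# platform reported 1,000 for the same geos and window:
multiplier = incrementality_multiplier(600, 1_000)   # -> 0.6

# Until the next test, scale day-to-day last-touch numbers by it:
last_touch_conversions = 250
adjusted = last_touch_conversions * multiplier
print(f"multiplier={multiplier:.2f}, adjusted={adjusted:.0f}")  # 0.60, 150
```

The multiplier then stands in for incrementality until the next test refreshes it, which is exactly the staleness risk mentioned above.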
00:11:18
Speaker
Sure. Are you able to share any specific clients, when you think of someone that's a little bit more sophisticated, and what their setup looks like? Sure. We've done a recent test with Majority, who has that setup, so that would be one I would call out. We also had a client in Europe where we actually did their MMM as well, and there we obviously calibrated it. That's a big brand, kind of the Netflix of Europe, called Viaplay. There we basically ran continuous geolift studies and fed them into the MMM, which was a very efficient way to allocate budget in a good way.
00:12:05
Speaker
Sure. If you think about the spectrum of brands and their measurement setups, I actually don't think that smaller startups that are just getting started really need to get too sophisticated, especially when they're not spending as much and basically anything they spend is going to show a lift, right? If I have a brand and I'm starting tomorrow and I spend 10,000 bucks on Facebook, that's going to be incremental because nobody knows about me. But when it's Coca-Cola or McDonald's or
00:12:45
Speaker
Airbnb, that conversation about incrementality becomes much more important. I guess if you take a look at your client mix, and I'm sure it runs the gamut, where marginally do you think there is the most value in adopting a geolift testing methodology or process? Yeah, I think you hit the nail on the head there. If you're a small client and you have one, maybe two channels, you will know the incrementality just by looking at your blended or total numbers. What we're seeing is that usually when you go from two channels to three, that's where it starts to become interesting and valuable to see that relationship.
00:13:40
Speaker
And on spend levels, I think we've never run a successful test below 5K US dollars, and that would be an absolute minimum. If you're not able to spend $10,000 on a monthly basis on a channel, I don't think it's worth it, to be honest. Are you able to share maybe a few, or maybe one specific example of a brand

Client Success Story: Heatonist

00:14:08
Speaker
that has done some measurement work with you to solve a specific challenge when it comes to understanding their incrementality, and then what those results were and how that informed and allowed them to optimize their media mix?
00:14:26
Speaker
Yeah, I think I can add a little bit to that. We have several clients, but I think a really interesting one is Heatonist. They're mainly famous, for a lot of people, for being on Hot Ones, the show where they ask spicy questions to celebrities. We started working with them way back, when it was basically farmers markets, to where they are today, so we've been part of the journey for quite some time. We started off, as Eric said, with one or two channels, and then it wasn't a big thing, but then we added channel after channel after channel. We've been looking into everything: we look at Meta, we look at TV, we look at Google search, and now, just last week, an incrementality test for an awareness campaign on YouTube. And it was actually incremental. It was like, wow.
00:15:28
Speaker
You can actually sell by doing awareness campaigns on YouTube. I think they have been a really good case. I think it also comes down to clients needing to trust the process, or trust that the method acts as the source of truth. Since we've worked with them for a long time, they are 100% on board: okay, this is it, this is the method we should leverage in order to be comfortable enough putting more spend into, or investing in, a new channel. It's super fun to work with one of those types of clients that immediately says, this is something we should do more of.
00:16:13
Speaker
When a client is more hesitant to buy in, or when a client is somebody, say a growth marketer, who gets this stuff but then has to sell it up to the executive team and tell them, hey listen, the data we've been seeing for the past couple of years in last touch is not telling the full story, what is your perspective on how to communicate that effectively, to give folks that might not have as much expertise in this stuff confidence in the data and the results? It's a good question. I come from a background working with big brands like Dolby, Samsung, Puma, and so on. I think it depends a little bit on the marketing
00:17:05
Speaker
person that you're speaking to and how open they are to this. Some companies, and I'm not going to drop names, but really, really big ones on the level of the McDonald's that you mentioned, are more like: oh, we forgot to have the call to action, and I think we spent two hundred thousand dollars on this. It was a success; we're going to call it a success. We're talking about people quite high up the funnel. In those cases, it's going to be quite tough. If you're going to go to someone who might be open to this, I think you need to simplify it. I heard something at an event last week that I thought was quite good: imagine it's a football game. How many passes do you have to make before you actually make the touchdown? Do you think it was only the person who made the touchdown that was responsible
00:18:22
Speaker
for the score? I think that is at least a good way of getting it in there. But then also, again, show the efficiency and show that you can do more things. If you're going to pitch a CFO, show them: this is how much money we could save, and you could save it now. The problem with that is that as a marketer, you do not want to lose your budget, and if you do the savings part, you might lose some part of your budget. For sure. I'm working on a paper right now where I am trying to define a new term that takes the concept of product-market fit, which is creating a product that satisfies a specific demand, and aligns that with paid marketing. I'm calling it paid marketing fit,
00:19:19
Speaker
the idea being that you should align your marketing spend to demand. What that means is the first dollars you spend should be on the highest-demand audience, and then as you scale your budget, you go into lower-demand audiences. This is different from a full-funnel marketing strategy, which would say: let's build some awareness, then let's build some intent, then let's convert them. My perspective is that demand is this naturally occurring thing in the world; for a given product or service, there are people that are going to be high intent and ready to purchase, and so you want to spend your first dollars on those.

Communicating Geolift Complexity to Clients

00:20:03
Speaker
The question then becomes what happens when you hit diminishing returns and where do you go, right?
00:20:08
Speaker
There's only a certain amount of high-intent audience. It might be a larger high-intent audience for Apple than for some very small business, but it's finite, right? The idea, though, is that as you are operating an at-scale media mix, you don't want to just continue to spend all of your money in these performance channels. You're going to hit diminishing returns, and it's not going to be effective for you. And I think geolift testing is a way to normalize some of this stuff, because it might be that
00:20:42
Speaker
an additional $100,000 spent on a reach or awareness campaign or an out-of-home campaign is going to find more incremental new users, and be more incremental, than just adding $100,000 to paid search. The paid search CPA might look better, but it's not actually finding you new users. I totally agree. And I can talk about the industry here also; it kind of gets caught up or trapped in this idea that we don't need to create demand. This is something we tackle quite often and have to go in and talk about. We need to create more demand. We cannot only be here harvesting existing demand; otherwise, sooner or later, the pool of high-intent demand
00:21:27
Speaker
is going to get smaller and smaller, and sooner or later it's going to dry out. I think it's super interesting. I think we're entering this phase of brand performance as well. Will that actually be a term that growth marketers are going to embrace? I don't know, but I hope so at least.
00:21:52
Speaker
Yeah, I think it's the first time I've heard it put like that. But I can see our clients kind of went down that road. When they come to us, they have a lot of money to spend. We help them on the more performance side, the Metas, the Googles, et cetera. And we always hit this point where we can't really spend more, because we're just hitting a wall, basically. And that's when you need to start running these awareness channels: usually YouTube, some awareness on Meta, could be TV, connected TV; TikTok, I guess, is somewhere in between there.
00:22:30
Speaker
When we run geolift studies, we look at exactly that, I would say, because you will never see these awareness channels look good with conventional tracking, right? The ad networks will not capture the value of their own awareness impact, in a sense. What we look at is, of course, whether it impacts direct sales. But we also look at, for example, how the campaign impacts efficiency on things like Google search or retargeting. So we've started to look not only at the direct effect, but at the indirect effect on other channels as well, which you always need to do.
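A hedged sketch of that direct-plus-indirect readout: the same lift calculation repeated over several outcome series (direct sales, branded search, retargeting), each against its own counterfactual. The data here is simulated and the helper is hypothetical, not Twigeo's engine.

```python
# Simulated direct and halo lift readout from one geolift test.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2024-01-01", periods=28)

def lift(observed: pd.Series, counterfactual: pd.Series) -> float:
    """Incremental volume over the window, relative to the counterfactual."""
    return (observed - counterfactual).sum() / counterfactual.sum()

# Fake observed/counterfactual pairs: direct sales get a +10% bump,
# branded search a +4% halo, retargeting no effect.
metrics = {}
for name, bump in [("direct_sales", 1.10),
                   ("branded_search", 1.04),
                   ("retargeting", 1.00)]:
    cf = pd.Series(100 + rng.normal(0, 3, len(days)), index=days)
    metrics[name] = (cf * bump, cf)

for name, (obs, cf) in metrics.items():
    print(f"{name}: {lift(obs, cf):+.1%}")
```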
00:23:15
Speaker
For sure. I think it's in these situations where you can look at data, but then you have to take a step inward and look at yourself and your own consumer behavior and how you purchase products. You're not seeing one ad and then converting; that's not how things happen. I'll give the example of Allbirds shoes. I've known about Allbirds for years. I've seen thousands of ads on TV, on Facebook, on Google,
00:23:47
Speaker
and I bought a couple of pairs over the past couple of years, right? But when I bought that pair, if I'd seen the Facebook ad last, they would have attributed all of that to Facebook, which is not necessarily the best way to go about things. Taking a transition now, I want to dive a little bit deeper into your particular methodology.

Addressing Client Challenges in Geolift Testing

00:24:11
Speaker
So, geolift testing at a high level: there are a bunch of different ways to conduct this, right? There are a couple of publishers that have their own geolift testing capabilities; Google has GeoX, for example.
00:24:25
Speaker
But you can also run this without any partner, just by doing something in-house and saying, we're going to find these DMAs that are correlated, we're going to run increases in these specific areas, do this whole test design, and then look at the results. I know one of the unique differentiators of what y'all do is the synthetic control method. I wonder if you can explain at a high level, well, maybe actually before we get into the synthetic control method: what are some of the key challenges that you see with clients when starting up a geolift test, and how do you approach that?
00:25:06
Speaker
Right, so we had a few clients who came to us after they tried this themselves, or at least designed the test themselves, and wanted to run things by us to make sure it was a good setup. The first thing is, of course, high-quality data, right? You need to make sure that you actually have that and that you collected it in the right way. For example, the basis of geolift testing, of course, is that you group users by location. So the question is, okay, what breakdown? Should it be ZIP codes? Should it be cities, DMAs, states? That's in the US; with international clients, it's even more complicated in some cases.
00:25:51
Speaker
That is one thing that you need to think about, along with how that translates to the actual ad network you're going to run it on. So it sounds simple, but when you actually get down to it, it is a little bit trickier than you maybe first think. One of the things that we kind of struggled with, and I'll give some context: I've conducted a number of geolift tests myself.
00:26:20
Speaker
One way to do this is to look at a certain conversion volume and try to find DMAs that are correlated with each other, that move in similar ways, which we've done. And then what we did was say, okay, DMA A showed this lift quarter over quarter, DMA B showed this lift, and DMA C, the control group, showed this lift. What was the difference between those? Then compare the difference between target and control to get a lift number. Simple enough, right? What that ignores, though, it ignores a couple of things, but one of the main challenges for me is:
00:27:07
Speaker
how do you pick the right DMAs? For example, we conducted a more sophisticated analysis this quarter, and it's harder to do because you don't have DMA-level spend in all of your channels, right? And with iOS, with ad network channels, it's harder to get. But what we did is try to get to a CAC by DMA. Advertisers can look at a ROAS by DMA, or whatever, some sort of KPI. And then what we're saying is, let's pick the DMAs that have the best performance. I don't want to increase spend in New York if my CAC is really high there; I'd rather pick another DMA to test in.
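For reference, the naive matched-market read described here boils down to a difference-in-differences on growth rates. A toy version with made-up figures:

```python
# Toy version of the naive matched-market comparison (all figures made up):
# compare period-over-period growth in the target DMA against a control DMA.
import pandas as pd

df = pd.DataFrame({
    "dma": ["target", "target", "control", "control"],
    "period": ["pre", "post", "pre", "post"],
    "conversions": [1_000, 1_300, 800, 880],
})

pivot = df.pivot(index="dma", columns="period", values="conversions")
growth = pivot["post"] / pivot["pre"] - 1      # growth rate per DMA

# Difference-in-differences style read: growth beyond what control showed.
lift = growth["target"] - growth["control"]    # 30% - 10% = 20 points
print(f"target {growth['target']:+.0%}, control {growth['control']:+.0%}, "
      f"lift {lift:+.0%}")
```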
00:27:52
Speaker
How do you think about incorporating performance data into a geolift test, to know that you're targeting the right DMAs, but then also making sure that those DMAs are correlated and that you're going to have good test design?

Synthetic Control Groups and Machine Learning

00:28:09
Speaker
Right. At Twigeo, we built our own engine, or solution. What we do is take the client's conversion data, whether that's installs, revenue, whatever, by location, and look at exactly what you say: we look at correlations. We run that through our machine learning engine to look at correlations
00:28:31
Speaker
and to make sure that the two groups are the same, right? So New York and LA are similar to a combination of, say, Chicago, Washington, or Miami. That's step one. And when we create these correlations, we obviously need to make sure that they are stable over time. So we do cross-validation to make sure that it's not just a fluke that these places are correlated, but that we can actually predict they will behave the same in the future, on unseen data. So that's more the math part of it, I guess.
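A simplified sketch of that screen-then-validate step: rank candidate control geos by pre-period correlation with the target geo, then check on a held-out window that the match still predicts well. The data is simulated and the scaling model is deliberately crude; Twigeo's actual engine is not public.

```python
# Simplified market matching: correlation screen + holdout validation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
weeks = 52
trend = rng.normal(1_000, 50, weeks).cumsum() / 10   # shared demand trend

geos = pd.DataFrame({
    "new_york": trend + rng.normal(0, 5, weeks),
    "chicago":  trend + rng.normal(0, 5, weeks),     # close match
    "miami":    trend + rng.normal(0, 40, weeks),    # noisier match
})

train, holdout = geos.iloc[:40], geos.iloc[40:]
target = "new_york"

for candidate in ("chicago", "miami"):
    corr = train[target].corr(train[candidate])
    # Crude scaling model fitted on train, scored on unseen weeks.
    beta = train[target].mean() / train[candidate].mean()
    err = ((holdout[target] - beta * holdout[candidate]).abs()
           / holdout[target]).mean()
    print(f"{candidate}: train corr={corr:.2f}, holdout MAPE={err:.1%}")
```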
00:29:10
Speaker
Then we also look at what you alluded to. We also need to go into the ad networks, when we can, to make sure that Meta, for example, has actually spent money in the places where we want to spend, because if we do these exclusions and we don't spend there, it's useless to run the test. So we need to make sure that Meta has previously seen at least some type of efficiency in these areas. Sure. And what are your thoughts on deciding on DMAs based on not only what Meta is doing, but holistically what performance looks like in a given DMA, and then also what market penetration looks like in a DMA? Because that's a big thing as well. We talked about diminishing returns, right? And maybe that's part of the correlation, but, you know, maybe New York
00:30:06
Speaker
and Austin move together. But in New York, we've already spent a lot of money, we're at a high volume, and we're close to diminishing returns, and Austin's a smaller DMA that we haven't spent as much in. How do you account for that? I would usually say that we do not account for that too much, to be honest. What we do is make sure we have a big enough number of DMAs, so that it is not as impactful if that happens. It is a factor, but usually we also trust the data to show these things; it will tell us whether they're correlated over a long period of time.
00:30:46
Speaker
And of course, when we run these test designs, we get a lot of different options that have the best statistical probability of being a good test. We obviously run these by our marketing team and the client's marketing team. And it's very common that clients have opinions about the very big areas like New York and LA, so in some cases we exclude them, and in some cases we force them to be included in the tests. So it's marketing intuition combined with the data. How do you think about running a targeted geo test where you are saying, I'm going to spend incremental budget in this area, versus, we want to reduce? I'll give you a specific example.
00:31:37
Speaker
Many brands have a media mix where they have a certain amount of budget allocated towards experimental channels, new tests, things like that, for which I think you can run geolift and say, I want to increase, and this is how we're going to do that. But then there's also the scenario of, we know we have to reduce X amount this quarter, right? Seasonal brands might spend a lot in Q4 but then have to cut down in Q1. How do you think about applying geolift to that? Do you do a lot of exclusion tests where, for example, we know that Q1 is going to be slower, so we're going to exclude some DMAs in paid search while also reducing, and get a sense of incrementality that way?
00:32:23
Speaker
Yeah, absolutely. I would say maybe a third of our tests are like that. It's very common when we do that, as you said, with seasonality, but usually when we want to test their core channels. Let's say Meta is their biggest channel. It's very dangerous for them as a business to exclude half of the areas they target; the leadership will say, no way, we can't risk that business impact. So that's how we started doing what we call inverse tests. Instead of launching something in a few places, you exclude marketing in a few areas. The good thing with that is that you can exclude a smaller audience.
00:33:11
Speaker
Usually, if you do a classic geolift study, maybe you need to run in, let's say, 60% and run nothing in 40%. With this inverse approach, we've been able to get that down to 10%, so you basically run in 90% of the US and only pause in 10%. It's been very popular. Yeah, that's a good question, actually. What our approach has been is more like, we run nationally, and then when we're doing these geolift tests, out of the 200 DMAs we'll pick four to five.
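A back-of-envelope sketch of how an inverse (go-dark) test is read, with purely hypothetical numbers: incrementality shows up as the shortfall in the paused geos versus their predicted counterfactual.

```python
# Back-of-envelope read of an inverse (go-dark) test; hypothetical numbers.
predicted_dark = 5_000   # conversions the paused geos were forecast to get
observed_dark = 4_200    # what they actually got with ads off
spend_saved = 30_000     # budget not deployed in the paused geos

incremental = predicted_dark - observed_dark          # 800 conversions
cost_per_incremental = spend_saved / incremental      # $37.50
print(f"incremental={incremental}, "
      f"cost per incremental conversion=${cost_per_incremental:.2f}")
```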
00:33:49
Speaker
The DMA tests or the lift tests that you're doing, I guess, are a little bit broader; they're kind of splitting your spend by 50% or more. Is that what that looks like? If we do a classic one, yes, for sure. It's easier to get a good test, just statistically, if you have a more balanced test and control group, because you have more places to choose from when you create these synthetic control groups. But of course, we need to take into consideration that it's marketing we're doing; we're impacting the business. So it's always a business-versus-science question. Sure. Okay. So the idea is essentially that the larger
00:34:34
Speaker
and broader the locations you're targeting or excluding, the less room there is for these small variances of, we might be better in this DMA versus that DMA, or this city versus that city. You're trying to even out, balance out, that stuff. Exactly. For example, in the US we have DMAs, I think around 200 of them,
00:35:08
Speaker
and it's going to be so much easier for our method to find a good test and control group if it has, let's say, 100 DMAs to choose from. What the synthetic control group means is that all of these 100 control DMAs are included in the holdout, but they have different weightings, which are based on this cross-validation, et cetera. It's just a fancy machine learning or regression analysis that creates a model out of all of these DMAs, and we weight them based on creating the best model, basically. And the data that's going into that is basically the brand's own conversion data, unattributed data? Correct, the totals, to see what conversion behavior looks like across different places. Sure.
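A toy version of the weighting idea Eric describes: fit non-negative weights over many control DMAs so their blend tracks the treated DMA's pre-period conversions. Simulated data; one common formulation uses non-negative least squares, which is what this sketch assumes, not Twigeo's actual engine.

```python
# Toy synthetic control via non-negative least squares on simulated data.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(7)
pre_days, n_controls = 60, 100

controls = rng.normal(100, 10, size=(pre_days, n_controls))
true_w = np.zeros(n_controls)
true_w[:3] = [0.5, 0.3, 0.2]          # treated geo really is a blend of three
treated = controls @ true_w + rng.normal(0, 1, pre_days)

weights, _ = nnls(controls, treated)  # minimize ||Cw - t|| subject to w >= 0
synthetic = controls @ weights

rmse = np.sqrt(((treated - synthetic) ** 2).mean())
print("largest weights:", np.round(np.sort(weights)[-3:], 2))
print(f"pre-period fit RMSE: {rmse:.2f}")
```

In a real test, the fitted weights would then be frozen and the weighted control blend projected forward as the counterfactual for the treatment window.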
00:36:00
Speaker
I think that can be contrasted with what we're looking at, which I think is interesting for advertisers to think about, because some of the things we're thinking about are: how do we spend enough in a given DMA to get the results that we want? What is the impression volume, or what is the frequency, that we have to hit in a given DMA to get lift? Do you have any perspective on that more micro kind of testing?
00:36:41
Speaker
When we design the test, our machine learning algorithm gives us the minimum budget you need to spend, based on estimates of, let's say, return on ad spend or cost per install, or whatever the metric would be. And that is based on how stable your data is. Let's say we have a perfect model where New York and LA are perfectly modeled by some other states; that means you can spend very little money and still see the incrementality, whereas if you have noisier data, you will need to spend more to get a good read on the true incrementality. Do you have any examples of clients that have run these tests and made very bold changes as a result of what they saw?
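A rough sketch of that budget-sizing logic under textbook power-analysis assumptions; the exact procedure Twigeo uses isn't public, but the intuition is the same: the noisier the counterfactual fit, the larger the detectable effect and the minimum spend.

```python
# Rough budget sizing under standard power-analysis assumptions; every
# number here is hypothetical.
import math

residual_sd = 40        # daily noise of the counterfactual fit (conversions)
test_days = 28
baseline_daily = 1_000  # expected daily conversions in the test geos
expected_cpa = 25.0     # assumed cost per incremental conversion

# SD of the summed treatment-vs-counterfactual gap over the window,
# treating daily errors as independent.
total_sd = residual_sd * math.sqrt(test_days)

# ~80% power at 95% two-sided confidence needs roughly (1.96 + 0.84) sigma.
mde_conversions = (1.96 + 0.84) * total_sd
mde_pct = mde_conversions / (baseline_daily * test_days)
min_budget = mde_conversions * expected_cpa

print(f"detectable lift ~{mde_pct:.1%} "
      f"({mde_conversions:.0f} conversions, ~${min_budget:,.0f} spend)")
```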
00:37:35
Speaker
Yeah, I can go into that a little bit. For example, the client we mentioned, the big streaming service Viaplay: we helped them with both MMM and the geolifts. They were quite early into looking at, is TikTok incremental? Is that a channel we should bet more on? We started off small, in a big country in Europe called Poland, next to Germany, where we did this geolift test and realized, wow, TikTok is a really good channel for us.
00:38:15
Speaker
And they became our biggest spender on TikTok after that, because they saw the power of the channel. If you compare what they saw in the TikTok ads platform to what they saw with our geolift, it was like night and day. So I think we've seen that.

Case Study Insights and Budget Reallocation

00:38:38
Speaker
We also have another client I can share, and I don't know what they're going to do with this, where we actually ran a geolift comparison of Google non-brand search against Meta, and we figured out that Google is not ROI positive. It's a big, hefty budget that they will need to switch somewhere else.
00:39:07
Speaker
How will they be able to switch it and still scale? Where could they put it instead? But if it's not incremental, as you said, by your new method, there is no use continuing on. So it will be super interesting to see where they place their budget instead, because I think it's hard for them to continue down this road. Sure. You know, it's funny, I can come up with hypotheses for why both of those results occurred. On the TikTok side, I'm assuming it's an app client: they were running app install campaigns, they were seeing poor results in SKAN, crazy CPAs, and then they ran an incrementality test and they're like, oh, okay.
00:39:56
Speaker
Wow, SKAN's measurement methodology is very limiting. I think that's something you hear about a lot, and it's been a struggle for some of the spend that I've managed, where the results on iOS look poor after iOS 14.5, but you have a conviction that they are driving things, and geolift validates that. On the paid search side, non-brand paid search is historically seen as very incremental:
00:40:29
Speaker
high intent, people are searching for the thing. But depending on the size of the brand, my feeling would be the larger the brand, the less incremental non-brand paid search is, because folks already know about you. If you are the biggest credit card in the world and someone searches for credit cards and clicks on your ad, they probably already know about you. But yeah, it's funny, there are these kind of common-sense things that these tests can uncover, and then there's the aha moment of, okay, yeah, right, that makes sense. But there is a belief system too, right? Because the performance marketer might
00:41:16
Speaker
believe it, but there needs to be belief from the CMO, or whoever's making those large budget decisions. Because what do you do when you run an incrementality test and you see non-brand paid search isn't incremental, and that's 30% of your budget? Now you have a forecast for the next year that assumed you were going to get X amount of conversions for that spend. How do you act

Aligning Teams and Future Media Mix Modeling

00:41:41
Speaker
on that? Yeah, for sure. We have a few examples where some of our clients run one test and think, this can't be true. They run another one and realize, okay, we need to believe this now, because all the evidence is pointing in that direction.
00:41:57
Speaker
Isn't that a thing as well with these tests, that you can unite on what the actual truth is? Because when you look at the data, one person might look into Meta's platform, another into Google Analytics, maybe Shopify, and so on. I think that's a problem in general, that not everybody is aligned. And finally, maybe after a test or two, you find, okay, this is probably the truth, we should probably believe in this. I think there also needs to be alignment in the team on what we consider to be the truth.
00:42:42
Speaker
For sure. As much as you could say that there is truth in data, human beings are still optimizing the media mix these days. It might not be like that in five years; maybe there are no marketers and it's just some sort of AI doing all this stuff. But for now, we have results, and then we have human beings interpreting those results. So as important as the test design and the structure and all that stuff is,
00:43:15
Speaker
it's even more important to make sure that the people making decisions on it understand it, have conviction in it, and make improvements. Just wrapping up here, talking about where we're headed in the future: Google just released their playbook on marketing measurement, and I think media mix modeling continues to get more popular. From your perspective at Twigeo, what are the product enhancements you're doing? How are you responding to the next three to five years in terms of innovation and media mix optimization?
00:44:00
Speaker
Sure. I mean, we're obviously strong believers in geolift testing and other experiments, which could also be channel first-party testing. But that is obviously not enough, so it's all about how we bring everything together, whether that's an MMM or some other type of modeling, and we strongly believe in post-purchase surveys as well; I think they're an underutilized method. A blend of them is going to be the way to merge all of those sources and get as close to the truth as possible. I've spent over six years trying to get to the truth, and it's always a moving target: you will never know exactly what the value of a channel is. The good part of this, looking back, I remember like 10 years ago when it came out that a view was one second, and I thought, what? Only a second? It was something we'd been lied to about for years. Now you actually have to pursue the truth, and that's the tricky part, because there's no single source of truth, which there kind of was before; you need to pursue something that is more accurate
00:45:19
Speaker
and put energy into it. What I hope we'll see from that is better bets from marketing teams based on what they leverage; these tests will help them future-proof their measurement. So I'm looking forward to it. And then, will there be some smart AI system or AI tools deployed in this? That's how things are developing. If you ask me what will happen in five years in that space, don't ask me, it's too hard to predict. Sure.
00:46:06
Speaker
Last question, and either one of y'all can answer it. This is the Efficient Spend Podcast, right? So if you can take a second to think about your own clients that you're working with at Twigeo: what's the most efficient spend that you've seen, and what's the most inefficient spend that you've seen? The most

Ad Spend Efficiency and Closing Remarks

00:46:31
Speaker
inefficient? It's a tie, I think, between Google brand and, weirdly enough, some types of mid-funnel campaigns I've seen on Meta being
00:46:42
Speaker
wildly useless. That was the biggest shock to me, I would say. The most efficient spend has probably been on TikTok, just because it's so different from what you think it is versus what it actually is. Cool. Well, Kalle, Eric, thank you so much for being on the podcast today. Where can folks learn more about you? You can go to twigeo.com, where you can learn more about us, our geolift services, and so on. If you want to connect with me on LinkedIn, Swedish name, Kalle Mobeck, there will be a link somewhere: K-A-L-L-E M-O-B-E-C-K. I think that's the best way to get in touch with me. Eric, do you want to plug anything? No, that's fine. Okay,
00:47:38
Speaker
connect with me and I will connect you with Eric. Cool. Thank you so much, guys. Thank you. Thank you.