Overcoming Perceptual Barriers to Adopting New Methods
00:00:00
Speaker
I noticed that the main blocker to adopting this isn't math, not tools, not any other rational reason. It's more about perception, beliefs, and philosophy. Most decision makers don't understand how any of this works.
00:00:16
Speaker
So that will be my primary goal in my new venture, Incrementality Insider: helping decision makers make sense of it, simplifying it. And eventually, when they tackle this, they will win in business, because there's so much money being wasted.
Introducing Talget: A Leader in Measurement and Optimization
00:00:39
Speaker
Talget, welcome to the show. Hi. Thank you for having me. I'm excited to chat all things incrementality, measurement, and budget optimization with you today. You're a leading expert in the space.
00:00:52
Speaker
To kick things off, I wanted to contextualize your experience for folks. Could you just give a quick introduction into who you are and what your career journey has looked like from an incrementality and budget optimization perspective?
Talget's Unconventional Career Journey
00:01:07
Speaker
Sure. I think my background is unconventional, in that I didn't have prior knowledge or expertise, or built-up assumptions I needed to deal with.
00:01:17
Speaker
So I immigrated to the US back in 2015, and by a random stroke of luck landed at Google as a contractor in 2017, right in the heart of the area of experimentation, incrementality, and causal measurement.
00:01:34
Speaker
I started in a GTA role, but slowly uncovered a passion for it: how fascinating this topic is, and how important it is in decision making, not only for marketing budgets. Overall, I think the essence of any effective decision making comes back to causality, cause and effect. That's what we're actually trying to solve, right?
00:01:58
Speaker
With different means, different words, but essentially you need to find out what works for your objectives. And this is something that can actually be solved with a causal measurement approach.
Google's Measurement Innovations: Geo-Based and Time-Based Regression
00:02:08
Speaker
So my first real introduction was Google's geo experimentation framework, a new approach to measurement. A paper came out, not the first one, but an influential one,
00:02:22
Speaker
widely adopted, called GBR, geo-based regression. I would say it's complicated for the day-to-day marketer: it's a multi-linear regression, and it's complex to implement. It also has its own drawbacks; I'm not going to go into the details. But then the next paper came out, time-based regression, which is much simpler, using a Bayesian framework.
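A minimal sketch of the idea behind time-based regression: fit the relationship between test and control geos on the pretest period, then predict the test geo's counterfactual during the campaign. This is not Google's actual implementation (the real TBR method is Bayesian and yields credible intervals); it's a plain least-squares illustration, and all numbers are invented.

```python
def linear_fit(xs, ys):
    """Plain least-squares fit of ys = a + b * xs; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def tbr_lift(control_pre, test_pre, control_post, test_post):
    """Fit test ~ control on the pretest period, predict the test geo's
    counterfactual during the experiment, and sum observed minus predicted."""
    a, b = linear_fit(control_pre, test_pre)
    return sum(y - (a + b * x) for x, y in zip(control_post, test_post))

# Hypothetical daily sales: the test geo tracks 0.8x the control geo,
# plus 50 incremental units per day while the campaign runs.
control_pre, test_pre = [100, 110, 120, 130], [80, 88, 96, 104]
control_post, test_post = [100, 120], [130, 146]
lift = tbr_lift(control_pre, test_pre, control_post, test_post)  # → 100.0
```

The appeal over GBR is exactly this simplicity: one time series regressed on another, with the counterfactual doing the causal work.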
00:02:46
Speaker
And what helped me was that within Google, I was part of marketing science as a measurement lead, helping advertisers measure their investment in Google media channels effectively and scientifically.
00:03:01
Speaker
I spent several years on just that, and that's how it became my thing. Summarizing the experience: I ran probably hundreds of studies, in different shapes and forms,
00:03:16
Speaker
educated teams, and was in charge of Project Lift, a cross-functional initiative across internal Google teams, helping analytical leads learn and helping advertisers properly measure incrementality.
Talget's Experience at Amazon DSP and TikTok
00:03:33
Speaker
And skipping a few steps: after Google, I moved to Amazon DSP as a research scientist, even though I don't have the typical background. I didn't even have a U.S. education. Again, it was a series of luck, connections, and genuine passion that helped me get there.
00:03:52
Speaker
So I worked there. It was also an interesting experience, being exposed to the programmatic world. Very confusing, very bloated, I would say, and a very, very complex world: the open internet and programmatic. And after that, I moved to TikTok,
00:04:08
Speaker
where I was more on the PMM side, helping coordinate work internally between measurement teams and product teams. All in all, I came to the point last year where I felt I knew enough to do
Founding Incrementality Insider: Simplifying Incrementality
00:04:26
Speaker
it on my own. I don't need any corporate structure for it. And this is my thing all the way.
00:04:35
Speaker
I did it for nine years, and I'm planning to do it for the rest of my life, because so much can be done. There are so many gaps, so many areas that aren't being properly explained or understood, for different reasons.
00:04:52
Speaker
And I noticed that the main blocker to adopting this isn't math, not tools, not any other rational reason. It's more about perception, beliefs, and philosophy. Most decision makers don't understand how any of this works.
00:05:13
Speaker
So that will be my primary goal in my new venture, Incrementality Insider: helping decision makers make sense of it, simplifying it. And eventually, when they tackle this, they will win in business, because there's so much money being wasted.
Why Advertisers Struggle with Measurement Complexities
00:05:30
Speaker
So that's my intro. That's awesome. And I think I share a similar obsession with understanding what's driving results, what's driving performance, because when you run a multi-channel media mix at scale, you know that
00:05:50
Speaker
there is spend getting wasted basically at all times, and identifying it, finding the areas to scale, finding the efficient spend, is super rewarding and gratifying. Why do you think that so many advertisers and marketing leaders still struggle with this even today? I think there is one phenomenal, industry-wide problem. I've tried to internalize it, synthesize it, put structure around it, zoom out, and come up with some frameworks. What's happening nowadays is that there are different platforms, each a mini universe, and they have their own incentives and agendas.
00:06:38
Speaker
And the whole world of advertisers is confused, because there's so much misleading information, and most of them are caught in it. What happened is that back in the days of the first personal computers and then smartphones, it was a relatively simple world where attribution was solved. Attribution was used as the representation of the truth.
00:07:06
Speaker
So most attribution dashboards were perceived as true performance. And then over time, things got really complex. There are so many devices, so many walled gardens.
00:07:20
Speaker
And as a user, you come in and out, and nobody, absolutely nobody, not even Google, has full visibility of users across the whole internet. It creates this huge siloed world of mini universes. It's like a
00:07:36
Speaker
universe of universes, and you come in and out. Back in the day, when everything was simple, the promise of one user, one device, identification,
The Misconception of Attribution as True Performance
00:07:46
Speaker
et cetera, was easy to keep. And then the cost of measurement drastically increased, because measurement is expensive. Meanwhile there was the rise of performance marketing, which kind of pushed back brand marketing, but that's a different story. There are so many problems, but long story short: the experts, the marketers working in this field over the years, didn't adapt to the new world.
00:08:14
Speaker
At scale, there is some adapting, of course, but the majority still thinks attribution is the representation of reality. What ends up happening is that there are so many mini universes, so many books of record, databases, and the problem is reconciliation, trying to make sense of it all. And then the iOS privacy release, and the overall trend toward user privacy across the board,
00:08:44
Speaker
led to the situation where, okay, this is not reliable, this is not working. But not every company has moved to the conceptual understanding that this is not a representation of reality.
00:08:55
Speaker
That's why the holistic approaches rose: MMM, and geo-experimentation, which also takes a more holistic approach, using top-line KPIs rather than conversions filtered through someone's attribution logic.
00:09:10
Speaker
That's why there is a huge, huge misconception, a misunderstanding. And the main problem: there is no investment behind addressing this issue at scale.
The Need for a New Measurement Doctrine
00:09:21
Speaker
But yeah, there is a rise of in-house measurement. And I see a renaissance of MMM: faster, cheaper. But that's the tooling side. That's great.
00:09:31
Speaker
But without addressing the overall conceptual understanding... I know a lot of companies, and I have friends and colleagues, a huge network in the area. They have a hard time adapting, because fundamentally this needs to be a new doctrine. New thinking should come in and explain: look, this is a different world, and whatever you did for the last decade, whatever built-up knowledge and filters you have,
00:10:01
Speaker
it's not relevant anymore. And it's really hard for people in the industry to admit, because it's become part of their beliefs and part of their identity. They can't easily move away from that and say: oh, all those years I was doing it wrong. So that's the biggest problem. And decision makers, who will be my primary focus, are confused even further, because there are so many conflicting truths.
Challenges in Outsourcing Judgment in Measurement
00:10:28
Speaker
There are so many sources of truth, and they become so confused. They have different day-to-day objectives competing for brain space: they're dealing with funding, burn rate, hiring, and then this comes in and they don't have the capacity to properly understand it. And they make a big mistake: they outsource the judgment.
00:10:51
Speaker
I like to give this example. All these high-stakes conversations are happening, with a measurement vendor, or over test results, and the decision maker comes to the table and says, okay, what's going on? Someone's presenting. And another problem is that our colleagues from the field overcomplicate things. They come into this high-stakes decision making and start saying: confidence intervals, p-values, this and that. There's so much unnecessary complexity in the wording, and decision makers don't get it. So they try to outsource the judgment. They call in their data scientists:
00:11:31
Speaker
can you explain this to me? What's going on? I trust you, you're on our team. Can you tell me: yes or no, good or bad, how bad is it? And those folks are usually focused on different things, and when this is thrown at them, it's a second priority. They don't have the expertise, so they do something quick to appease the stakeholders, and the quality isn't there. So, long story short, lots of problems:
00:12:01
Speaker
a huge vacuum of expertise, and a big misunderstanding of how much the world has changed. Attribution is not equal to true performance.
00:12:12
Speaker
Right. And of course, as a founder or CEO, you are solving multiple problems. This might be on the lower end of your list. And so the approach, of course, is to outsource it in a lot of ways.
Media Effectiveness vs. Event-Based Optimization: A Dichotomy
00:12:28
Speaker
Similarly with AI, a lot of CEOs have extreme FOMO. And so they say, I want to make sure I'm doing everything right. I don't want to be missing out on this big trend.
00:12:42
Speaker
I want to get into your approach into measurement and incrementality testing, but I believe that there's still a delta between measuring media effectiveness and bidding and optimizing media.
00:12:58
Speaker
Because if there was no concept of bidding and optimizing towards a specific event, it would be a lot easier for us to just say, let's have a media mix model.
00:13:12
Speaker
Let's do incrementality testing. That's going to be our approach for optimizing our mix. The reason why we still have last-click attribution, and we still have deterministic and probabilistic attribution, is because we need an event to send to the ad platforms to optimize towards.
00:13:35
Speaker
And so you have all of these different forms of marketing measurement at a high level. Incrementality tests are a way to experiment, but you still need data to flow into the ad networks for their platforms to know what to go after. So how do you make sense of, or how do you think about, that dichotomy?
00:13:59
Speaker
Really good question. I think it implies that when you share the conversion signal, platforms optimize for incremental users, which is not always the truth. Because my understanding, at a
The Attribution Challenge: Over and Under-Attribution Issues
00:14:15
Speaker
high level, is this: each platform is either over-attributing or under-attributing.
00:14:20
Speaker
And depending on the platform, the scale is different. Ultimately, when you combine all the claimed conversions from attribution reporting, the total should match your actual sales, your CRM, your actual cash flow. But you know for sure that's not happening.
00:14:40
Speaker
There is a lot of double counting, inflated metrics, et cetera. So you cannot treat all platforms equally in terms of attribution aggressiveness, of claiming the credit.
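To make the reconciliation point concrete, here's a toy check, with hypothetical platform names and numbers, comparing the sum of platform-claimed conversions against the CRM's ground truth:

```python
def attribution_inflation(platform_claims, crm_conversions):
    """Ratio of total platform-claimed conversions to actual conversions.
    A ratio above 1.0 means the platforms collectively claim credit for
    more conversions than really happened (double counting)."""
    return sum(platform_claims.values()) / crm_conversions

# Hypothetical claimed conversions per platform vs. 1000 real sales in the CRM.
claims = {"search": 700, "social": 500, "programmatic": 300}
ratio = attribution_inflation(claims, crm_conversions=1000)  # → 1.5
```

A ratio of 1.5 here means the dashboards collectively claim 50% more conversions than the business actually recorded, which is exactly the over-attribution gap being described.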
00:14:56
Speaker
Some of them, when you feed them more data, conveniently allocate themselves close to the final checkout decision. The objective is not to change the behavior of the user, but to conveniently piggyback when the user is already heading to convert. And that's the problem for causality, because a big deviation has happened:
00:15:23
Speaker
most media is meant to change user behavior, to cause the desired action; attribution is solving for something else, because attribution is a different concept. They overlapped back in, let's say, the 90s and early 2000s, when attribution stood in for causality and was reliable enough. But that's not the case anymore.
00:15:50
Speaker
So I'm not sure all platforms perform better if you upload your conversion data. I think this should be carefully evaluated case by case, because in some cases it hurts your business. Again, this is a very complicated topic, and I can't give a universal answer.
00:16:12
Speaker
But I've seen cases where a business decided not to share any conversion data at all. They just optimized for reach and frequency, solving only for reach. And then they saw, through incrementality studies, that just doing broader reach gave them more incremental users. So yeah, again: case by case.
00:16:35
Speaker
Of course. Yeah, it's something that I think about and play around with a lot, because when you are bidding towards a specific conversion event, the ad platform will show impressions to those that are converting. But if you have strong natural conversion behavior,
00:16:57
Speaker
those ads are being served to people that would convert anyway. And if you optimize away from "I need to get as many purchases as possible" to something else, that's kind of a proxy for showing impressions to people that might not be in your about-to-convert pool, which is actually an incremental conversion. And I'm exploring this all the time.
00:17:28
Speaker
I can share quickly just to close this loop.
Impact of Meta's Algorithm Change on Advertising
00:17:32
Speaker
I can share a really interesting case where I uncovered an interesting phenomenon. I think what happened in Meta in February was a huge algorithm change.
00:17:45
Speaker
I dug deeper into this case, and I found two concepts which are actually hurting some advertisers. Lots of advertisers are getting hurt by this new update. And I think any decision maker who is looking for an agency needs to clearly separate the expertise and case studies into before and after February 2025.
00:18:13
Speaker
This new update changed the game completely. What happened, and maybe you've noticed it yourself as a user: when you see an ad on Instagram and then engage, watch, or even click,
00:18:27
Speaker
it triggers a whole chain of events. It's called ads clustering. Now you see a lot of ads in the same category, a lot of them. Before, you hadn't shown an interest or intent signal; now you're getting bombarded with every type of product and service. You showed a little interest,
00:18:45
Speaker
that's it. You're being bombarded, and you get exposed to a lot of competitors in the same category which you didn't know existed before. And the way it's fueling this really advanced algorithm, you know, the whole game changed completely.
00:19:01
Speaker
There was the social graph, the knowledge graph, the interest graph. In organic content, when you engage with certain topics, a very fancy machine learning algorithm starts to match this behavior to certain advertisers. And it all feeds this loop.
00:19:18
Speaker
And what happens eventually, there's this concept I call scouting, and then piggybacking. The advertisers in a category who spend the most end up educating the whole user base, and the others, who are piggybacking on them, get free reach and exposure.
00:19:41
Speaker
So what's happening? I saw a huge case where the advertiser that spends the most educates the whole category for a huge pool of users, because the algorithm very aggressively finds the users interested in the topic. When your ads are generic and introduce the whole category (as a user, you didn't know this offering, this product or service, existed), now you've shown interest. The scouts, those who found you, paid the most, but the whole category is riding behind them.
00:20:12
Speaker
And what ends up happening: the top spender wastes most of the money, but the whole category of smaller players is rising. Their awareness is rising, their sales are rising, because one player is footing the bill and overspending. But that's a different topic; I'll share more in my resource.
00:20:34
Speaker
Talget, what is your framework or philosophy for uncovering and improving wasted and inefficient ad spend?
Frameworks for Identifying Wasted Ad Spend
00:20:47
Speaker
I developed several of my own frameworks, and I was really busy all this year implementing them, but now I have time to put them into a structure of concepts and playbooks, which will be in my resource.
00:21:00
Speaker
And by the way, I decided not to do a newsletter. There are so many newsletters; the world doesn't need another one. I've signed up for so many of them myself, and I don't even open some of them. The world is very noisy; there's so much to read. So I decided to make a resource which is not a newsletter. I call it the incrementality intelligence desk.
00:21:27
Speaker
It will be a resource for decision makers and marketers to come to, with clear, simple playbooks and guides to implement and follow step by step.
00:21:37
Speaker
All action, no filler. Because I understand that as a solopreneur, for example, you're trying to figure out your value, and of course a newsletter is good for that, but for me it's all clear. I don't need to
00:21:52
Speaker
figure out my value. I know what I'm good at, and I need to just focus on pure delivery. And the most important part is to change the mindset. That's why I'm addressing this and putting a lot of effort into educating decision makers that it starts with beliefs, and it starts with culture. Because no matter what you do with the tools, if the incentives aren't there... Look, you're probably aware that different stakeholders may have different true intentions and true objectives. They don't share them publicly, but internally, you know, it comes down to politics.
00:22:28
Speaker
It comes down to agencies who get a cut of the media budgets they're in charge of. They're not interested in efficiency; they're interested in continually increasing the budget, because they get a cut.
00:22:40
Speaker
So that's a misalignment of incentives. That's the key. And I think decision makers are not aware of how much money is being wasted and how it can be prevented. And there are ways to do it.
00:22:53
Speaker
I'm going to share a seven-day sprint: certain steps to stop the waste. Like, go into the platform and turn off these stupid things which are not supposed to be there. There are so many
00:23:06
Speaker
buttons and knobs, and every other one is so confusing. And then beliefs, perception. This is where I'm introducing something I was able to put into my talks: what's the path to getting better at measurement?
The Measurement Elevation Atlas: From Sleepwalkers to OmniSight
00:23:21
Speaker
Everybody says we need to get better at measurement, but what does that mean?
00:23:25
Speaker
And I see, from my experience on both sides, there is a path, and there are levels to it. I call it the measurement elevation atlas. Level one, I call them sleepwalkers.
00:23:40
Speaker
It's not their fault, for the reasons I described before: the built-up misconceptions and the huge gap in expertise. I see that 70 to 80% of advertisers are in that stage of sleepwalking. They're not aware they're wasting money, because they're accepting the different truths of the dashboards.
00:24:01
Speaker
The next level, level two, is awakening. They start questioning: okay, how do we know this is true? We need to test this. We need to get to actual hard evidence, and we need to have the courage to admit what we don't understand and what actually works.
00:24:22
Speaker
That's the biggest struggle, but at the same time the biggest efficiency gains happen there. The next level is enlightened. These folks have already kind of figured it out;
00:24:33
Speaker
now they're addressing the waste. And then you clean up your system, which is most likely bloated: between 30 and 40% of your budget is wasted.
00:24:45
Speaker
And once you clean up the system, you have a better signal. You reduce the noise, and you have better sensitivity to the signal.
00:24:56
Speaker
And then you will find the nonlinear growth opportunities in channels. As you know, growth is a combination of factors: different proportions of different channels can give you the best possible ROI. But when you have a lot of bloat in it, that's hard. Even MMM practitioners know that all those model evaluations, MAPE, R-squared and so on, will be much, much better once you clean your system of non-incremental spend. They will show: okay, now you have better predictions, higher accuracy. That's the enlightened stage. And then I see good trends.
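The accuracy metric mentioned here, MAPE (mean absolute percentage error), is simple to compute; a lower MAPE after removing non-incremental spend is the kind of improvement being described. The sales and fitted values below are invented for illustration:

```python
def mape(actual, predicted):
    """Mean absolute percentage error of a model's predictions, in percent.
    Lower is better; assumes no actual value is zero."""
    errors = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 100 * sum(errors) / len(errors)

# Hypothetical weekly sales vs. an MMM's fitted values.
actual = [100, 200, 400]
fitted = [110, 190, 400]
score = mape(actual, fitted)  # → 5.0 (%)
```

Comparing this score for the same model before and after pruning wasted spend is one concrete way to see the "better sensitivity to signal" effect.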
00:25:34
Speaker
Companies move through the stages, at the organizational level and the personal level. And the next one, I call them OmniSight. These are the A players, the killers. They've figured this out. You cannot bullshit them with some
00:25:50
Speaker
narrative propaganda from other stakeholders or other parties. They know what works. They aggressively test everything, and test-and-learn has become their default
00:26:07
Speaker
setting, at the organizational level and at the personal level. Those experts exist, and I see their numbers growing. They come into a company, and whatever they touch, they're super efficient. They cut the waste fast and then help the company grow. This is actually one of the biggest growth unlockers, a competitive edge nowadays:
00:26:32
Speaker
having someone like this on your side, figuring this out, and internally educating and changing the culture. But to close my monologue, on the organizational side: companies will figure this out.
00:26:45
Speaker
And you're probably aware that the big players have figured this out, to the point where they say: advertising is such a lucrative business, this works so well, why don't we do it ourselves?
00:26:56
Speaker
And they become advertising platforms themselves. You see the rise of retail media networks, and Uber, Netflix, Booking, Expedia, so many of them now doing their own ad networks, because it's such a lucrative business. The value is there precisely because so much of the business community is confused about how to measure it. So I'll share more in my resource, but this is how I see it. And closing on frameworks: depending on the case, I'll give the typical answer, it depends.
00:27:33
Speaker
As a practitioner, I try to do less of it, but sometimes I will do it as a premium offering. Mostly it depends on appetite: how fast you want to move, how aggressive you want to be, because it requires courage. It requires moving fast and
00:27:50
Speaker
stopping everything you're doing and addressing this issue. I've already noticed that not all organizations are ready for it; it's too fast, too aggressive an update to their beliefs. That's why you may need to start slower. Depending on appetite, you can start: okay, let's do MMM. Okay, we see here and there that MMM has multicollinearity and other issues; let's calibrate with experiments. I'd say the most popular approach nowadays is to start with MMM, calibrate it with incrementality, and get better at it. But there is a shortcut, the fastest way: just run multi-cell experiments aggressively, and within a short period of time, within two months or so, you tackle this really fast. But not all companies are ready for that; that's why it can stretch to two years. So it depends.
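One common mechanical form of "MMM calibrated with incrementality" is to scale the model's channel estimates by a factor derived from the channels where a lift test exists. A toy sketch, with invented channel names and numbers, not any specific vendor's method:

```python
def calibrate(mmm_lift, experiment_lift):
    """Scale MMM per-channel lift estimates by the average ratio of
    experimentally measured lift to MMM-estimated lift, computed over
    the channels where an incrementality test was actually run."""
    factors = [experiment_lift[ch] / mmm_lift[ch] for ch in experiment_lift]
    f = sum(factors) / len(factors)
    return {ch: est * f for ch, est in mmm_lift.items()}

mmm = {"search": 2.0, "social": 1.0}   # MMM says search drives 2.0x ROI
tests = {"search": 1.0}                # but a geo test measured only 1.0x
adjusted = calibrate(mmm, tests)       # → {"search": 1.0, "social": 0.5}
```

Real implementations (for example, using experiment results as Bayesian priors) are more careful than a single global factor, but the direction is the same: the experiment is the ground truth, and the model is pulled toward it.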
00:28:43
Speaker
Yeah, I mean, when you talk about organizational culture change, you're talking about behavioral change. And that takes time.
00:28:54
Speaker
And based on the level of the organization, the size of the organization, the stage, it can take a very long time. We have about five minutes left. I want to move to a rapid-fire round with a couple of quick questions on what your hot takes are.
AI in Incrementality: Efficiency vs. Judgment
00:29:12
Speaker
I'd love to hear about it.
00:29:14
Speaker
My first question: everyone's talking about AI right now, and will continue to talk about AI. Where do AI and incrementality collide?
00:29:24
Speaker
And how should we be thinking about the relationship between those two things? Very interesting, complex question, because in my personal opinion, before tackling AI,
00:29:36
Speaker
this should come first: understand efficiency, and where you're actually wasting money, before jumping into the FOMO of AI. What AI can practically help with is education, and automating and streamlining the manual tasks related to testing and consulting. A lot of operational efficiency can be accelerated with AI. But the critical part: never, ever in marketing outsource judgment to the AI.
00:30:06
Speaker
Because AI is trained on past knowledge, and past knowledge, as we all know, regarding measurement, is not relevant anymore.
00:30:20
Speaker
The training set is filled with incorrect, non-reality information. If you want to apply it to your business, AI is not going to solve this. And there are so many issues where causality and incrementality are not properly addressed in any AI.
00:30:34
Speaker
I've personally tested all those LLMs with hard questions, and they come up with something that, of course, was relevant ten years ago, five years ago, but not now. So long story short: where they collide is operational efficiency, education, building training programs for the team, changing the culture, and some interactive model testing. I'm playing around with this myself for the educational part. But never, ever will I recommend trusting AI for high-stakes decision making in measurement, because it's completely wrong on this.
The Importance of Hypothesis in Experiments
00:31:10
Speaker
What's a common experiment design mistake that you see even advanced teams making? Doing something for the sake of doing it, without a proper hypothesis. Everything should start with a simple hypothesis tied to a business objective. And the biggest unlock, when I was doing workshops (by the way, I'll also be offering those):
00:31:29
Speaker
before tooling, before all the design, first you need to get clear on the business objective, and then translate it into a measurable hypothesis. This is the biggest step, the moment of: okay, is it measurable?
00:31:43
Speaker
Is it testable? Is it feasible? Before jumping right into testing, you need to build this connection. And actually, those professionals, the OmniSight or Enlightened ones,
00:31:54
Speaker
develop this intuition for feasibility even before testing. Even before tests, they can play around with observational data and see: is it possible to test this or not? Playing around with the signal-to-noise ratio.
00:32:07
Speaker
So that's sort of the approach from The Art of War, Sun Tzu, I don't know if you've heard of it: winning the battle before the battle. You need to be
00:32:20
Speaker
more certain than just running blind and doing something. So the key, the biggest mistake I see, is failing to translate business objectives into a testable, measurable hypothesis that you have data to support. Because I see some really stupid mistakes, where
00:32:38
Speaker
they jump into testing without this feel, and then they try to boil the ocean: small sample sizes, trying to measure several different hypotheses which aren't even possible to test. There are so many of these mistakes, but this is the key one.
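The pre-test feasibility intuition described here, checking signal against noise before committing to a test, can be roughly approximated with a standard minimum-detectable-effect calculation. The 1.96 and 0.84 constants are the usual 5%-significance / 80%-power values; this deliberately ignores experiment-specific structure and is only a back-of-the-envelope sketch:

```python
import math

def minimum_detectable_lift(kpi_std, n_periods, z_alpha=1.96, z_beta=0.84):
    """Smallest per-period lift a test of n_periods can reliably detect,
    given the KPI's period-to-period noise. If the lift you hope to see
    is below this, the hypothesis is not measurable as designed."""
    return (z_alpha + z_beta) * kpi_std / math.sqrt(n_periods)

# Hypothetical: daily sales fluctuate with a standard deviation of 100 units.
# A 25-day test can only detect lifts of roughly 56 units/day or more.
mde = minimum_detectable_lift(kpi_std=100, n_periods=25)  # → 56.0
```

Running this kind of arithmetic on observational data before designing the test is exactly the "is it measurable?" gate: if the expected effect sits below the detectable floor, the test will be noise no matter how well it's executed.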
00:32:51
Speaker
Amazing. Last question. What's one mental model or book that has shaped how you make
Influence of Dr. Benjamin Hardy on Talget's Approach
00:32:59
Speaker
decisions? Both, actually. I really love the work of Dr. Benjamin Hardy,
00:33:05
Speaker
his series of books, like 10x Is Easier Than 2x. And the most recent one impacted me so much that I'm going to implement it in my business and life. It's called The Science of Scaling, and the key concept is that you can change all of your decision making today if you filter it through the future you desire, your desired future.
00:33:35
Speaker
And then everything becomes so clear: you don't need to do 90% of the stuff you're doing if you filter it right against the big goal you have. Intuitively, I kind of already knew this. When I left big tech, I set a very delusional goal. People told me it's delusional, but I know it's possible.
00:33:54
Speaker
The bloat and waste is so big. I want to help businesses optimize $1 billion. And it's possible. I know it's possible.
00:34:05
Speaker
And it's about creating value; the rest is kind of a byproduct. Of course, I'll get some small cut out of it when I make it happen. But it's putting the value first, helping drive those results. Now I filter all my decisions through that one simple goal of optimizing $1 billion. And I've already made some progress, around $20 million.
00:34:26
Speaker
But I need to change direction, because it's too slow. That's why I'm now addressing and targeting the core issue: decision-maker perception and beliefs. This is where I believe I'm going to change the slope of this traction, hockey-stick it, move faster. So, scaling.com: I definitely recommend it to anyone, because it may change your life in a way you cannot imagine.
00:34:52
Speaker
Love it. Talget, thank you so much for coming on the show. Thank you. Thank you for having me.