
Google Ads Unleashed | Winning Strategies for E-Commerce Marketers
Welcome to "Google Ads Unleashed," the ultimate podcast for anyone who wants to harness the power of Google Ads to boost their online business. Whether you're an agency owner, E-Commerce marketer, or just someone who's interested in digital advertising, this show is for you.
In each episode, we'll dive deep into the world of Google Ads, exploring the latest strategies, techniques, and best practices for creating effective ad campaigns that deliver real results. Whether you're a seasoned pro or just getting started, you'll find plenty of valuable insights and actionable tips to take your advertising game to the next level.
We also bring in expert guests to share their insights and experiences, so you can learn from the best in the business. Our guests include successful E-Commerce entrepreneurs, marketing professionals, and Google Ads specialists who offer practical tips and advice.
With Google Ads constantly evolving, it can be hard to keep up with the latest trends and changes. That's why we're here to help. We break down complex topics into easy-to-understand language and provide actionable advice that you can implement right away.
Connect with Jeremy Young on LinkedIn for regular Google Ads updates, or email him at jeremy@younganddigital.marketing
The Truth About Incrementality in Google Ads – with Marcel Smal
What if your best-performing Google Ads campaigns aren’t actually adding any value?
In this episode of the Google Ads Unleashed Podcast, host Jeremy is joined by Marcel Smal of Roots Network, who breaks down the real meaning of incrementality in PPC—why it matters, how to test it, and what most advertisers get wrong. We explore brand search cannibalisation, PMax pitfalls, conversion lift studies, and the hidden costs of not knowing your true impact. If you want to separate performance from perception, this one's for you.
Find Marcel here: https://www.linkedin.com/in/marcel-smal/
Get your free 30 minute strategy session with Jeremy here: https://www.younganddigital.marketing/
Scale your store with 1:1 coaching: https://www.younganddigital.marketing/1-2-1-coaching
Hello and welcome back, guys, to Google Ads Unleashed. Hope everyone is doing fine this fabulous Monday. Another week, another guest episode. I love a good guest episode because, as my guest remarked about five minutes ago, not only does it make you guys smarter, it makes me smarter as well, and I think you always have to keep learning. On that note, I have the pleasure of introducing Marcel Smal. Marcel, we met a couple of weeks ago in Brighton. You're the co-founder of Roots Network, a Dutch-based PPC agency, and you caught my attention with an absolutely fabulous topic, which we're going to be talking about today, and that is incrementality, specifically in the PPC context. But I don't want to take too much away. I'll leave the word to you to tell me a little bit about yourself. How did you get into PPC? What's your story, Marcel?
Yeah, thanks for the introduction, Jeremy. I've been in paid search for over 12 years now. I started at the agency iProspect, part of Dentsu. I don't know if you know them; they're based in
Very big, they must have hundreds of employees now, surely?
Definitely. Globally, I think it's thousands. I worked there for nine years, and I was so lucky that I got to learn from some of the best in the field; that's where my passion for paid search started. I grew all the way into the leadership of the paid search team there, and then, three years ago, together with a friend who also worked there, I started Roots Network, which is an agency with a full focus on paid search. We love the difficult challenges: large advertisers, large budgets, complex problems where we can make a really big difference. And we do everything from full execution to purely strategic work for in-house teams. So yeah, that's us.
Amazing. When did you start that?
Yeah, it was just over three years ago that we started, and we're now a team of seven, mostly senior specialists. It's kind of cool to see: we're a small team, but we do make a big impact. We're able to run large advertisers because we also automate a lot. With a small team, we try to have as big an impact as possible.

That's amazing.
You know what, I sometimes feel a little bit like that. When you zoom out, you sometimes forget how much power you have, even with a small number of people, and I think that's one of the things that really keeps me going. Even when you work with big advertisers, or smaller ones, you make a real difference to their lives, right? Which is insane, yeah.
I find it really interesting. When I started 12 years ago, for some advertisers you needed a team of, let's say, 10 people to run them, you know, the big accounts. And nowadays, if you get automation right, you just need one person, or half an FTE. We've also taken over clients where they had a very big team, a lot of people working on it, but just not working in the smartest way. We took over, automated as much as possible, and then spent most of the brain work on the important projects, like incrementality testing. And I think that's really cool to see: how, with fewer people, you can make a bigger impact, basically.
Less is more, quite literally, yeah.

I think that's definitely true for our field, you know.
Absolutely. And you've just dropped the I-bomb in there already, which is on your website as well. Next to the things you mentioned (I had a bit of a peek at the website, obviously: Google Ads, measurement, training, etc.), there's the big I-bomb: incrementality. I'll be honest, this has been a topic that has interested me for a very long time, quite literally because our focus as marketers, in what we do, has so massively changed. If you think about 50 years ago, you wouldn't have called it incrementality, but de facto what you did was run a TV spot and figure out: does this work, does it correlate with sales? That's what marketing was. Then there was this absolute hyper-obsession with tracking and attribution. Do you remember when Kairos came out and stuff like that, and everyone was like, oh my God, we're doing tracking completely wrong? And now, in the last two years, I'd say it's almost gone full circle. We've gone back to: when we just run Google, or when we just run this, does it actually affect sales or not? So the more we automate and the more the manual tasks get taken away, the more important this kind of stuff appears to be getting. But yeah, before we get into all of that: what does it actually mean, Marcel?
Yeah, to come back to what you said, because I think it's super interesting. Back in the day you did marketing and you didn't really know; finding out the impact was difficult. You also have the saying: 50% of my ad spend is wasted, but I don't know which 50%. Then you had online marketing, and we were able to measure conversions, and we got super confident: okay, now we know the true impact. But actually we only thought we got closer to the truth; that's not the case. You measure the amount of sales after an interaction, but you don't measure the amount of sales if you would not have advertised. Because we look at an interface where we see what we spent, the clicks we got, and the sales we got, we start thinking: okay, this is the truth. I spent 10,000, I got 100,000 back, great results. But it's still not the truth, because you always have to ask: if I hadn't advertised, how much would I have gotten? That's the whole incrementality question, and that's what you asked in the end: what is incrementality? It all boils down to that. If we had not advertised, how much sales would we still have gotten? I think that's really the holy grail. If you just measure in Google Ads, you're not going to get that question answered.
You have to do incrementality testing, where you create, basically, a control group of people who weren't exposed to your ads, and you compare that with your total amount of sales; that difference is the true incremental impact of your efforts. And you can create that control group in all sorts of ways. You want to suppress your advertising, and that can be based on geos (you stop advertising in certain provinces, or even an entire country), or at address level, or based on user level. You create a control group, you suppress the ads, you compare that with where you had your ads running, and you calculate the difference: that is the true incremental impact of your efforts.
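To make the comparison Marcel describes concrete, here is a minimal Python sketch of turning a suppressed control group into an incremental lift figure. All numbers are invented for illustration; a real test would also need a significance check.

```python
def incremental_lift(test_sales, control_sales, test_size, control_size):
    """Sales in the exposed (test) group minus the sales it would have
    made anyway, estimated from the suppressed (control) group."""
    # Scale the control group's sales rate up to the test group's size,
    # so unequal group sizes don't bias the comparison
    baseline = (control_sales / control_size) * test_size
    incremental = test_sales - baseline
    # Incrementality rate: share of measured sales the ads actually caused
    return incremental, incremental / test_sales

# 100k users in each group; exposed users bought 10,000 times, holdout 8,000
inc, rate = incremental_lift(10_000, 8_000, 100_000, 100_000)
print(inc, rate)  # → 2000.0 0.2
```

The same arithmetic applies whether the split is user-level or geo-level; the control group just has to be a fair stand-in for what the exposed group would have done without ads.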
Yeah. So in the context of paid search, what could that specifically mean? Because, generally speaking with incrementality, I think there are four core areas that are really interesting for the average PPC guy. Maybe I'm biased, but these are the four things where I've always wondered: does it actually have an impact? The first one would be brand search. The second one, the advertising in general, because I've had plenty of, for instance, fashion clients (fashion is notoriously difficult anyway) where Google Ads basically just didn't drive any revenue; it was measuring conversions based on a touchpoint from PMax or something, but obviously not actually adding to the bottom line. The third is YouTube. And the fourth one, I just had it on the tip of my tongue... ah yes, additional budgets. Because, as most people know, with Google there's only so much limited search volume, so at some point you just end up spending a ridiculous amount more for conversions that are not really, as the word says, incremental. The CPA is 10x for three or four more sales; what's the point? But how does one test that, Marcel? Because a lot of advertisers and business owners are always scared of losing sales, of course, when they suppress advertising.
Yeah, indeed. So first of all, the core areas you mentioned, I think those are the most important. You have to accept that with advertising there's always going to be some form of cannibalisation happening. In paid search that's really clear: if you don't advertise, there are still going to be organic listings. And that's especially the case with branded search. If you don't advertise, sometimes there's also little competition, and organically you're ranked at the top, so obviously very low incrementality. But you have those effects on non-branded as well, and in certain verticals it's even worse; exactly what you said, fashion is really difficult. And then there's the second part you mentioned, where you've got your curve of impact: if you're going to spend more, you're never going to get the same efficiency for every extra unit of spend. It's not going to be linear; it's always going to be a curve that flattens. So if we're looking at the incrementality you want to calculate and test, okay, what if we hadn't advertised, then you want to set up a test where the ads are suppressed and compare that with where the ads aren't suppressed. And there are all different types of methods for doing that. First of all, you can do it in Google Ads itself, so you've got conversion lift studies, of course.
That was for YouTube, right? Where, is it still the case that you had to have something like 20 or 30 grand in hand for those types of tests?
Yeah, okay. So for YouTube you've got search lift studies, where you can measure, if a group was exposed to your YouTube ads, how many of them started searching. That is super valuable for YouTube. But for paid search campaigns, Performance Max, etc., you also have conversion lift studies, where Google suppresses the ads for some users versus a group that doesn't get suppressed. And that's actually a pretty solid incrementality test that Google offers. Up until now, you basically had to do this together with your Google rep, who would set it up in the back end so you could run a conversion lift study. But what I think is super great about the Google Marketing Live that we had is that Google now has a new method of doing conversion lift studies, which allows a much lower threshold for running these types of tests. We noticed with our clients that only the biggest were eligible for conversion lift studies; you would have to spend at least 50 grand a month to be eligible, and we had clients who would spend hundreds of thousands a month who still weren't eligible. With the new method, the threshold is lowered to just 5,000 a month of spend, which I think is absolutely great. It means smaller advertisers can run conversion lift studies much more easily in Google Ads, and if you're a bigger advertiser, it allows you to slice and dice your analysis much deeper: you can run an incrementality test and see the results at campaign level, audience level, device level, and you can run tests much more frequently because the threshold is much lower.
So one of the things I liked most about Google Marketing Live was the announcement that the incrementality test threshold got that much lower. I think every advertiser can now run incrementality tests. And, we still have to see, but I've asked some of my Google reps already, and apparently it's also going to be possible to set up those tests yourself in Google Ads, just like you can do now with brand lift or search lift studies for YouTube, for example. Which I think would be absolutely great: you could very easily set up those tests yourself and have a sort of always-on incrementality testing roadmap that you can keep going.
Amazing. So how would that look with, say, a PMax campaign? Take a geo study, for instance. Let's say you're targeting the United Kingdom, because I live here, and you would randomly select certain parts of the UK to exclude and not run ads in, versus the rest. But how does that actually work? Because in the end you would have to take, let's say, last month's data, or the previous 30 days, or however long your test runs, look at all the sales in those areas, then suppress the ads, and then look at what happened afterwards, right? How does Google know about that? How would that work? Honest question.
Yeah, so Google has two methods of doing conversion lift studies, so incrementality tests, in Google Ads itself. The first one is based on user level. That's really similar to how drafts and experiments are set up: users are randomly split at user level, 50/50 or whatever split you want, but let's say 50/50, into two groups, and then for one of those groups the ads get suppressed, while for the other group it's business as usual, and you compare the results. The other option is that Google does it at geo level: Google randomly creates two groups of regions, and in some locations the ads are excluded while in others it's business as usual, and the results get compared. What's pretty nice about this is that both have some advantages. If you do it at user level, you get much deeper insight: you can slice and dice your data at campaign level, audience level, device level, and so on.
Because based on location, it could skew the data in certain ways if it accidentally excludes London, you know? Londoners are completely different, or people from Amsterdam are probably completely different from someone who lives on the German border in the middle of fucking nowhere, basically.
Yes, and Google does do some pre-analysis where they check that the regions are really similar. From what I understood, the regions they suppress are smaller than, let's say, an entire city or an entire province; it's more like groups of postal codes, for example. But they do check whether there's internal consistency between those regions, so you don't get that sort of bias. With a user-level test, though, what you need are logged-in users, which creates a kind of bias in itself. So if you want deeper insights at campaign level, audience level, device level, use the user-level test. But if you want the most rigorous results, closest to the truth, then you have to use the geo test. The only thing with a geo test is that you get less deep insights. What's also an important difference: the user-level test can only be based on the Google Ads conversion pixel. So if you've got offline sales data, or, let's say, app conversions that you're measuring, that's not possible in that type of test. At geo level, you can use app data and offline conversion data. So, roughly: for the bigger impact questions, use Google's geo type of test; for deeper analysis, use the user-level one. And I think it's so nice that the thresholds got so much smaller, because now you can create that roadmap where you do, let's say, a geo lift study once a year to get a good view of the true impact, closest to the truth, and if you want deeper-level insights, you run multiple of those conversion lift studies based on users.
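The pre-analysis Marcel mentions, checking that test and control regions behaved similarly before the test starts, can be sketched very roughly like this. The daily sales figures are made up, and Google's actual checks are far more sophisticated; this is just the idea of a pre-period balance check.

```python
def preperiod_balance(test_daily_sales, control_daily_sales, tolerance=0.05):
    """Return True if mean daily sales of the two geo groups differ by
    less than `tolerance` (relative), a crude proxy for comparability."""
    mean_test = sum(test_daily_sales) / len(test_daily_sales)
    mean_control = sum(control_daily_sales) / len(control_daily_sales)
    rel_diff = abs(mean_test - mean_control) / mean_control
    return rel_diff < tolerance

test_geos = [980, 1010, 995, 1005, 990]      # daily sales, regions keeping ads
control_geos = [1000, 990, 1012, 998, 1001]  # daily sales, holdout regions
print(preperiod_balance(test_geos, control_geos))  # → True: groups comparable
```

If the groups fail a check like this before ads are suppressed, any post-test difference could just be a pre-existing difference between the regions rather than an advertising effect.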
That's so amazing. I think that's something that's just going to get so much more important, like I said, because you can really prove your value, or non-value as well, which is an amazing insight. A little example: we recently, unfortunately, lost a client (I actually published a LinkedIn post about this today; lots of reasons for what happened), but in the end we more or less proved to them that Meta ads had zero impact on the business, and we saved them about 150k a year through that, because that's how much they were spending. That was Meta ads, which we don't actually do; it was completely accidental. But the moral of the story is: I think you'll just be able to find out so much more easily what is really adding a dollar to the bottom line of your business. Which is great. Sorry, go on.
I think that's what it is, you know: you're just trying to get closer to the truth. You're not trying to confirm any performance, and you're not trying to dismiss any channel or any person's work. You just want to know the true impact. Whether the incrementality rate is 20% or 50% or 100%, we don't really care; we just want to know, because it has such a big impact on budget decisions. It's not something you have to take personally, like, oh, the incrementality rate isn't high enough. You just didn't have those insights before; now you do, you're going to be smarter in the future, and you move forward.
So you've shared a couple of things already, and obviously I'd seen your talk as well, which was really interesting; I'm sure people can find it online somewhere and have a look. But what was one example, maybe you can mention the client, maybe keep it anonymous, of a test that you and your team ran where you proved something insane, where there was an amazing outcome? Does an example or case study come to mind?
Yeah. Again, we don't run these tests really hoping the incrementality rate turns out very high; sometimes just knowing is the biggest win. And I think we had a really good example of that with a client we ran a test for: a retailer which only has offline stores. They do have a website with all of their products, but if you wanted to make a purchase, you would have to go to a physical store.
Like Primark, basically, for instance? Yeah, my wife would know better than me, I don't, yeah.
I think if you're from the Netherlands, it's like the store Action, which everyone in the Netherlands knows. But yeah, we did Google Ads advertising for them, and obviously we used Google store visits to measure the people going to the stores, which was great: before, we weren't able to measure anything, and now we had Google store visits, so we could do performance marketing. And we had a store visit rate of, let's say, 10%. It's not the actual number, but let's say it's 10%. But our client said: yeah, 10%, but we've got really loyal customers going to the stores every week; there are plenty of people who clicked on your ad who would have come anyway. So you're just measuring people who would have come anyway. And what they said made so much sense. If you sometimes think about how technically advanced it is of Google to be able to measure store visits, it's amazing; we had this really cool option of measuring store visits, and because of what they said, which made a lot of sense, we couldn't use those insights at all. If we measured 10,000 store visits, how many would have come anyway? So we had this really advanced type of measurement and we couldn't use it. So together with them we thought of an experimental design. We wanted to create a control group, basically, and the way we did it was with ghost ads; that's a way of testing incrementality. What we did for them is we created a fake website, basically, with some products on it, but you couldn't buy anything on the website; you basically couldn't do anything on it.
But we started advertising for that fake website. We advertised on the same keywords that they normally would, with ads pointing to that fake dummy website where people couldn't do anything, and we ran those campaigns in the same account as the advertiser. So for that fake dummy website, if people clicked on an ad, we could measure how many of them would still go to the store. We ran that test, and we found that if people clicked on a completely fake ad for a fake website, the store visit rate was still, let's say, 5%. Again not the actual number, but let's say we still saw a store visit rate of 5%. So we found out that the baseline of store visits that would happen anyway was 5%, and with the campaigns that we ran it was 10%, so we knew that of the store visits we measured, the incrementality was 50%. And it's not that we were hoping for 80% incrementality or 20% incrementality, but the fact that we knew the incrementality rate meant that all of a sudden we could use those store visit measurements again, and we could calculate performance targets and do much better performance management. Otherwise we were kind of blank, and now we could steer towards the truth again.
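The arithmetic behind that ghost-ads result is worth spelling out. Using the placeholder rates Marcel gives (10% measured store visit rate on the real ads, 5% on the fake ones), a quick sketch:

```python
measured_rate = 0.10   # store visit rate after clicking the real ads
baseline_rate = 0.05   # store visit rate after clicking the fake "ghost" ads
# Visits that happen even after a fake ad would have happened anyway,
# so only the difference above the baseline is caused by the advertising
incrementality = (measured_rate - baseline_rate) / measured_rate
print(f"{incrementality:.0%}")  # → 50%

# Applied to the measured store visits, this restores a usable KPI:
measured_visits = 10_000
truly_incremental_visits = measured_visits * incrementality
print(truly_incremental_visits)  # → 5000.0
```

The point is not the 50% itself but that a known incrementality rate lets you discount the raw store visit numbers into something you can set performance targets against.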
That is such an amazing test. Maybe just to recount, because that might have gone over someone's head if they're listening right now. If it had been the real, let's say, Action website (I've never heard of them, but say it's them), people could go to the website as loyal customers anyway: they know it's Action and would have gone to the store regardless. But if it's a completely fake website for a business that doesn't even exist, there's no previous loyalty to the business, so any store visits you still measure after those clicks can only be your baseline of people who would have come anyway. Exactly, yeah. That's amazing. Wow. How did you come up with that?
Yeah, we really needed it; it was essential. So we did some research into the different types of incrementality testing you could do. We were thinking of maybe doing regional tests, where you say, okay, we're going to suppress advertising in this region and then look at the total amount of sales there. But the thing is, their baseline of sales was so high that we couldn't really get those tests to become significant, so we didn't do that type of test. Then we saw that you can also do ghost ads tests. The only thing is, in Google Ads you must have a website to send the traffic to; you can't have a ghost ad for something completely unrelated without still sending it to a website. So we thought, okay, let's send it to a fake dummy website. We set up the test, not even sure whether our store visits would actually still be measured, and we said to the client: we're going to spend a few thousand euros on this, and we're not sure it's going to tell us everything in the end. But it did, and we got those insights. It costs you a bit of money; with every incrementality test you're suppressing ads, so there are going to be opportunity costs. But with those insights you're closer to the truth, and afterwards the millions of budget you're spending are going to be more effective. I think it's always worth it; every incrementality test is worth it.
I think that's one of the common barriers people get, from their boss, from their client, or maybe from themselves: internally, they don't want to do it because they're scared to lose sales and so on. But the bottom line is, there's one thing that doesn't show up in your P&L, and that's opportunity cost, or wasted budget. That's not something people usually measure. So yeah, which is interesting. Maybe two more questions before we land this plane. First of all, the good old buzzword everyone is talking about: AI. How do you think AI is going to influence all of this? Is it going to make it easier, more accessible? In what way, do you think?
Yeah, I think in two or three ways. First of all, AI is packed into all the new products Google is launching: Performance Max, AI Max, and so on. And in there you do see that Google tries to make things less transparent, basically. With Performance Max they want you to group branded and non-branded together, with search, shopping, display, and video all in it. That's an absolute disaster for measuring incrementality, because all of those have super different incrementality rates. So in that sense it really complicates things. I have to say, at Google Marketing Live there were some announcements that give more control back, which I think is great, but definitely make sure, if you're using Performance Max, to take as much control as possible. And that's not only for incrementality testing but for performance management in general: always exclude your own brand name, because branded should not be in Performance Max, and try to use new customer acquisition goals to make sure you're not only getting sales from existing customers, for example. So try to create a really clear distinction between branded and non-branded. In incrementality testing, one of the most important things is getting your segments right: segment branded, non-branded, and different campaign types. You shouldn't group them together, because otherwise you can't really act on the insights. So I think that's the first part: AI reduces transparency, and that makes incrementality testing more difficult.
On the other hand, the reason the threshold for conversion lift studies in Google has gotten this much lower is technological advancement; they're using a different, I think Bayesian, model, which now allows these much lower thresholds. And I think that's also a result of AI. So on the one hand it's more difficult, on the other hand it has become more accessible. It kind of goes both ways, I think.
Okay, maybe one last thing. If someone wants to take one thing away, what's one test they should run straight away? Besides, obviously, setting up the correct branded and non-branded segments, which thankfully we do anyway, and which I preach all the time. Otherwise, why am I here with a podcast about this topic? But what could someone do straight away?
Yeah, what you can do straight away is go online and look at general benchmarks of incrementality rates for your type of business. Check existing incrementality studies. You'll see that the general incrementality of branded, for example, is between 5% and 20%; for non-branded it's more like 60% to 80%, sometimes even higher than 100%. Then take those benchmarks and apply them to your existing results. Take your branded revenue, multiply it by an incrementality rate of, let's say, 10%, and ask: what do the numbers look like afterwards? All of a sudden, do you think you're wasting a lot of budget there? Should you be spending more on non-branded? Are there maybe even more opportunities? That's the first step: look at your data from a different angle, because just because it's on paper doesn't make it true. Multiply it by some benchmark numbers and see what the story looks like then. A very easy next step is to think: okay, obviously for our business it's going to be different from the benchmark numbers, so how can we set up a first solid test? Now, with the threshold that much lower in Google Ads, you can very easily set one up. Think of a good test design, make sure you choose the right KPIs, decide what your hypothesis is, which segments you want to test, and how you want to slice and dice it. Then pretty quickly you can set up an incrementality test and work with the results.
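The back-of-envelope exercise Marcel describes can be sketched in a few lines: discount platform-reported revenue by an assumed benchmark incrementality rate and recompute ROAS. The revenue and spend figures below are hypothetical; the incrementality rates are the rough benchmark ranges he quotes.

```python
# Sketch of the benchmark exercise: apply an assumed incrementality rate
# to platform-reported revenue and recompute ROAS. All revenue and spend
# numbers here are hypothetical.

def incremental_roas(reported_revenue, spend, incrementality_rate):
    """ROAS after discounting revenue by an assumed incrementality rate."""
    incremental_revenue = reported_revenue * incrementality_rate
    return incremental_revenue / spend

# Branded search often benchmarks at roughly 5-20% incrementality.
branded = incremental_roas(reported_revenue=50_000, spend=5_000,
                           incrementality_rate=0.10)

# Non-branded often benchmarks at roughly 60-80%.
non_branded = incremental_roas(reported_revenue=30_000, spend=10_000,
                               incrementality_rate=0.70)

print(f"branded: reported ROAS 10.0 -> incremental ROAS {branded:.1f}")
print(f"non-branded: reported ROAS 3.0 -> incremental ROAS {non_branded:.1f}")
```

With these illustrative numbers the picture inverts: branded looks like the star campaign on paper (ROAS 10) but delivers less incremental return than the "weaker" non-branded campaign once the benchmark rates are applied, which is exactly the change of angle Marcel is recommending.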
That's amazing. So, one last time: if someone wants to do this and wants your help, how do they get in touch with you, Marcel?
Yeah, they can reach me on LinkedIn: Marcel Smal, if you search for me there. If you message me, I always like to brainstorm about this stuff. For each advertiser it's going to be a bit different, what type of test you want to do, what the right test is, and what the next steps are going to be. And you can go to the Roots Network website and contact us there.
Amazing. This has been unbelievably insightful, with a lot of things to take away. And what's so beautiful about it, really, is that it's almost old school; it feels like that art has almost gotten lost over the last ten years. So I'm very happy you're making a revival of it here. Thank you very much for your time, Marcel.

I find it one of the most interesting topics as well. It's also kind of the holy grail. It is sometimes a bit difficult, but luckily it's now getting more accessible and easier. And I feel like this is the last step, in terms of maturity in your account; it's the type of stuff you get to at some point. If you get these insights and actually use them, I think it's awesome.
Amazing. I think on that note we'll end it. Marcel, thanks so much for your time.
Yeah, thanks for having me. See you, have a good one. Bye!