
Google Ads Unleashed | Winning Strategies for E-Commerce Marketers
Welcome to "Google Ads Unleashed," the ultimate podcast for anyone who wants to harness the power of Google Ads to boost their online business. Whether you're an agency owner, E-Commerce marketer, or just someone who's interested in digital advertising, this show is for you.
In each episode, we'll dive deep into the world of Google Ads, exploring the latest strategies, techniques, and best practices for creating effective ad campaigns that deliver real results. Whether you're a seasoned pro or just getting started, you'll find plenty of valuable insights and actionable tips to take your advertising game to the next level.
We also bring in expert guests to share their insights and experiences, so you can learn from the best in the business. Our guests include successful E-Commerce entrepreneurs, marketing professionals, and Google Ads specialists who offer practical tips and advice.
With Google Ads constantly evolving, it can be hard to keep up with the latest trends and changes. That's why we're here to help. We break down complex topics into easy-to-understand language and provide actionable advice that you can implement right away.
Connect with Jeremy Young on LinkedIn for regular Google Ads updates, or email him at jeremy@younganddigital.marketing
The Simple Trick to Find Winning Search Term Patterns
Struggling to spot wasted ad spend in your search terms?
In this episode, host Jeremy shares a powerful but underused strategy: how to run an N-gram analysis to uncover hidden patterns in your Google Ads data. You’ll learn how to identify one- and two-word phrases that drive up your CPA, how to automate the analysis using ChatGPT, and why this method is perfect for summer account cleanup.
Want to finally know if words like “Amazon” or your competitor’s brand are killing performance? This episode is your blueprint.
Get your free 30-minute strategy session with Jeremy here: https://www.younganddigital.marketing/
Scale your store with 1:1 coaching: https://www.younganddigital.marketing/1-2-1-coaching
Hello and welcome back to Google Ads Unleashed, guys. I hope everyone is doing fabulously this Monday morning and is getting through summer unscathed. For most businesses, of course, summer is a time when things are maybe not as busy, and that's okay, because it gives us the time to do some really fancy stuff and some house cleaning in our ad accounts. Today's episode is exactly about that: one practical tip I really want to share with you, because it's been an absolute game changer for finding massive inefficiencies in ad accounts. The keyword here is an n-gram analysis.

What on earth is an n-gram? Basically, an n-gram analysis is a language analysis technique: you look for patterns within your search terms that produce anomalies in the ad account. As Google Ads advertisers, we typically have a problem when we look at Shopping and Search. We do negative keyword research on a weekly basis, we have scripts for it, we use all sorts of tools, but you usually have thousands of search terms. Especially with Shopping campaigns, you get thousands upon thousands of search terms every single week. It is absolutely impossible to go through all of them, so you need tools to figure out how to effectively negate certain search terms and get the optimal result. The reality is we don't have the time to do that manually, nor do we have the mental capacity to actually spot patterns in our search terms. A very common example I get asked about a lot: should I negate my competitors' names? Should I negate Amazon? Should I negate W-words such as what, why, whatever, and so on? The answer is: I don't know, it depends. Sometimes these kinds of search terms convert, sometimes they don't. I'd rather follow the evidence, and the evidence is an n-gram analysis.

So let's explain first what this is for and what an n-gram is. Like I said, an n-gram analysis looks at n-grams of words: combinations of words broken down by the number of words in each combination. It sounds really complicated, but I'm going to give you an example. You can do this with anything: your search terms, books, whatever. You break the text down into so-called unigrams, which are single words, bigrams, which are two-word combinations, trigrams, and so on (I'm not even sure what the four-word ones are called, because there's usually no point going beyond a trigram, bigram, or unigram analysis).

Here's an example with a sentence: "Hi there everyone." That's the sentence, three words. If I did an n-gram analysis on this sentence, I would have three unigrams, because it contains three unique words: "hi", "there", and "everyone". If I looked for bigrams, I'd find two: "hi there" is one combination and "there everyone" is the other. In theory I'd have a third bigram as well, "hi everyone", if I counted words occurring in any combination in the sentence. Usually, though, you want them in consecutive order, so traditionally you'd say there are only two bigrams: again, the full sentence is "hi there everyone", and the two bigrams are "hi there" and "there everyone". If I did a trigram analysis, I would have just one trigram, "hi there everyone", which is the full sentence.
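For readers who prefer to see that in code, here is a minimal sketch (illustrative only, not something shown in the episode) of how you could pull the unigrams, bigrams, and trigrams out of a search term in Python; the function name and example sentence are just placeholders.

```python
def ngrams(text, n):
    """Return the consecutive n-word combinations found in a piece of text."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

term = "Hi there everyone"
print(ngrams(term, 1))  # ['hi', 'there', 'everyone']    -> three unigrams
print(ngrams(term, 2))  # ['hi there', 'there everyone'] -> two consecutive bigrams
print(ngrams(term, 3))  # ['hi there everyone']          -> one trigram
```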
So the idea is that you look at a big sample of words, and your search terms are a big sample of words. Especially if you go all time, you will probably see tens of thousands, if not hundreds of thousands, of search terms. You can do an n-gram analysis on all of those search terms, see how often each n-gram occurs, and run a unigram and bigram analysis on them. The cool thing is that, over time, you uncover whether there are any patterns related to your CPAs or your ROAS for those search terms. For example, if I wanted to know whether the word "Amazon", regardless of where it sits in the search term, causes a high CPA in my ad account, I could do a unigram analysis of all of my search terms, look for the word "Amazon", and see whether, on average, search terms containing that unigram produce a higher CPA.

I appreciate this is very abstract, but what you are basically doing is uncovering those kinds of trends, and the way you go about it is pretty straightforward. I would simply download all of my search terms for the timeframe you want to analyze and ask ChatGPT, because here AI is your friend and will help you figure out whether there are any n-grams like this. The prompt you want to use is something like: "Attached are the search terms for my ad account. Please perform an n-gram analysis, highlighting any one-word or two-word n-grams that show a significantly higher CPA than the account average of X, or that show no conversions at all after having spent at least Y over all time." You define what the account average X and the spend threshold Y are, of course. ChatGPT will then produce this for your given set of search terms, and it's mega cool. If you then download the results, you can very quickly see whether there are certain wasted n-grams within your ad account.
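For anyone who would rather crunch the numbers locally instead of, or alongside, ChatGPT, here is a hedged sketch of the same aggregation in Python with pandas. The file name, the column names ("Search term", "Cost", "Conversions"), and the spend threshold are assumptions about a typical search terms export, not details from the episode, so adjust them to match your own download.

```python
import pandas as pd
from collections import defaultdict

def ngrams(text, n):
    # Consecutive n-word combinations in a search term (a set avoids double-counting repeats).
    words = str(text).lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

# Assumed export: one row per search term with numeric cost and conversion columns.
df = pd.read_csv("search_terms.csv")

# Accumulate spend and conversions for every unigram and bigram.
stats = defaultdict(lambda: {"cost": 0.0, "conversions": 0.0})
for _, row in df.iterrows():
    for n in (1, 2):
        for gram in ngrams(row["Search term"], n):
            stats[gram]["cost"] += row["Cost"]
            stats[gram]["conversions"] += row["Conversions"]

report = pd.DataFrame.from_dict(stats, orient="index")
# CPA per n-gram; n-grams with spend but zero conversions come out as inf and sort to the top.
report["cpa"] = report["cost"] / report["conversions"]

# Example filter: n-grams that have spent at least 100 (your currency), worst CPA first.
flagged = report[report["cost"] >= 100].sort_values("cpa", ascending=False)
print(flagged.head(20))
```

The output is the same kind of list the ChatGPT prompt gives you: single words and word pairs that have accumulated real spend with a poor or non-existent return, which are your candidates for negative keywords.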
If you are watching the video version of this, you will now see me open the document where I did this for one of our ad accounts. These guys basically sell fermented plums, which, to put it plainly, help you go to the toilet better, which I know is really weird, but here we found some very, very interesting things which we wanted to look at to see whether they were worth excluding in the long run. I asked for one-word and two-word n-grams. For instance, there's a big supermarket in Germany called DM, and we had asked ourselves for a long time, looking at the search terms: if someone searches for our product and adds the word "DM" at the end or somewhere in the query, does it produce ridiculous CPAs, and should it be a negative keyword? As it turns out, it shouldn't. It actually has an acceptable CPA regardless of where that word sits in the query. Over time it has spent 1,800 and has had a very, very good CPA, so we're keeping it in there, which is really cool. In fact, even when the words "plum" and "DM" appear together as a bigram, the overall CPA is really low.

We also uncovered, for instance, that when we ran the same analysis for words like "where" or "what", the account CPA is actually really acceptable. We even looked at another direct competitor, and there we saw that the CPA is really high, so we had to exclude that competitor from our ads, because it's just wasted ad spend. So we found a lot of single words and two-word combinations which, across the ad account, produce super high CPAs and which should be excluded from the ads. That's why I would suggest running this every quarter, to see if you find any search terms or search-term combinations which, over time, have accumulated quite a lot of spend and are producing outlying results in your ad account.

So to sum it up: if you want to do some housework, use an n-gram analysis. An n-gram analysis looks at words, or combinations of words, broken out by the number of words in each combination. I suggest unigrams and bigrams, so just one-word or two-word combinations as a maximum; from three words up it gets a little too complicated and you don't have enough data for it. Download your search term data over a given period, I suggest all time, let ChatGPT perform the analysis, and you'll find some really interesting outlying results in your ad account.

If you want more tips like this, simply give the podcast a follow. If you like it, please like and subscribe, and if you're a regular listener and haven't done so yet, please leave a review; it really helps the podcast. If you have any personal questions, you can always reach out to Jeremy Young on LinkedIn, or send me an email at jeremy@younganddigital.marketing. You can also book a one-to-one consultation with us via the website, younganddigital.marketing. We have some spaces coming up in September for new clients, so if you need Q4 preparation done properly, or if you need to brush up your Google Ads account to a level where you can make the most of it over the next half a year, please reach out. We also have some one-to-one coaching spaces opening up, because we have just hired and have more capacity within the agency. So get in touch if you need any help. This has been Jeremy Young, your personal Google Ads expert, and I wish you a happy and productive week ahead.