Very soon we will see more and more inefficient search patterns and wonder where our budgets went. But there are ways to outsmart Google and deal with close variants.
Google wants to drive even more traffic to your keywords by matching search queries that wouldn’t have been triggered in the past. Their way of achieving this is through the keyword match types: close variants were introduced, and exact match is not exact anymore. This gives Google a lot of power, because they are in control of what “close variant” means.
For us marketers this means we will see more and more inefficient search patterns that we have to handle somehow. Here are my strategies for striking back:
Monitor closely how Google matches queries to keywords
My guess is that there will be changes under the hood, so we should monitor the system somehow. Here are some approaches for that (a small code sketch follows the list):
- Impression share of close variants over time
- Number of unique queries over time
- Number of unique single words over time
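To make these metrics concrete, here is a minimal monitoring sketch in Python, assuming a daily search query report exported as CSV. The file name and the column names (date, query, match_type, impressions) are assumptions – adjust them to your own export:

```python
# Minimal monitoring sketch. Assumed CSV columns: date, query,
# match_type, impressions -- adjust to your own report export.
import pandas as pd

df = pd.read_csv("search_query_report.csv")

# Impression share of close variants over time
is_cv = df["match_type"].str.contains("close variant", case=False)
cv_share = (
    df[is_cv].groupby("date")["impressions"].sum()
    / df.groupby("date")["impressions"].sum()
).fillna(0)

# Number of unique queries over time
unique_queries = df.groupby("date")["query"].nunique()

# Number of unique single words (1-grams) over time
unique_words = (
    df.assign(word=df["query"].str.lower().str.split())
      .explode("word")
      .groupby("date")["word"]
      .nunique()
)

print(pd.concat(
    {"cv_share": cv_share, "queries": unique_queries, "words": unique_words},
    axis=1,
))
```

Plot these three series over time and sudden jumps in the matching behavior become obvious.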
In my example chart you can see that the number of unique queries and words exploded from one day to the next (after 2020-01-15) without anything changing in the account. This is the main driver of the increasing number of clicks. SEO people will recognize the timing: Google rolled out a core update in January 2020.
If you have similar observations for your PPC accounts, please share!
In the following weeks we searched for and eliminated the new “noise” Google had added to our accounts’ traffic.
How to identify the “noise” in your search queries?
It is a bad idea to look at complete search queries when setting negative keywords, for several reasons:
- the sample size is very low for most queries – this means that most of the bad search patterns will still be hidden
- if you set negatives on complete queries, similar queries will still be active
- you will run out of negatives at some point if you do this on the query level (Google limits shared negative sets)
A superior approach is to transform the search queries into n-grams. Compared to just looking at complete queries, this gives you higher sample sizes on bad patterns that were hidden before. Another benefit is that you will block a lot of unknown future queries that share the same bad pattern. If possible, use 1-grams for your negatives – if you need more fine-grained negativation, drill down into 2-grams.
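Here is a small sketch of this transformation. Again, the file name, the column names (query, clicks, conversions, cost) and the cost threshold are illustrative assumptions:

```python
# N-gram transformation sketch: aggregate performance per 1-gram
# (switch to n=2 for 2-grams). Column names and the cost threshold
# are assumptions, not recommendations.
from collections import defaultdict

import pandas as pd

def ngrams(query, n):
    """Split a query into overlapping word n-grams."""
    words = query.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

df = pd.read_csv("search_query_report.csv")

stats = defaultdict(lambda: {"clicks": 0, "conversions": 0, "cost": 0.0})
for _, row in df.iterrows():
    for gram in ngrams(row["query"], n=1):
        stats[gram]["clicks"] += row["clicks"]
        stats[gram]["conversions"] += row["conversions"]
        stats[gram]["cost"] += row["cost"]

# Candidates for negatives: significant spend, zero conversions
ngram_stats = pd.DataFrame(stats).T
candidates = ngram_stats[
    (ngram_stats["cost"] > 50) & (ngram_stats["conversions"] == 0)
]
print(candidates.sort_values("cost", ascending=False).head(20))
```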
Google constantly adds “noise” to the traffic. A very efficient way to identify the noise is transforming the search queries into n-grams.
Even with this approach there will be thousands of words with a low sample size – and it is getting worse, because Google is doing more and more “smart matching”. You can bet there is a lot of noise in it.
Here are some approaches that will help you discover even more negatives (a combined code sketch follows the list):
- Use stemming algorithms on 1-grams to get the reduced form of a single word. This makes it possible to look up different forms that appeared in the queries without having enough click data of their own.
“cheapest” => “cheap”
“cheaper” => “cheap”
“cheap” as a standalone word, for example, has enough sample data to be categorized as bad – with the stemming approach we can easily identify other forms and set them negative as well.
I’m using the Snowball and Porter stemmers for this job.
- Use distance functions (e.g. Levenshtein) to identify misspellings like “chaep” or “cheep” and add them as negatives.
- Use semantic similarity to discover similar words to cheap like “budget” or “free”. I’m using a model based on Google’s word2vec to discover those similar terms.
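Here is a combined sketch of all three ideas. It assumes NLTK for the stemmer and gensim for word2vec; the word lists, thresholds and model path are illustrative placeholders:

```python
# Sketch: stemming, edit distance and semantic similarity for
# discovering additional negatives. Word lists and thresholds are
# illustrative assumptions.
from collections import defaultdict

from nltk.stem.snowball import SnowballStemmer  # PorterStemmer works too

stemmer = SnowballStemmer("english")

observed = ["cheapest", "cheaper", "cheap", "chaep", "cheep", "budget"]
bad_words = {"cheap"}  # categorized as bad from sufficient sample data

# 1) Stemming: group observed 1-grams by stem, so rare forms inherit
#    the verdict of a known-bad sibling.
by_stem = defaultdict(set)
for word in observed:
    by_stem[stemmer.stem(word)].add(word)

stem_negatives = set()
for bad in bad_words:
    stem_negatives |= by_stem[stemmer.stem(bad)] - bad_words

# 2) Edit distance: catch misspellings of known-bad words.
def levenshtein(a, b):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

typo_negatives = {
    w for w in observed
    if w not in bad_words
    and any(levenshtein(w, bad) <= 2 for bad in bad_words)
    # <= 2 catches "cheep" (1 edit) and "chaep" (a transposition, 2 edits);
    # in practice, scale the threshold with word length
}

# 3) Semantic similarity via word2vec (needs a pre-trained model; the
#    path and score threshold below are placeholders):
# from gensim.models import KeyedVectors
# vectors = KeyedVectors.load_word2vec_format("word2vec.bin", binary=True)
# semantic_negatives = {w for w, score in vectors.most_similar("cheap", topn=25)
#                       if score > 0.5}

print(stem_negatives | typo_negatives)
```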
Put it all together in a data-driven process!
Every day there are new, unseen queries. For us marketers this means continuously searching for negatives. I’m using performance-based rules on the n-gram level that alert me when new patterns appear that are candidates for negativation.
In addition to that, there is a second process: checking whether new variants of already blocked n-gram patterns have appeared.
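A rough sketch of how both processes could look, reusing the aggregated n-gram stats from above; the blocked list and thresholds are placeholders:

```python
# Daily process sketch: flag new negativation candidates by performance
# and catch new surface forms of already blocked patterns. The blocked
# list and the cost threshold are placeholders.
from nltk.stem.snowball import SnowballStemmer

stemmer = SnowballStemmer("english")

blocked = {"cheap", "free"}  # 1-grams already set as negatives
blocked_stems = {stemmer.stem(w) for w in blocked}

def daily_check(ngram_stats, cost_threshold=25.0):
    """ngram_stats: {gram: {"cost": float, "conversions": int}, ...}"""
    candidates, new_variants = [], []
    for gram, s in ngram_stats.items():
        # Process 1: enough spend with zero conversions -> candidate
        if s["cost"] > cost_threshold and s["conversions"] == 0:
            candidates.append(gram)
        # Process 2: new surface form of an already blocked pattern
        if gram not in blocked and stemmer.stem(gram) in blocked_stems:
            new_variants.append(gram)
    return candidates, new_variants
```

In practice you would run this on yesterday’s n-gram stats and push the results into an alert of your choice.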
All together, this will save you a lot of money, and you will be prepared for Google’s next change in matching logic.
Cheat Sheet
- Google’s close variants will drive more traffic to your keywords by matching search queries that wouldn’t have been triggered in the past.
- It is a bad idea to look at complete search queries when setting negative keywords. A superior approach is to transform the search queries into n-grams.
- Use stemming algorithms, distance functions and semantic similarity for better results.