Negative Keywords in Google Ads

Paying for irrelevant search queries? Add negative keywords and save budget.

Google’s recent changes to keyword match types have resulted in more poorly performing queries. Keeping control over those queries and setting negative keywords at an early stage will minimize wasted ad spend and drive revenue. The following approaches will also help you find negative keywords for Amazon or Bing PPC campaigns.


Low sample size on query level

Most of your search queries don’t have enough clicks to judge them by their conversions alone. Depending on your business’s overall conversion rate, you normally need a few hundred clicks before making a decision. Don’t evaluate complete queries in isolation – they leave you with far too little data to decide on.
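To make the “a few hundred clicks” rule concrete, here is a minimal back-of-the-envelope sketch (my own formula, not from Google): it estimates how many clicks you need before a query with zero conversions is unlikely to just be unlucky.

```python
import math

def clicks_needed(conversion_rate: float, confidence: float = 0.95) -> int:
    """How many clicks until a zero-conversion query is suspicious?
    Solves (1 - cvr)^n <= 1 - confidence for n: the chance that a
    normally converting query would show zero conversions by luck."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - conversion_rate))

# With a 2% conversion rate, a query needs ~149 clicks with zero
# conversions before it is a reasonable negative candidate.
print(clicks_needed(0.02))  # → 149
```

The lower your conversion rate, the more clicks you need – which is exactly why full queries rarely reach a decidable sample size.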

Google removed “not significant” search terms

Google announced it would remove “not significant” search queries from its reports. This means that noise in your search queries is now hidden, or at least it takes longer than before for a query to reach the “significant” level.

Search Term Reports can get quite large

More data is better. Take the biggest date range available and do not apply any filters. Yes, this means your search term reports easily reach file sizes above 100 MB. Many PPC managers filter the data beforehand because they struggle to handle those file sizes in Excel.
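Instead of filtering the export to fit into Excel, you can stream the file row by row. A minimal sketch with Python’s standard library (the column names `search_term` and `clicks` are assumptions – adjust them to your export format):

```python
import csv
from collections import defaultdict

def aggregate_report(path: str) -> dict:
    """Stream a large search term report instead of loading it whole;
    memory use stays constant regardless of file size.
    Column names here are assumptions about the export format."""
    clicks = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks[row["search_term"]] += int(row["clicks"])
    return clicks
```

A 100 MB report processes in seconds this way, with no need to pre-filter anything.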

Google limits the maximum amount of negative keywords

Google is sending more and more unique queries to your existing keywords by forcing close variant matching. For negatives, however, it imposes hard limits per account, so you have to use your negative keyword slots as efficiently as possible.


N-Gram Analysis of search terms

When you only look at complete queries, a lot of bad search components stay hidden. By splitting the search terms into single words or phrases you gain a completely new perspective on your data. The numbers will be much higher than for full queries, making it easier to get stable performance KPIs for decision making. Another benefit: using n-grams as negatives will also block future, unseen queries that share the same pattern. You can use our free n-Gram Analyzer tool to get first insights right away.
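The core idea can be sketched in a few lines (the example queries and numbers are invented for illustration):

```python
from collections import defaultdict

def ngram_stats(rows, n=1):
    """Split each search term into n-grams and aggregate clicks and
    conversions per n-gram. `rows` is (query, clicks, conversions)."""
    stats = defaultdict(lambda: [0, 0])
    for query, clicks, conversions in rows:
        words = query.lower().split()
        for i in range(len(words) - n + 1):
            gram = " ".join(words[i:i + n])
            stats[gram][0] += clicks
            stats[gram][1] += conversions
    return stats

rows = [
    ("buy red shoes", 120, 4),
    ("red shoes free", 80, 0),
    ("free shoe repair", 60, 0),
]
unigrams = ngram_stats(rows, n=1)
# "free" accumulates 140 clicks and 0 conversions across two queries --
# a pattern invisible when judging each full query on its own.
print(unigrams["free"])  # → [140, 0]
```

Run the same function with `n=2` or `n=3` to surface bad phrases as well as single words.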

Entity recognition for grouping N-Grams into bigger clusters

You will realize that even with an n-gram approach you run into cases with sparse data. By clustering similar n-grams together you suddenly increase the numbers. You can use entity recognition to create those clusters. Of course it takes some work to set up an entity database for your business, but I can tell you it is worth it.
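A minimal sketch of the clustering step, assuming a hand-maintained entity database (the entries below are invented examples – you would build your own lists for your business):

```python
# Hypothetical entity database mapping n-grams to intent clusters.
ENTITIES = {
    "free": "freebie_intent",
    "gratis": "freebie_intent",
    "cheap": "freebie_intent",
    "repair": "service_intent",
    "fix": "service_intent",
}

def cluster_ngrams(stats):
    """Roll sparse per-n-gram stats up into entity clusters.
    `stats` maps n-gram -> (clicks, conversions)."""
    clusters = {}
    for gram, (clicks, conversions) in stats.items():
        entity = ENTITIES.get(gram)
        if entity is None:
            continue
        bucket = clusters.setdefault(entity, [0, 0])
        bucket[0] += clicks
        bucket[1] += conversions
    return clusters
```

Three n-grams with 40 clicks each are hard to judge individually; one cluster with 120 clicks often is not.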

Identify close variants to existing negatives

Even when you add a negative keyword in broad match, it will not block queries when, for example, you use the singular form in your negative and the user searches for the plural. Quite annoying. The only way to deal with this is to add all those variants yourself. These cases have increased since Google started forcing its close variant matches.
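As an illustration, here is a deliberately naive sketch (my own, not a Google feature) that expands a negative list with singular/plural variants; a production version would need a proper morphology library and language-specific rules:

```python
def plural_variants(word: str) -> set:
    """Very naive English singular/plural expansion for one negative.
    Over-generation is acceptable: a non-word negative is harmless,
    a missing variant lets queries through."""
    out = {word}
    if word.endswith("s"):
        out.add(word[:-1])          # "shoes" -> "shoe"
    elif word.endswith(("sh", "ch", "x", "z")):
        out.add(word + "es")        # "box" -> "boxes"
    else:
        out.add(word + "s")         # "shoe" -> "shoes"
    return out

def expand_negatives(negatives):
    expanded = set()
    for word in negatives:
        expanded.update(plural_variants(word))
    return expanded
```

Running your whole negative list through such an expansion once catches most of the singular/plural gaps in one pass.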

Identify semantically similar words to existing negatives

Google also uses semantic similarity to match queries to your existing keywords that would not have been triggered in the past. You can use the same technology to identify words and phrases similar to your existing negatives.
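A sketch of the idea using cosine similarity over word vectors. The three-dimensional vectors below are toy values purely for illustration; in practice you would load pretrained embeddings (e.g. fastText or a sentence encoder) for your language.

```python
import math

# Toy vectors for illustration only -- replace with real embeddings.
VECTORS = {
    "free":    [0.90, 0.10, 0.00],
    "gratis":  [0.85, 0.15, 0.05],
    "premium": [0.10, 0.90, 0.20],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def similar_to(negative, threshold=0.9):
    """Words whose vector is close to an existing negative keyword."""
    base = VECTORS[negative]
    return [w for w, v in VECTORS.items()
            if w != negative and cosine(base, v) >= threshold]
```

With real embeddings, running `similar_to` over your negative list surfaces synonym candidates like “gratis” for “free” that exact matching would miss.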

Make use of the Google Ads API

If you need to handle big search term reports for multiple accounts, you should use the Google Ads API for the process. If you also want to automate the data processing and the rollout of newly identified negative keywords, the API is essential. It is also very convenient to publish changes in a few seconds via the API instead of doing so manually.
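As a starting point, here is a sketch of the GAQL query you would send through the official google-ads Python client (via `GoogleAdsService.SearchStream`). The field names match the Google Ads API’s `search_term_view`, but verify them against the API version you use:

```python
from datetime import date, timedelta

def search_term_query(days: int = 90) -> str:
    """Build a GAQL query pulling search terms with clicks, conversions
    and cost for the last `days` days."""
    end = date.today()
    start = end - timedelta(days=days)
    return (
        "SELECT search_term_view.search_term, metrics.clicks, "
        "metrics.conversions, metrics.cost_micros "
        "FROM search_term_view "
        f"WHERE segments.date BETWEEN '{start:%Y-%m-%d}' AND '{end:%Y-%m-%d}'"
    )
```

Looping this over your account IDs replaces dozens of manual report downloads with one script.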

Simulate the effects of new negatives before publishing

Sometimes decisions are made without a sufficient sample size. Being able to run simulations on your new negative set can keep you from making poor decisions. A good approach is to run the analysis on only part of your query data: after applying your cut-off rules, check the results against a held-out query set.
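The hold-out check can be sketched like this (queries and numbers invented for illustration): apply the candidate negatives to unseen queries and count what would have been blocked.

```python
def simulate_negatives(holdout, negatives):
    """Estimate what a negative set would have blocked on a hold-out
    set of (query, clicks, conversions) rows. A negative matches when
    it appears as a word in the query (simplified matching)."""
    blocked_clicks = blocked_conversions = 0
    for query, clicks, conversions in holdout:
        if set(query.lower().split()) & negatives:
            blocked_clicks += clicks
            blocked_conversions += conversions
    return blocked_clicks, blocked_conversions

holdout = [
    ("buy red shoes", 100, 5),
    ("free red shoes", 40, 0),
]
# Blocking "free" would have cost 40 clicks but zero conversions here.
print(simulate_negatives(holdout, {"free"}))  # → (40, 0)
```

If the simulation shows blocked conversions, the candidate is cutting into revenue and should be dropped or narrowed.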

Use data driven notifications for new negative candidates

How often should you look for new negatives? Some PPC managers do it once a month. The truth is, there is no single good answer. When you use all these approaches, you will see huge gains in the first weeks, but they will flatten out. In my opinion the best solution is a rule-based approach where you get notified whenever a relevant number of possible new negatives has accumulated. This will save you a lot of time.
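Such a rule can be as simple as the following sketch; the thresholds are example values you would tune to your account size.

```python
def should_notify(candidates, min_candidates=20, min_wasted_cost=100.0):
    """Fire a notification only when enough new negative candidates
    (or enough wasted spend) has accumulated.
    `candidates` is a list of (ngram, clicks, cost) tuples."""
    wasted = sum(cost for _, _, cost in candidates)
    return len(candidates) >= min_candidates or wasted >= min_wasted_cost
```

Run this daily on a schedule and you only look at the account when there is actually something to do.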

Use your Google Analytics data

After Google’s recent changes to the search term report, around 30% of queries no longer appear there. But they are still available in Google Analytics, and you should make use of them. Another great thing is that bounces give you an additional perspective for detecting noise. The great thing about bounces is that the numbers stabilize after a small number of clicks. Even when looking at n-grams, you always struggle with sparse data if you only look at conversions.
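A sketch of the bounce-based check; the thresholds are example values, and in practice you would join the Analytics bounce data onto your n-gram stats first.

```python
def bounce_flags(rows, min_clicks=30, max_bounce_rate=0.9):
    """Flag n-grams whose bounce rate is suspiciously high.
    `rows` maps ngram -> (clicks, bounces); because bounce rates
    stabilize quickly, min_clicks can be far lower than the sample
    size needed for conversion-based decisions."""
    flagged = []
    for gram, (clicks, bounces) in rows.items():
        if clicks >= min_clicks and bounces / clicks >= max_bounce_rate:
            flagged.append(gram)
    return flagged
```

An n-gram with 50 clicks and a 96% bounce rate is a strong negative candidate long before its conversion data would be conclusive.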