Finding Negative Keywords in Google Ads

Paying for irrelevant search queries? Add negative keywords and save budget.

Google’s recent changes to keyword match types often result in poorly performing queries. Taking control of those queries and setting negative keywords early minimizes wasted ad spend and improves revenue. The following approaches will also help you find negative keywords for Amazon or Bing PPC campaigns.


Low sample size on query level

You shouldn’t judge most of your search queries by looking only at their conversions. Depending on your overall conversion rate, you normally need a few hundred clicks before you can make a sound decision. Don’t evaluate complete queries in isolation, as the low sample size per query limits the decisions you can make.
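
How many clicks is "a few hundred"? A minimal sketch, using the standard normal-approximation sample size formula: the clicks needed so the 95% confidence interval around an observed conversion rate stays within a chosen margin. The 2% conversion rate and 1 percentage point margin below are illustrative assumptions, not recommendations.

```python
import math

def required_clicks(conv_rate, half_width, z=1.96):
    """Clicks needed so the 95% CI around the observed conversion
    rate is no wider than +/- half_width (normal approximation)."""
    return math.ceil(z ** 2 * conv_rate * (1 - conv_rate) / half_width ** 2)

# At a 2% conversion rate, judging a query to within +/- 1 point
# already takes several hundred clicks.
required_clicks(0.02, 0.01)
```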

Google removed irrelevant search terms

Google announced it would remove “not significant” search queries from its reports. That means the noise in your search queries is now hidden, or at least it takes longer than before for a query to reach the “significant” level.

Search Term Reports can get quite large

More data enables better analysis. Select the widest possible date range and don’t apply any filters when downloading. As a result, your Search Term Reports can easily exceed 100 MB. Most PPC managers filter the data first because they struggle to handle files of that size in Excel.
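
Instead of opening the file in Excel, you can stream it row by row so memory is never a constraint. A minimal sketch with the standard library; the column names (`search_term`, `clicks`, `conversions`) are assumptions, so adjust them to your export’s actual headers.

```python
import csv
from collections import defaultdict

def aggregate_report(path):
    """Stream a large search term report and aggregate clicks and
    conversions per query, without loading the whole file at once."""
    totals = defaultdict(lambda: [0, 0.0])  # query -> [clicks, conversions]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["search_term"]]  # assumed column names
            t[0] += int(row["clicks"])
            t[1] += float(row["conversions"])
    return dict(totals)
```

Because the file is read line by line, a 100 MB report is no harder to process than a 1 MB one.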

Google limits the maximum amount of negative keywords

Google sends more and more unique queries to your existing keywords through close variant matching. For negatives, however, it imposes hard limits per account, so you have to find a way to use them effectively.


N-Gram Analysis of search terms

When you look at complete queries, a lot of bad search components stay hidden. By splitting the search terms into single words or phrases, you get a completely new perspective on your data. The numbers per component will be higher than for full queries, making it easier to get stable performance KPIs for your decisions. Another benefit: using N-Grams as negatives will also block future, unseen queries that share the same pattern. You can use our free N-Gram Analyzer tool to get first insights right away.
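
The splitting step can be sketched in a few lines. This is an illustrative implementation, not the tool’s code: each query is decomposed into its n-grams, and clicks and conversions are summed per n-gram across all queries.

```python
from collections import defaultdict

def ngram_stats(query_rows, n=1):
    """Aggregate clicks/conversions per n-gram.
    query_rows: iterable of (query, clicks, conversions)."""
    stats = defaultdict(lambda: [0, 0.0])  # ngram -> [clicks, conversions]
    for query, clicks, conversions in query_rows:
        words = query.split()
        for i in range(len(words) - n + 1):
            gram = " ".join(words[i:i + n])
            s = stats[gram]
            s[0] += clicks
            s[1] += conversions
    return dict(stats)
```

A word like “free” that appears in hundreds of different zero-converting queries is invisible per query, but jumps out immediately in the unigram (n=1) view.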

Entity recognition for grouping N-Grams into bigger clusters

Even with an N-Gram approach, you’ll still run into cases with sparse data. By clustering similar N-Grams together, you suddenly increase the numbers. You can use entity recognition to create those clusters. Of course, it takes some time and effort to build an entity database for your business, but it’ll eventually be worth it.
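
Once you have an entity database, the rollup itself is simple. A minimal sketch, assuming the database is a mapping from n-gram to entity label (the entity names below are invented examples):

```python
def cluster_ngrams(gram_stats, entity_db):
    """Roll sparse per-n-gram stats up into entity clusters.
    gram_stats: {ngram: [clicks, conversions]}
    entity_db:  {ngram: entity_label} built for your business."""
    clusters = {}
    for gram, (clicks, conversions) in gram_stats.items():
        entity = entity_db.get(gram, gram)  # unknown grams stay on their own
        c = clusters.setdefault(entity, [0, 0.0])
        c[0] += clicks
        c[1] += conversions
    return clusters
```

Three sparse n-grams like “free”, “gratis”, and “for free” become one FREEBIE cluster with enough clicks to judge.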

Identify close variants to existing negatives

Even when you add a negative keyword in broad match, it won’t block queries when your negative uses the singular and the user searches for the plural version. Quite annoying. The only way to deal with it is to add all those variants yourself. These cases have increased since Google started forcing close variant matching.
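
Generating singular/plural candidates can be partly automated. A naive sketch for English words only; real close variants also include typos, stemming, and reordering, which this deliberately ignores:

```python
def singular_plural(word):
    """Naive English singular/plural variant candidates for one
    negative keyword. Review candidates before adding them."""
    out = {word}
    if word.endswith("ies"):
        out.add(word[:-3] + "y")           # batteries -> battery
    elif word.endswith("s"):
        out.add(word[:-1])                 # shoes -> shoe
    elif word.endswith("y") and word[-2] not in "aeiou":
        out.add(word[:-1] + "ies")         # battery -> batteries
    else:
        out.add(word + "s")                # hat -> hats
    return out
```

Run it over your existing negative list and add the variants that make sense; the rules are heuristics, so a quick manual review is worthwhile.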

Identify semantic similar words to existing negatives

Google also uses semantic similarity to match queries to your existing keywords, including queries that wouldn’t have triggered them in the past. You can use the same kind of technology to identify words and phrases similar to your existing negatives.
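
The core mechanism is cosine similarity between word embeddings. A minimal sketch, assuming you already have an embedding lookup (in practice a pre-trained model such as word2vec or fastText; the threshold of 0.8 is an illustrative assumption):

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def similar_to_negatives(negatives, vocab, embed, threshold=0.8):
    """Flag vocabulary words whose embedding is close to any
    existing negative. embed: {word: vector}."""
    hits = {}
    for word in vocab:
        for neg in negatives:
            if word != neg and cosine(embed[word], embed[neg]) >= threshold:
                hits[word] = neg
    return hits
```

With a real embedding model, a negative like “free” would surface candidates such as “gratis” or “giveaway” that plain string matching never finds.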

Make use of the Google Ads API

If you need to handle big search term reports for multiple accounts, you should make use of the Google Ads API. The API is also essential for automating the data processing and the rollout of newly identified negative keywords. It’s very convenient to publish changes in a few seconds via the API instead of doing this manually.
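
As a starting point, the search term report is pulled with a GAQL query against the `search_term_view` resource. A minimal sketch of the query builder; the result string is what you would pass to `GoogleAdsService.search_stream()` in the official `google-ads` Python client:

```python
def search_term_query(date_range="LAST_30_DAYS"):
    """Build a GAQL query for the search term report.
    Pass the result to GoogleAdsService.search_stream()."""
    return (
        "SELECT search_term_view.search_term, "
        "metrics.clicks, metrics.conversions "
        "FROM search_term_view "
        f"WHERE segments.date DURING {date_range}"
    )
```

Once the pipeline runs against the API, fetching reports and uploading negatives for dozens of accounts becomes a scheduled job rather than a manual chore.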

Simulate the effects of new negatives before publishing

Sometimes you have to decide without a sufficient sample size. Running simulations on your new negative set helps you avoid bad decisions. Want to learn a good approach? Don’t run the analysis on the complete set of your queries. After applying your cut-off rules to one part, check the results against a hold-out query set.
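
The hold-out idea can be sketched directly: derive negatives on a training split, then measure what they would have blocked in the untouched hold-out split. The `derive_negatives` callable and the 20% split are illustrative assumptions.

```python
import random

def holdout_check(query_rows, derive_negatives, holdout_frac=0.2, seed=0):
    """Derive negatives on a training split, then count the clicks and
    conversions those negatives would block in the hold-out split.
    query_rows: iterable of (query, clicks, conversions)."""
    rows = list(query_rows)
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - holdout_frac))
    train, holdout = rows[:cut], rows[cut:]
    negatives = derive_negatives(train)
    blocked = [r for r in holdout
               if any(neg in r[0].split() for neg in negatives)]
    blocked_clicks = sum(r[1] for r in blocked)
    blocked_convs = sum(r[2] for r in blocked)
    return negatives, blocked_clicks, blocked_convs
```

If the hold-out shows your candidate negatives would block real conversions, tighten the cut-off rules before publishing anything.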

Use data driven notifications for new negative candidates

How often should you look for new negatives? Some PPC managers do it once a month. The truth is, there is no single good answer to that question. When you apply all these approaches, you’ll see big gains in the first weeks, but the results will stabilize after that. The best solution is a rule-based approach, where you get notified when a relevant amount of possible new negatives has accumulated. It’ll save you a lot of time.
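
Such a rule can be a two-line check in the pipeline. A minimal sketch; the thresholds (20 candidates or 500 wasted clicks) are illustrative assumptions you would tune per account:

```python
def should_notify(candidates, min_candidates=20, min_wasted_clicks=500):
    """Fire a notification only when enough new negative candidates,
    or enough wasted clicks, have piled up.
    candidates: iterable of (term, wasted_clicks)."""
    items = list(candidates)
    wasted = sum(clicks for _, clicks in items)
    return len(items) >= min_candidates or wasted >= min_wasted_clicks
```

Hook the check into your daily report job and send the alert by mail or Slack; quiet weeks then cost you no review time at all.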

Use your Google Analytics data

After the update to the Google Search Terms Report, around 30% of the queries no longer appear there. But they’re still available in Google Analytics, and you should make use of them. On top of that, you can use bounces as an additional signal for detecting noise. The great thing about bounces is that you get stable numbers after just a few clicks, whereas even with N-Grams you often struggle with sparse data when looking only at conversions.
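
Combining the two ideas, you can compute a bounce rate per query word from Analytics data. A minimal sketch; the row format `(query, sessions, bounces)` is an assumption about how you export the data:

```python
from collections import defaultdict

def bounce_rate_by_word(ga_rows):
    """Bounce rate per query word from Google Analytics data.
    ga_rows: iterable of (query, sessions, bounces)."""
    agg = defaultdict(lambda: [0, 0])  # word -> [sessions, bounces]
    for query, sessions, bounces in ga_rows:
        for word in set(query.split()):  # count each word once per query
            agg[word][0] += sessions
            agg[word][1] += bounces
    return {w: b / s for w, (s, b) in agg.items() if s}
```

Words with a bounce rate far above your account average are strong negative candidates long before conversion data could tell you anything.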