
Google Ads · Diagnostics

How to read a Google Ads search terms report (and what to do with what you find)

Published April 22, 2026 · 11 minute read · By Stan Tscherenkow

Quick answer

The Google Ads search terms report lists the actual user queries that triggered your ads, not the keywords in your account. To read it, open Insights and reports, then Search terms, set a 7 to 30 day window, sort by cost, and review the top 40 rows. Flag irrelevant queries for the shared negative list. Promote converting queries to exact match. Review weekly.

Key takeaways

  1. The search terms report shows what Google is actually spending your budget on. It is not the same as your keyword list. In broad match accounts the difference can be 60% or more.
  2. Every irrelevant search term in the report is money spent on a query you did not approve. The report makes wasted spend visible and actionable.
  3. The search terms report is the source material for your negative keyword list. Reviewing it weekly and adding negatives is not optional maintenance. It is structural account management.
  4. High-volume search terms that are converting should be added as exact match keywords. Leaving them to broad match means the algorithm controls the match, not you.
  5. Search terms that match your keywords but have zero conversions over 30+ days and meaningful spend are a signal to add as negatives or reduce match type precision.
  6. The search terms report reveals intent signals your keyword list did not anticipate. New product categories, competitor comparisons, and geographic modifiers all appear here before they appear in your keyword strategy.

What this article covers

  1. Why the search terms report is not the same as your keyword list
  2. How to access and read the report
  3. Identifying wasted spend: what to look for
  4. Building your negative keyword list from search terms data
  5. Identifying converting terms to add as exact match
  6. The weekly review process: what to do and how long it takes
  7. A four-step weekly search terms review process
  8. Questions operators ask about the search terms report
  9. Final thoughts

Most Google Ads accounts are optimized from the wrong document. Operators look at the keyword list, tune bids, rewrite ad copy, and never open the report that shows what Google actually bought with the budget. The search terms report is that report. It is the single diagnostic that separates accounts run by the keyword list from accounts run by what the algorithm is delivering against the keyword list. After twenty years of paid media work and 40-plus Google Ads account audits, the pattern is consistent: the accounts that get reviewed weekly at the query level outperform the accounts that do not, by margins large enough to change what the business can afford to spend. For the full pillar, see the Google Ads guides collection.

Why the search terms report is not the same as your keyword list

The keyword list is a statement of intent. The search terms report is the record of what Google delivered. These are not the same thing, and in broad match accounts they diverge far enough that most operators would not recognize their own account from the query log. A single broad match keyword for "running shoes" can trigger on "marathon training advice," "nike outlet near me," "shoe repair," and three dozen other queries that share no commercial intent with the original bid. Smart bidding makes this worse, not better. The algorithm will chase conversion signal into query space the advertiser never approved.

What the divergence looks like in practice is a distribution. At one end sit tightly managed accounts where the delivered queries largely match the keyword list. At the other sit broad match accounts where 60 percent or more of spend goes to queries nobody approved. Knowing which end of that distribution you are on is the first honest read of the account.

How to access and read the report

The path in the current Google Ads interface is Insights and reports, then Search terms. From a specific Search or Shopping campaign, the Insights tab contains the same view scoped to that campaign. Set the date range to either the last 7 days for weekly maintenance or the last 30 days for a monthly strategic review. Sort by Cost descending. This is the only sort that matters for a diagnostic pass. Sorting by impressions or clicks surfaces volume, which is a vanity view. Sorting by cost surfaces where the budget went.

Read the top 40 rows. In most accounts, those 40 queries represent 65 to 80 percent of total spend. The tail below row 40 matters for long-term pattern spotting but does not drive the weekly decisions. Columns to keep visible: search term, match type, cost, conversions, and cost per conversion.

Hide clicks and impressions during a diagnostic pass. They pull attention toward volume and away from money.
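The same diagnostic pass can be run on a CSV export of the report. The sketch below is a minimal Python illustration under assumptions: the column names (`search_term`, `match_type`, `cost`, `conversions`) are placeholders and the sample rows are invented, so match them to your actual export before reusing it.

```python
import csv
import io

# Sample export -- invented rows, placeholder column names.
SAMPLE_CSV = """search_term,match_type,cost,conversions
running shoes,exact,120.50,4
marathon training advice,broad,80.00,0
nike outlet near me,broad,45.25,0
trail running shoes,phrase,30.00,1
shoe repair,broad,12.00,0
"""

def top_rows_by_cost(csv_text, n=40):
    """Sort by cost descending and report the top-n rows' share of spend."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for r in rows:
        r["cost"] = float(r["cost"])
    rows.sort(key=lambda r: r["cost"], reverse=True)  # cost descending
    top = rows[:n]
    total = sum(r["cost"] for r in rows)
    share = sum(r["cost"] for r in top) / total if total else 0.0
    return top, share

top, share = top_rows_by_cost(SAMPLE_CSV, n=3)
print(f"Top 3 rows carry {share:.0%} of spend")  # prints "Top 3 rows carry 85% of spend"
```

On a real export with thousands of rows, the same cost-descending sort and top-N share calculation shows how concentrated spend actually is.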

Identifying wasted spend: what to look for

Wasted spend in the search terms report has a recognizable shape. The five patterns to flag on sight:

  1. Branded queries for companies you do not compete with, the category that surfaces fastest, appearing because broad match decided the queries were topically related.
  2. Competitor brand names you did not intentionally target.
  3. Job search terms ("careers," "jobs," "salary," "hiring").
  4. Free-intent modifiers ("free," "DIY," "how to," "template," "tutorial").
  5. Adjacent category terms where someone is searching for a larger, smaller, or different product.

The test I use is blunt. Would the business knowingly pay $15 to be shown for this exact query? If the answer is no, the query is wasted spend and belongs on the negative list. The marginal time cost of flagging 10 wasted queries is about three minutes.
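A first triage pass over an export can be mechanized. This is a minimal sketch, assuming a crude substring match is good enough for flagging; the word lists come straight from the patterns named above, and anything flagged still needs the human "would we pay $15 for this?" test.

```python
# Triage flags for the waste patterns named above. Substring matching
# is deliberately crude: the output is a review queue, not a verdict.
WASTE_PATTERNS = {
    "job intent": ["careers", "jobs", "salary", "hiring"],
    "free intent": ["free", "diy", "how to", "template", "tutorial"],
}

def flag_wasted(term):
    """Return the list of waste-pattern labels a search term matches."""
    t = term.lower()
    return [label for label, words in WASTE_PATTERNS.items()
            if any(w in t for w in words)]

for query in ["nike careers", "free running plan template", "buy trail shoes"]:
    print(query, "->", flag_wasted(query))
```

The branded and adjacent-category patterns need account-specific word lists, which is why they are left out of the sketch.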

Building your negative keyword list from search terms data

Negative keywords are the operational output of the search terms report. The review has no value if the flagged queries do not end up excluded. Most accounts fail here because negatives get added ad-hoc at the ad group level, scattered across campaigns, with no central list anyone can audit. The correct structure is a single shared negative keyword list applied to every search-intent campaign, plus one or two campaign-specific lists for genuinely localized exclusions.

What to decide for each flagged query: which list it belongs on (the shared list for most exclusions, a campaign-specific list for the genuinely localized ones) and which match type the negative should use, chosen per the query's intent.

Add negatives in batches, not one by one. Selecting 15 rows, clicking Add as negative keyword, and choosing the shared list takes 90 seconds. Doing the same work one query at a time takes ten minutes and produces the same outcome. Batch the decision once you have read the top 40.

If you want the exact priority list for your specific account rather than the general framework, the Conversion Second Opinion delivers it in 72 hours.

Identifying converting terms to add as exact match

The offensive half of the review is as important as the defensive half. Search terms that converted deserve to be promoted into the account as exact match keywords in their most relevant ad group. The reason is not cosmetic. Leaving a proven converting query in broad or phrase match means the algorithm decides when to show against it. Promoting it to exact match means the query has its own keyword with its own quality score, its own bid, and its own place in the auction insights. That is control, not semantics.

The promotion criteria I use: at least one conversion, a cost per conversion inside the account's acquisition target, and at least three clicks over the review window.

Once promoted, add the original broader-match version as a negative in the old ad group if the query was being captured there inefficiently. Otherwise the new exact match keyword will compete with its own ancestor for impression share.

The weekly review process: what to do and how long it takes

A weekly search terms review on a well-structured account takes 20 to 30 minutes. On a neglected account the first review takes 90 minutes because the backlog is long. That is fine. The first review is a one-time cleanup. After that, the weekly cadence keeps the backlog from forming again. Accounts that get reviewed every 60 days produce a different shape of maintenance: the negative list is always catching up, the algorithm is always training on queries that should have been excluded, and the monthly report always has a "we found more waste" line.

What the 30-minute cadence looks like: set the date range and sort by cost (5 minutes), flag wasted queries for negatives (10 minutes), promote converting terms to exact match (10 minutes), and document the outcome (5 minutes). The four-step framework in the next section walks through each step.

That cadence is the difference between account management and account maintenance. Both are necessary. Only one is billable in a well-run engagement.

The framework

A four-step weekly search terms review process

  1. Set the date range and sort by cost (5 minutes)

    Insights, Search terms. Last seven days. Sort by cost descending. Keep search term, match type, cost, conversions, and cost per conversion visible. Hide clicks and impressions during the diagnostic pass.

  2. Flag wasted queries for negatives (10 minutes)

    Read each of the top 40 queries. Select any query the business would not knowingly pay for. Add them as a batch to the shared negative keyword list, not per ad group. Choose match type per query intent.

  3. Promote converting terms to exact match (10 minutes)

    Filter for queries with at least one conversion inside the cost per acquisition target and at least three clicks. Add each as an exact match keyword in the most relevant ad group. Ad copy should already align.

  4. Document and move on (5 minutes)

    Log the date, the number of negatives added, the exact match keywords promoted, and the weekly spend on irrelevant queries. One spreadsheet row. The log makes the trend visible across months.
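Step 4's one-row log can live in a plain CSV. The sketch below is one way to keep it, assuming a local file; the file name and column names are arbitrary choices, not a prescribed format.

```python
import csv
import datetime
import pathlib

# Arbitrary file and column names -- adjust to taste.
LOG = pathlib.Path("search_terms_review_log.csv")
FIELDS = ["date", "negatives_added", "exact_promoted", "irrelevant_spend"]

def log_review(negatives_added, exact_promoted, irrelevant_spend):
    """Append one review row; write the header on first use."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "date": datetime.date.today().isoformat(),
            "negatives_added": negatives_added,
            "exact_promoted": exact_promoted,
            "irrelevant_spend": irrelevant_spend,
        })

log_review(negatives_added=12, exact_promoted=3, irrelevant_spend=241.50)
```

Over months, the irrelevant-spend column becomes the trend line that shows whether the weekly cadence is working.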

Questions operators ask about the search terms report

How often should I review the search terms report?

Every seven to fourteen days for an active account spending $3,000 or more per month. Smaller accounts can run a 30-day cadence. The frequency is dictated by spend velocity, not preference. A week of unreviewed broad match on a $10,000 monthly budget is roughly $2,300 of queries nobody approved. Short review intervals cost less than skipped ones.

What is the difference between search terms and keywords in Google Ads?

Keywords are the words you bid on. Search terms are the actual queries typed by users that triggered your ads. Match type determines how loosely Google pairs the two. A single phrase match keyword can produce hundreds of unique search terms. The keyword is your intent. The search term is what Google delivered against that intent.

How do I add negative keywords from the search terms report?

Select the rows you want to exclude, click Add as negative keyword above the table, then choose whether to add at the ad group, campaign, or shared negative list level. Most negatives belong on a shared list applied to every search-intent campaign. Account-wide negatives should live in one master list reviewed monthly, not scattered across ad groups.

What percentage of search terms should be irrelevant in a well-managed account?

Under 15 percent of total spend on irrelevant queries is a well-managed account. Fifteen to 30 percent is typical, fixable within a month. Above 30 percent is a structural problem tied to match type strategy, negative list maintenance, or thin keyword themes. The benchmark applies to spend share, not row count. Cost, not query volume, is the measure.
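The benchmark reduces to a spend-share calculation. A minimal sketch of the thresholds exactly as stated above:

```python
def classify_waste_share(irrelevant_cost, total_cost):
    """Classify an account by the share of spend going to irrelevant queries."""
    share = irrelevant_cost / total_cost
    if share < 0.15:
        return share, "well managed"
    if share <= 0.30:
        return share, "typical, fixable within a month"
    return share, "structural problem"

share, verdict = classify_waste_share(irrelevant_cost=2_100, total_cost=10_000)
print(f"{share:.0%} of spend is irrelevant: {verdict}")
```

Note that the inputs are cost figures, not query counts, matching the spend-share framing of the benchmark.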

Can I see search terms for Performance Max campaigns?

Partially. Google released search theme and search category insights inside Performance Max, visible under the Insights tab at the campaign level. Raw query-level data is restricted. The practical implication is that negatives must be added through account-level negative keyword lists or campaign-level exclusions, not reactively from a full query list like Search campaigns allow.

Final thoughts

The search terms report is boring work. It is not the part of the account a new operator wants to spend time on. It is the part of the account that separates a diagnostic practitioner from a dashboard watcher. Twenty minutes a week of query-level review produces better outcomes than 20 hours a week of bid tweaking, because bids are a response to the conversion signal and the conversion signal is built from the queries the report shows you.

Most accounts that look "optimized" by agency standards are actually well-reported rather than well-managed. The reporting covers the conversions, the ROAS, and the month-over-month change. It does not cover whether the top 40 queries the account paid for were queries the business wanted to pay for. When the gap between those two things gets large enough, no bid adjustment can close it. Only negatives and promoted exact matches can.

When the pattern of wasted spend is too entangled to unwind alone, or when the account spans multiple campaign types with overlapping queries and the weekly review no longer fits in thirty minutes, that is the point to bring in structured help. Stan Consulting offers Google Ads management once the diagnostic is complete, and the Conversion Second Opinion is the entry point for anyone who wants the findings before the management.

Related: the full marketing guides collection covers Shopify, conversion, strategy, and agency management.

The engagement format

If this is bigger than a campaign fix, the Revenue Sprint handles the full build.

$5,000. One engagement. Diagnosis, build, and fix. No retainer after.

See the Revenue Sprint