My findings this week show that Google Search Console data is about 75% incomplete, making single-source GSC decisions dangerously unreliable.
1. GSC Used To Be Ground Truth
Search Console data was once the most accurate representation of what happens in the search results. But privacy sampling, bot-inflated impressions, and AI Overview (AIO) distortion drain the reliability out of the data.
Without understanding how your data is filtered and skewed, you risk drawing the wrong conclusions from GSC data.
SEO data has been on a long path of becoming less reliable, starting with Google killing the keyword referrer and continuing with the exclusion of critical SERP features from performance results. But three key events over the past year topped it off:
- January 2025: Google deploys "SearchGuard," requiring JavaScript and (subtle) CAPTCHA challenges for anyone accessing search results (it turns out Google uses plenty of advanced signals to differentiate humans from scrapers).
- March 2025: Google significantly ramps up the number of AI Overviews in the SERPs. We see a significant spike in impressions and a drop in clicks.
- September 2025: Google removes the num=100 parameter, which SERP scrapers used to parse the search results. The impression spike normalizes; clicks stay down.
On one hand, Google took measures to clean up GSC data. On the other hand, the data still leaves us with more open questions than answers.
2. Privacy Sampling Hides 75% Of Queries
Google filters out a significant number of impressions (and clicks) for "privacy" reasons. A year ago, Patrick Stox analyzed a large dataset and concluded that almost 50% are filtered out.
I repeated the analysis (10 B2B sites out of the USA) across ~4 million clicks and ~450 million impressions.
Methodology:
- Google Search Console (GSC) provides data through two API endpoints that reveal its filtering behavior. The aggregate query (no dimensions) returns total clicks and impressions, including all data. The query-level query (with the "query" dimension) returns only queries meeting Google's privacy threshold.
- By comparing these two numbers, you can calculate the filter rate.
- For example, if aggregate data shows 4,205 clicks but query-level data only shows 1,937 visible clicks, Google filtered 2,268 clicks (53.94%).
- I analyzed 10 B2B SaaS sites (~4 million clicks, ~450 million impressions), comparing 30-day, 90-day, and 12-month periods against the same analysis from a year prior.
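The filter-rate calculation above can be sketched in a few lines of Python. The function names are illustrative, and the numbers are the example figures from the methodology, not live API output:

```python
def filter_rate(aggregate_total: int, query_level_total: int) -> dict:
    """Compare the GSC aggregate total (no dimensions) against the
    query-level total (with the 'query' dimension) to estimate how much
    data the privacy threshold filters out."""
    filtered = aggregate_total - query_level_total
    return {
        "filtered": filtered,
        "filter_rate_pct": round(filtered / aggregate_total * 100, 2),
    }

# Example figures from above: 4,205 aggregate clicks vs. 1,937 visible clicks.
clicks = filter_rate(aggregate_total=4205, query_level_total=1937)
print(clicks)  # {'filtered': 2268, 'filter_rate_pct': 53.94}
```

Run the same comparison for impressions, and for each of the 30-day, 90-day, and 12-month windows, to reproduce the analysis.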
My conclusions:
1. Google filters out ~75% of impressions.

- The filter rate on impressions is extremely high, with three-fourths filtered for privacy.
- A year ago, the rate was only 2 percentage points higher.
- The range I observed went from 59.3% all the way up to 93.6%.

2. Google filters out ~38% of clicks, but ~5% less than a year ago.

- Click filtering is not something we talk about a lot, but it seems Google doesn't report over a third of all clicks that happened.
- A year ago, Google filtered out over 40% of clicks.
- The range of filtering spans from 6.7% to 88.5%!

The good news is that the filter rate has gone slightly down over the past year, probably due to fewer "bot impressions."
The bad news: The core problem persists. Even with these improvements, 38% click-filtering and 75% impression-filtering remain catastrophically high. A 5% improvement doesn't make single-source GSC decisions reliable when three-fourths of your impression data is missing.
3. 2025 Impressions Are Highly Inflated

The past year shows a rollercoaster of GSC data:
- In March 2025, Google intensified the rollout of AIOs and surfaced 58% more of them for the sites I analyzed.
- In July, impressions grew by 25.3%, and by another 54.6% in August. SERP scrapers somehow found a way around SearchGuard (the bot protection Google uses to block SERP scraping), causing "bot impressions" that capture AIOs.
- In September, Google removed the num=100 parameter, which caused impressions to drop by 30.6%.

Fast forward to today:
- Clicks decreased by 56.6% since March 2025.
- Impressions normalized (down 9.2%).
- AIOs decreased by 31.3%.
I cannot arrive at a causal number for clicks lost to AIOs, but the correlation is strong: 0.608. We know AIOs reduce clicks (it makes logical sense), but we don't know exactly how much. To figure that out, I'd have to measure CTR for queries before and after an AIO shows up.
But how do you know a click decline is due to an AIO and not just poor content quality or content decay?
Look for temporal correlation:
- Track when your clicks dropped against Google's AIO rollout timeline (the March 2025 spike). Poor content quality shows a gradual decline; AIO impact is sharp and query-specific.
- Cross-reference with position data. If rankings hold steady while clicks drop, that signals AIO cannibalization. Check whether the affected queries are informational (AIO-prone) vs. transactional (AIO-resistant). The 0.608 correlation coefficient between AIO presence and click reduction supports this diagnostic approach.
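That cross-referencing step can be operationalized as a simple flag over per-query before/after snapshots. This is a minimal sketch under stated assumptions: the record format, sample queries, and thresholds are all illustrative, not from the dataset:

```python
# Each record holds a query's average position and clicks before/after the
# March 2025 AIO rollout (illustrative data, not a real GSC export).
queries = [
    {"query": "what is churn rate", "pos_before": 3.1, "pos_after": 3.0,
     "clicks_before": 900, "clicks_after": 310},
    {"query": "buy crm software", "pos_before": 4.2, "pos_after": 4.3,
     "clicks_before": 500, "clicks_after": 480},
]

def likely_aio_cannibalization(q, max_pos_shift=0.5, min_click_drop=0.3):
    """Rankings hold steady while clicks fall sharply -> AIO suspect."""
    position_stable = abs(q["pos_after"] - q["pos_before"]) <= max_pos_shift
    click_drop = 1 - q["clicks_after"] / q["clicks_before"]
    return position_stable and click_drop >= min_click_drop

suspects = [q["query"] for q in queries if likely_aio_cannibalization(q)]
print(suspects)  # ['what is churn rate']
```

The informational query loses two-thirds of its clicks at a stable position and gets flagged; the transactional query barely moves and does not.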
4. Bot Impressions Are Rising

I have reason to believe that SERP scrapers are coming back. We can measure the volume of impressions likely caused by bots by filtering GSC data for queries that contain more than 10 words and at least two impressions. The chance that such a long query (prompt) is used by a human twice is close to zero.
The logic of bot impressions:
- Hypothesis: Humans rarely search for the exact same 5+ word query twice in a short window.
- Filter: Identify queries with 10+ words that have >1 impression but zero clicks.
- Caveat: This method may capture some legitimate zero-click queries, but it provides a directional estimate of bot activity.
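The filter above is straightforward to run against exported query rows. A minimal sketch, assuming a row format that mirrors a typical GSC export; the sample rows are invented:

```python
def is_likely_bot(row, min_words=10, min_impressions=2):
    """10+ word query, repeated impressions, zero clicks -> likely scraper."""
    return (len(row["query"].split()) >= min_words
            and row["impressions"] >= min_impressions
            and row["clicks"] == 0)

def bot_impression_share(rows):
    """Percentage of total impressions attributable to likely-bot queries."""
    bot = sum(r["impressions"] for r in rows if is_likely_bot(r))
    total = sum(r["impressions"] for r in rows)
    return round(bot / total * 100, 2)

# Illustrative rows, not real GSC data.
rows = [
    {"query": "best crm for small b2b saas teams with free trial and api access",
     "impressions": 40, "clicks": 0},
    {"query": "crm software", "impressions": 960, "clicks": 55},
]
print(bot_impression_share(rows))  # 4.0
```

Running this over 30-, 90-, and 180-day windows gives you the trend line, which matters more than any single reading.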
I compared these queries over the last 30, 90, and 180 days:
- Queries with 10+ words and more than one impression grew by 25% over the last 180 days.
- The range of bot impressions spans from 0.2% to 6.5% (last 30 days).
Here's what you can expect as a "normal" percentage of bot impressions for a typical SaaS site:
- Based on the 10-site B2B dataset, bot impressions range from 0.2% to 6.5% over 30 days, for queries containing 10+ words and 2+ impressions but 0 clicks.
- For SaaS specifically, expect a 1-3% baseline for bot impressions. Sites with extensive documentation, technical guides, or programmatic SEO pages trend higher (4-6%).
- The 25% growth over 180 days suggests scrapers are adapting post-SearchGuard. Monitor your percentile position within this range more than the absolute number.
Bot impressions don't affect your actual rankings – just your reporting, by inflating impression counts. The practical impact? Misallocated resources if you optimize for inflated-impression queries that humans never search for.
5. The Measurement Layer Is Broken
Single-source decisions based on GSC data alone become dangerous:
- Three-fourths of impressions are filtered.
- Bot impressions generate up to 6.5% of data.
- AIOs reduce clicks by over 50%.
- User behavior is structurally changing.
Your opportunity is in the methodology: Teams that build robust measurement frameworks (sampling-rate scripts, bot-share calculations, multi-source triangulation) have a competitive advantage.
Featured Image: Paulo Bobita/Search Engine Journal
