I knew something was wrong when three different clients pinged me on the same Tuesday morning, asking why their rankings had vanished.
Not dropped. Vanished.
One minute, we were tracking keywords at position 45, slowly climbing toward the top of page one. The next, the dashboard just stopped at position 20 and acted like the rest of the internet did not exist.
This was September 2025. Google quietly killed the &num=100 parameter with zero warning, leaving every SEO tool that relied on it suddenly blind past page two.
I manage over 500 websites through Bright Vessel, sit on the WooCommerce Advisory Board, and have seen Google pull some moves before. But this one hit different because it broke historical data, made tools choose between accuracy and speed, and forced everyone to explain to clients why their reports suddenly have gaps.
This article is what actually happened, which tools survived, and what I did about it.
What Google Actually Broke
For years, SEO tools used a simple URL parameter to fetch 100 search results in a single request, rather than scraping 10 separate pages.
The parameter was &num=100, and it worked beautifully. Clean, fast, efficient.
Google removed it on September 10-11, 2025. No announcement. No heads up. Just gone.
The immediate damage:
- Tools that relied on it suddenly needed 10 separate queries to get the same data.
- Costs increased by 10x overnight.
- Many tools just gave up and capped results at position 20.
- Historical data now shows a permanent break starting in September.
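Mechanically, the change is simple: one request that used to return everything now takes ten. A sketch in Python shows the request math (illustrative only; the URL parameters `num` and `start` are real Google query parameters, but rank trackers run fetching through their own infrastructure, not a script like this):

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def old_style_url(query: str) -> str:
    """Before September 2025: one request returned 100 results."""
    return f"{BASE}?{urlencode({'q': query, 'num': 100})}"

def new_style_urls(query: str, depth: int = 100, per_page: int = 10) -> list:
    """After the change: one request per page of 10 results,
    walking the `start` offset parameter instead."""
    return [
        f"{BASE}?{urlencode({'q': query, 'start': start})}"
        for start in range(0, depth, per_page)
    ]

print(len(new_style_urls("blue widgets")))  # 10 requests where 1 used to do
```

Ten requests per keyword per check, multiplied across millions of tracked keywords, is where the 10x cost figure comes from.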
Barry Schwartz confirmed the change at Search Engine Roundtable. Search Engine Journal documented the cost implications. And I got to spend a week explaining to clients why their rank-tracking graphs suddenly looked like someone had taken scissors to them.
How the Major SEO Tools Responded
Every tool had to choose: absorb massive cost increases, build pagination workarounds, or just give up on deep tracking. Here is what each one did.
Ahrefs: Gave Up on Deep Tracking
Status: Limited to the top 20 positions now.
What they said: Tracking beyond position 20 is not economically viable.
Reality: You are entirely blind to anything past page two.
If you track long-tail keywords slowly climbing from position 60 into striking distance, Ahrefs no longer helps you. They made a business decision, and it makes sense for them. But it does not make sense for my workflow.
seoClarity: Absorbed the Costs
Status: Still provides full 100 results.
Trade-off: Reports refresh much more slowly now.
Cost: They are eating 10x processing costs, at least for now.
I still use seoClarity as a primary tool because I need complete data. But I am not naive. Those costs will eventually be passed on to customers. I am just enjoying the window before the price increase email lands.
STAT and Moz Pro: Fixed with Pagination
Status: Restored complete data through automated pagination.
Issues: Historical graphs show a clear break from September forward.
Performance: Back to normal, just with visible data gaps you have to explain.
STAT handled this well. They built a pagination solution that works. The data is complete again. But every client report now needs a footnote about September 2025, and that footnote will live in those reports forever.
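STAT's actual implementation is not public, but a pagination-and-stitch workaround of this kind can be sketched as follows. The `fetch_page` callable is a hypothetical stand-in for a real SERP fetcher; the stitching logic is the point:

```python
from typing import Callable, Dict, List

def track_top_100(query: str,
                  fetch_page: Callable[[str, int], List[str]],
                  per_page: int = 10,
                  depth: int = 100) -> Dict[str, int]:
    """Stitch paginated SERP fetches back into one position map.
    fetch_page(query, offset) is assumed to return the ordered
    result URLs on that page; a URL seen on multiple pages keeps
    its first (best) position."""
    positions: Dict[str, int] = {}
    for offset in range(0, depth, per_page):
        for i, url in enumerate(fetch_page(query, offset), start=1):
            positions.setdefault(url, offset + i)
    return positions

# Fake fetcher for illustration: 10 pages of 10 synthetic results.
fake = lambda q, off: [f"https://example.com/{off + i}" for i in range(10)]
ranks = track_top_100("blue widgets", fake)
print(ranks["https://example.com/45"])  # position 46
```

The merge step matters more than it looks: Google sometimes repeats results across adjacent pages, and without first-seen deduplication the stitched positions drift.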
Accuranker: Permanent 20-Position Cap
Status: They openly admit they cannot track beyond position 20.
Impact: Massive blind spot for long-tail strategies.
Alternative: You need manual checks or different tools.
Accuranker is out for anyone tracking competitive keywords outside the top 20. If your entire strategy lives on page one, you may be fine. But that is not how I work.
DemandSphere: Slow but Complete
Status: Restored the top 100 results eventually.
Performance: Significantly slower, frequent timeouts.
Reality: It works, but barely.
I tested DemandSphere after the change. The data comes back, but you will age waiting for it. For weekly check-ins, it may be acceptable. For anything real-time, forget it.
What This Means for Your SEO Work
This is not just a technical hiccup. Google made a strategic move to limit third-party access to search data, and more restrictions are likely to follow. Here is what actually matters.
Higher Costs Are Coming
Tools that maintain full tracking will pass increased costs to users. I would bet on 25-40% price increases over the next year, possibly sooner.
Plan your budgets accordingly.
Limited Visibility Destroys Long-Tail SEO
If you track keywords gradually moving from positions 50+ into higher visibility, you are now flying blind with many tools.
This hits hardest in competitive niches where the real gains lie outside the top 20. You know, actual SEO work instead of just watching branded keywords sit at position one.
Historical Data Is Permanently Broken
September 2025 creates a visible break in every trend graph. Client reports look inconsistent. You need to explain the gap every single time.
I added a standard note to all my reports:
“Google removed a key data access method in September 2025. The break in this graph is industry-wide and affects all third-party SEO tools. Our tracking methodology remains consistent before and after this date.”
Copy that if you want. Save yourself the repetitive explanations.
Tool Diversification Became Critical
Relying on a single rank tracker is now risky. I use multiple tools and cross-validate data to maintain accuracy.
This costs more. It takes more time. But it also means I won’t be blindsided when Google changes something again.
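Cross-validation can be as simple as comparing the same keyword's position across tools and flagging disagreements. A minimal sketch, with illustrative tool names and a made-up tolerance threshold (tune it to your own data):

```python
from typing import Dict, List, Optional

def cross_validate(readings: Dict[str, Optional[int]],
                   tolerance: int = 3) -> List[str]:
    """Compare one keyword's reported rank across tools.
    readings maps tool name -> reported position, with None
    for tools that are capped and cannot see the keyword."""
    seen = {tool: pos for tool, pos in readings.items() if pos is not None}
    flags: List[str] = []
    if len(seen) < 2:
        return ["insufficient data: fewer than two tools reporting"]
    spread = max(seen.values()) - min(seen.values())
    if spread > tolerance:
        flags.append(f"tools disagree by {spread} positions: {seen}")
    for tool, pos in readings.items():
        if pos is None:
            flags.append(f"{tool} is blind here (capped tracking)")
    return flags

# A keyword at ~position 45: Ahrefs can no longer see it at all.
flags = cross_validate({"seoClarity": 47, "STAT": 44, "Ahrefs": None})
```

A small spread across tools is normal (different data centers, personalization, crawl timing); a large one is the signal worth investigating.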
My Current SEO Tool Stack
Based on testing across 500+ client websites, here is what actually works.
Primary tracking:
- seoClarity for complete data, even though reports are slower.
- STAT for the pagination solution that works reliably.
Supplemental validation:
- SEMrush for a different data methodology, useful as a backup.
- Google Search Console for first-party data that Google cannot kill.
- Manual spot-checks for critical keywords, because sometimes you just need to open an incognito window and look.
Alternative solutions I keep around:
- Sistrix for European coverage and a different data collection method.
- SpyFu for historical data and competitive analysis.
I do not love paying for this many tools. But I love explaining data gaps to clients even less.
How I Handle This at Bright Vessel
We manage SEO programs for 100+ Endeavor Schools locations and other enterprise clients. When Google broke rank tracking, I had to move fast.
Multi-tool validation:
- Primary: seoClarity for complete datasets.
- Secondary: STAT for historical consistency.
- Validation: Google Search Console data as the truth layer.
- Manual: Critical keyword spot-checks when something looks weird.
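For the Search Console truth layer, the Search Analytics API (`searchanalytics.query`) takes a simple JSON request body. A sketch of building that body (authentication and the actual API client call are omitted; field names here are the real API fields):

```python
from datetime import date, timedelta

def gsc_query_body(days: int = 28, row_limit: int = 1000) -> dict:
    """Request body for the Search Console Search Analytics API.
    First-party data: no scraping and no position caps, though
    positions are period averages, not point-in-time snapshots."""
    end = date.today() - timedelta(days=3)   # GSC data lags a few days
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
    }
```

The trade-off is worth stating to clients: GSC positions are averaged impressions-weighted values, so they will never match a rank tracker exactly, but Google cannot take them away.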
Enhanced Google Tag Manager setup:
We strengthened our Google Tag Manager implementations with server-side tracking through Stape.io to capture more granular organic performance data. When third-party tools fail, first-party data becomes everything.
Client communication strategy:
- Document the September 2025 change in every report.
- Explain gaps in historical trend data upfront.
- Shift the focus to performance-based metrics such as traffic and conversions.
- Supplement rank data with Google Search Console insights that clients can verify themselves.
The goal is never to let a client think we broke something. Google broke it. We just adapted faster than most agencies.
Bottom Line
Google’s &num=100 removal is not just a technical change. It is a strategic move to limit third-party access to search data.
More restrictions are coming. Count on it.
What to do now:
- Diversify your rank tracking tools immediately.
- Stop relying on any single data source.
- Strengthen your use of Google Search Console and reporting.
- Prepare clients for higher tool costs over the next year.
- Focus on performance metrics beyond rankings, since rankings are just a proxy for traffic.
Tools that still work well:
- seoClarity, complete but slow.
- STAT, paginated solution works.
- Google Search Console, first-party data, Google cannot kill.
Tools with significant limitations:
- Ahrefs, top 20 only.
- Accuranker, top 20 only.
The SEO industry has survived worse Google changes. Agencies that adapt quickly and maintain measurement sophistication will come out ahead.
Those still relying on single tools and basic rank tracking will spend the next year explaining gaps to clients and losing accounts to agencies that saw this coming.
At Bright Vessel, we built redundancy into our measurement stack specifically for situations like this. When Google changes the rules without warning, having multiple data sources and validation processes keeps client reporting accurate and trustworthy.
And when the next change comes, because it will, we will already have the infrastructure to adapt.