
For years, SEOs and rank-tracking tools have relied on the &num=100 URL parameter in Google Search URLs to see 100 organic search results in one go instead of the default 10. It made gathering ranking data efficient, especially for those tracking lots of keywords. But in September 2025 that changed: Google started disabling or ignoring &num=100, and this quiet change is causing disruption across the industry.
What is &num=100?
The &num=100 parameter is a query-string parameter appended to a Google Search URL, for example:
https://www.google.com/search?q=seo+tools&num=100
This tells Google to try to display 100 organic search results on one page. Without it, you typically get 10 results per page, so reaching deeper results requires paginating through additional pages.
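For illustration, here is a minimal Python sketch of how a rank tracker might assemble such a URL. The build_search_url helper is hypothetical, not taken from any particular tool:

```python
from urllib.parse import urlencode

def build_search_url(query: str, num: int = 100) -> str:
    """Build a Google Search URL that asks for `num` organic results per page.
    As of September 2025, Google may ignore or disable the num parameter."""
    return f"https://www.google.com/search?{urlencode({'q': query, 'num': num})}"

print(build_search_url("seo tools"))
# -> https://www.google.com/search?q=seo+tools&num=100
```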
Why Google Might Remove or Block It
Possible reasons include:
- Reducing load on Google’s infrastructure from oversized result requests and scraping.
- Preventing bots or automation from scraping large SERP sets easily.
- Cleaning up impression data (bots fetching many results may have inflated impression counts).
- Ensuring more consistent and controlled access to SERP data.
What’s Changing: Observations So Far
- Starting around September 10–14, 2025, people noticed that appending &num=100 no longer works reliably; it often returns only the default (≈10) results.
- Behaviour varies by signed-in vs. signed-out state, browser, IP, and region; in many cases the parameter is ignored intermittently.
- Rank-tracking tools and SEO platforms are reporting that ranking data beyond the top 10 / first few pages is now harder or more expensive to collect.
- Google Search Console data (desktop impressions, average position) has shifted, possibly reflecting the drop in bot-driven SERP fetches that may have used &num=100.
SEO Tools: Affected vs Less Affected
Here are some of the tools known or reported to be affected, along with cases that are less clear or tools that may be adapting more easily. Not all vendors have made public statements; some of this comes from analyst and community posts.
| Tool / Provider | Status / Reported Impact | Notes & Workarounds |
| --- | --- | --- |
| Semrush | Affected | Semrush confirmed a “significant operational impact” and is adjusting to a new request structure now that &num=100 no longer works reliably. |
| AccuRanker | Affected | AccuRanker has said publicly that the &num=100 parameter now works roughly 50% of the time, so inconsistent behaviour is disrupting its tracking. |
| Keyword Insights | Affected | They explicitly stated that their rankings module is impacted: instead of retrieving 100 results with one request, they now need ~10 requests. |
| Other rank-tracking / SERP API providers (general) | Affected | Many providers, large and small, are reporting higher costs, tracking delays, or missing data. Community posts (on Reddit, etc.) list tools like KeySearch as having delays/backlogs. |
| Ahrefs, Moz, etc. | Unclear / likely affected | These are large tools with more resources, and I haven’t found a clear public statement from some of them yet. Given their reliance on SERP data for deep-ranking and long-tail tracking, it’s highly probable they are affected in some way. Community chatter suggests less visibility for deeper ranks, but Ahrefs hasn’t been publicly described as badly “broken” so far. |
| Search Explorer / tools focused only on top 10 ranks | Less affected | If a tool primarily monitors top-10 / first-page keywords, the impact is milder: that data is still fetched reliably and is where most traffic and interest lies. The extra cost and delay mostly hit tools pulling positions 11-100. |
| Custom scrapers / internal tools | Varies | Organizations that built their own systems are having to adapt with pagination, more frequent requests, and perhaps more proxy/IP management (see the sketch below this table). Some will feel the change more acutely than vendors operating at scale. |
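As a rough illustration of that kind of adaptation, here is a hedged Python sketch of a detect-and-fall-back approach: try a single &num=100 request, and if Google ignores the parameter, walk the SERP 10 results at a time using the start offset. The fetch_serp(url) callable is a hypothetical stand-in for whatever HTTP and HTML-parsing layer a scraper already has; none of this is taken from a specific tool.

```python
from urllib.parse import urlencode

SEARCH = "https://www.google.com/search"

def fetch_organic(query, fetch_serp, depth=100):
    """Collect up to `depth` organic results for `query`.

    `fetch_serp(url)` is a hypothetical stand-in for an existing HTTP +
    HTML-parsing layer; it is assumed to return a list of organic results
    for one SERP page.
    """
    # Try the old single-request approach first.
    results = list(fetch_serp(f"{SEARCH}?{urlencode({'q': query, 'num': depth})}"))
    if len(results) > 10:
        return results[:depth]  # &num=100 was honoured on this request

    # Parameter ignored: walk the SERP with start= offsets, 10 results per page.
    for start in range(10, depth, 10):
        page = fetch_serp(
            f"{SEARCH}?{urlencode({'q': query, 'num': 10, 'start': start})}"
        )
        if not page:
            break  # no further results available
        results.extend(page)
    return results[:depth]
```

For deep keyword sets this roughly multiplies request volume by ten, which is where the cost, delay, and proxy-management pressure described above comes from.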
Implications for Different Kinds of Users
- Agencies / Enterprises tracking many keywords may see big cost and time increases, especially for deeper keyword sets.
- Smaller businesses / tools may decide to track only the top 10-20 positions to limit cost.
- Analysts & SEOs measuring long-tail or detecting pages “on the verge” (positions 50-100) might lose early warning signals.
- Client-facing reporting may show shifts in metrics like “average position” that look like improvements, because dropping low-ranked data points pulls the average toward the top (for example, a page recorded at positions 8 and 70 averages 39, but only 8 once the deep result is no longer captured), even though real performance hasn’t changed.
Suggested Adjustments & What Tools Are Doing
- Many tools are implementing pagination: fetching results in batches of 10 instead of requesting 100 in one call.
- Some are increasing pricing (or plan limits) or limiting how many keywords can be tracked deeply by default.
- Others are optimizing infrastructure: caching results, fetching deep SERPs less often, or sampling only a subset of keywords for deep tracking (a rough sketch of one scheduling approach follows this list).
- Many are also monitoring Search Console trends to reset baselines, since impression and position metrics may have shifted with this change.
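Here is a small sketch of that sampling idea, an assumed approach rather than any vendor’s documented method: deep top-100 crawls are rotated across the week, so each keyword still gets a deep data point periodically while daily request volume stays close to top-10-only tracking. The deep_crawl_today helper is hypothetical.

```python
import hashlib
from datetime import date

def deep_crawl_today(keyword: str, cycle_days: int = 7) -> bool:
    """Return True if `keyword` should get a full top-100 crawl today.

    A deterministic hash spreads keywords evenly across a `cycle_days`
    rotation, so each keyword gets one deep data point per cycle while
    the other days only need a cheap top-10 check.
    """
    bucket = int(hashlib.sha1(keyword.encode("utf-8")).hexdigest(), 16) % cycle_days
    return bucket == date.today().toordinal() % cycle_days

keywords = ["seo tools", "rank tracker", "keyword research"]
todays_deep_set = [k for k in keywords if deep_crawl_today(k)]
print(todays_deep_set)
```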
The removal or disabling of the &num=100 parameter is not just a minor annoyance — it changes core assumptions many SEO tools have relied on for years. Some tools are more affected than others, especially those tracking beyond page 1. Everyone in the SEO space will need to adapt: change how they collect data, what they report, and how they budget for infrastructure.