Google Silently Ends Search Option: What It Means for SEO

Google quietly removed the longstanding option to show 100 results on a single search page in mid-September 2025. 

SEO experts first spotted the change around 10–12 September: the old &num=100 URL parameter that forced Google to list 100 results per page simply stopped working. 

Instead, Google only returns the usual first 10 or 20 links per page. 

The extra pages beyond the first 10–20 have effectively vanished. 

This subtle change has big ripple effects – from how analytics count impressions to how SEO tools gather ranking data.

What Changed in Google SERPs

The change means Google no longer supports listing 100 results at once. 

Attempting to force 100 results now fails in most cases, with only one or two pages (10–20 results) returned.
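
For context, the trick worked by appending the parameter to an ordinary search URL, roughly like this (the keyword below is just a placeholder):

https://www.google.com/search?q=example+keyword&num=100

That same URL now returns only the standard first page of results, as if the parameter were not there.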

Early reports suggested the behaviour might be sporadic or an experiment, but it quickly proved permanent. 

By mid-September the parameter had stopped working entirely, breaking many rank-tracking tools. 

Google itself later confirmed that using this URL trick was never officially supported. 

So although the change was quiet, it aligns with Google’s policy against scraping.

The company appears to be limiting how much SERP data automated tools can pull at once.

There was no official Google announcement; the change was discovered by SEO practitioners. 

Within days, the workaround had disappeared entirely. 

Google has quietly reverted to its default 10 results per page (or 20 on some screens), and the hacks to pull more pages at once have gone away.

Why It Matters for SEO and Marketing

This behind-the-scenes tweak reshapes some key SEO metrics and tools. 

Google is bringing SERP data closer to what a real user sees (the first page or two) and cutting off the ‘deep’ result scraping that bots often performed. 

The immediate implications include:

Huge Drop in Reported Impressions

Many SEO dashboards and Google Search Console (GSC) reports immediately showed steep falls in total impressions. 

Sites suddenly appeared to lose traffic reach, even though nothing actually changed on the site itself. 

These missing impressions were mostly from results on pages 2–10 that normal users rarely scroll to see.

Loss of Tracked Keywords

With fewer results being logged, most sites also saw many of their tracked keywords disappear. 

Keywords that ranked only on deep pages (beyond the top 10 or 20) were no longer counted.

Apparent Rank Boosts

Because those low-ranking listings are gone, the remaining data looks ‘higher’ in rank. 

Average positions climbed sharply for many sites. 

Keywords that used to show up on page 3+ of the SERPs now either don’t appear in the data at all or report average positions that sit within pages 1–3. 

In reality, the sites didn’t suddenly improve; the phantom results were simply removed.

‘Inflated’ Data Corrected

Previously, some of those impressions and clicks logged by GSC and tools were actually from bots or automated scraping sitting deep in the results. 

Many now call this phenomenon the ‘Great Decoupling’ – where impressions rose without matching clicks. 

It turns out much of that was likely due to scrapers using the &num=100 trick. 

Now that it’s gone, the extra bot-driven impressions have disappeared, leaving a more accurate picture of true user interest.

How SEO Tools and Providers Are Coping

The SEO tool industry has quickly adapted. 

Major platforms acknowledged the issue and updated their guidance. 

Semrush, for example, has reassured users that its core visibility metrics (based on top-10 or top-20 positions) remain reliable. 

In practice, Semrush and others continue to track beyond position 20 when they can, but they’re warning customers not to expect perfect data from deep pages. 

Other platforms have asked whether data beyond rank 20 is truly needed, hinting they may also prioritise shallower tracking.

Rank trackers such as AccuRanker also saw disruptions. 

In the days after the change, some users reported rank-tracking failures or captcha issues when trying to scrape with the old parameter. 

Without &num=100, gathering the same amount of data now requires ten times as many requests (since you get only 10 results per request instead of 100). 
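
As a rough sketch of that arithmetic (illustrative only, assuming a tracker that pages through results using Google’s standard start offset parameter):

from urllib.parse import quote_plus

keyword = quote_plus("example keyword")  # hypothetical tracked keyword

# Previously: a single request could cover roughly positions 1-100 (no longer honoured).
old_requests = [f"https://www.google.com/search?q={keyword}&num=100"]

# Now: covering the same depth takes ten separate 10-result pages via the start offset.
new_requests = [
    f"https://www.google.com/search?q={keyword}&start={offset}"
    for offset in range(0, 100, 10)
]

print(len(old_requests), "request before vs", len(new_requests), "requests after")  # 1 vs 10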

This raises server load and could increase costs for tool vendors. 

If Google keeps the change, experts say tools may need to either raise prices or limit their reporting to the top 20 positions.

It’s worth noting Google’s position: the company confirms that the old parameter is not officially supported. 

Google has not explicitly said whether the removal is a permanent policy or a test, but job postings for ‘Anti-scraper’ roles hint at a long-term strategy to limit scraping. 

In any case, tools have no choice but to adapt.

Recommendations for SEO Professionals

For marketers and SEO teams, the best approach is to accept this as the new baseline and adjust processes accordingly. 

Key actions include:

Audit Your Tools

Check which of your SEO tools or scripts rely on the &num=100 parameter. 

If they were fetching 100 results at once, ask the vendor how they’re handling the change. 

Many tools are switching to multiple 10-result requests or using APIs. 

Make sure any custom rank-tracking scripts paginate properly or use Google-approved services.

Re-baseline Your Metrics

Treat data from mid-September onward as a reset. 

Post-change reports will naturally show lower impressions and fewer rankings. 

Compare week-over-week trends rather than measuring against pre-change historical data. 

A drop in volume is expected.

It usually means the ‘noise’ of deep-page counts is gone. 

A site’s true performance likely hasn’t declined; the numbers are just more conservative.

Focus on Meaningful KPIs

With impressions and average positions shifting up, rely more on actionable metrics like clicks, click-through rate, and conversions. 

An improved average position after September 2025 doesn’t necessarily mean your site got better; it often just means fewer low-ranked links are being counted. 

Emphasise user-centric outcomes (traffic and engagement) rather than raw impression counts.
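
As a simple illustration, the numbers below are hypothetical, purely to show how the same site can look ‘better’ on paper without any real change:

# Hypothetical weekly Search Console totals for one site, before and after the change.
before = {"impressions": 120_000, "clicks": 1_800, "avg_position": 28.4}
after = {"impressions": 45_000, "clicks": 1_750, "avg_position": 9.7}

ctr_before = before["clicks"] / before["impressions"]  # ~1.5%
ctr_after = after["clicks"] / after["impressions"]     # ~3.9%

print(f"CTR: {ctr_before:.1%} -> {ctr_after:.1%}")
print(f"Average position: {before['avg_position']} -> {after['avg_position']}")
# Clicks are essentially flat, so real traffic hasn't changed; only the impression
# denominator (and with it the average position) has shifted.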

Consider Your Data Sources

If you have been analysing long-tail keywords from page 5 or beyond, be aware that those terms have now vanished from the data. 

You may need to update dashboards or reports to exclude those deep terms. 

Also consider relying more on Google’s official tools (like the Search Console API) where possible, since these will naturally reflect the new 10–20 results standard.
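
For teams comfortable with a little scripting, here is a minimal sketch of pulling query-level data from the Search Console API (it assumes a Google Cloud service account with access to the property; the site URL, dates, and file name are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder credentials file
)
service = build("searchconsole", "v1", credentials=creds)

# Query-level clicks, impressions, CTR and average position for a post-change window.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2025-09-15",
        "endDate": "2025-09-30",
        "dimensions": ["query"],
        "rowLimit": 250,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))

Because this data comes straight from Google’s own reporting, it already reflects whatever result depth Google chooses to count.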

The Bigger Picture

It’s important to keep perspective. 

Google is known to quietly tweak how it delivers results.

This is just the latest example. 

For years, many marketers have used tricks like &num=100 to scrape more results, often inflating their metrics. 

Now Google is effectively saying, ‘we were never officially fine with that’. 

The result is a one-time shock to reported data, but a potential benefit in the long term.

Search analytics will better reflect real user behaviour.

As Google’s own guidelines have long stated, automated scraping and abuse of SERP parameters are discouraged. 

This change might even improve the reliability of Google’s own Search Console data by eliminating bot noise. 

In any case, SEO and marketing teams should adapt strategies to the new normal.

Focus on the top of the rankings, double-check your tools’ methods, and interpret the data with this context in mind.

For more information on this, or help with your SEO needs, get in contact with us here at Neon Atlas today.
