25 Billion webpages!

That’s a huge figure, but whose is it? Google’s. Yes, again: for a couple of months now the major player has been testing new SERPs with its three-part update named Jagger, and just as everything seemed to slow down and return to normal, here they go with a new set of tests.

This time we were unable to get any feedback from Google engineer Matt Cutts, but there is definitely something going on, and it has to do with Google’s index size among other factors.
A few months ago, Yahoo! Search reported indexing up to 20 billion web pages, an incredible increase that raised doubts about its genuineness within the internet community, since webmasters were reporting inflated page counts for their websites. Still, it was a nice try, and it surely spurred Google’s most talented personnel to come up with a solution and beat the big Y!

While testing selected datacenters for upcoming algorithmic changes and, obviously, bug fixes (canonical issues, supplemental results, and more), we were astonished to find that the current test index returns over 25,000,000,000 indexed pages!
Considering that Google currently indexes (only) about 9,500,000,000 pages, that is roughly a 160% increase, about 2.6 times the current index, and the Mountain View, CA based firm would be triumphant again by the end of this year on one of the metrics used to compare search engines against competitors Yahoo! and MSN.
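As a quick sanity check on that arithmetic, here is a minimal sketch using the two figures quoted above (the numbers themselves are the article’s, not official counts):

```python
# Sanity check on the index-size comparison quoted above.
current_index = 9_500_000_000   # pages Google currently reports indexing
test_index = 25_000_000_000     # pages returned by the test datacenters

increase_pct = (test_index - current_index) / current_index * 100
growth_factor = test_index / current_index

print(f"Increase: {increase_pct:.0f}%")        # ~163%
print(f"Growth factor: {growth_factor:.1f}x")  # ~2.6x
```

So the jump is closer to a 160% increase than the 250% a casual reading might suggest; 250% would only be right if one (incorrectly) treated the new total as a percentage of the old one.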
Test SERPs can currently be viewed on the following datacenters (and sometimes appear on the default Google.com):

Now we can see different results, but how do we check the index size?

Very simple: go to Google.com or any of the above datacenters by entering the IP address in your browser, then run the following query:


Yes, that’s right: at the top right of the results page you should see a number representing the total (or close to it) of web pages currently indexed by Google.
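If you want to automate that check, the sketch below extracts the total from the “Results 1 - 10 of about N” line Google prints on its results pages. The markup pattern and the sample fragment are assumptions based on how SERPs of this era looked, not an official format, so treat the regex as something to adjust against the actual page source:

```python
import re

def parse_result_count(html: str) -> int:
    """Extract the 'of about N' total from a Google results page.

    The '<b>...</b>' pattern is an assumption based on how the
    result-count line appeared on SERPs at the time.
    """
    match = re.search(r"of about <b>([\d,]+)</b>", html)
    if match is None:
        raise ValueError("result count not found in page")
    return int(match.group(1).replace(",", ""))

# Hypothetical fragment of a results page, for illustration only:
sample = "Results <b>1</b> - <b>10</b> of about <b>25,260,000,000</b> pages"
print(parse_result_count(sample))  # 25260000000
```

Running the same parse against each datacenter IP would let you compare their reported index sizes side by side.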

Obviously, a larger index does not mean higher relevancy for the average searcher; however, the SERPs on these test datacenters are fairly relevant, and we have noticed encouraging improvements for websites that suffered from supplemental issues (issues which any good SEO consultant would have prevented, to the best of their knowledge, of course).

Another question one could raise is simply: are there really that many pages with unique content on the web? Suppose that Google, instead of just fixing or removing the supplemental results it introduced a while ago, adjusted its algorithm and kept them as a historical database of modified and deleted webpages and websites. The search engine is famous for its so-called “long memory,” and it may be about to confirm it once again.
