Looking back at on-page strategies, there has always been debate about the size of a website and its effect on rankings. We have seen algorithm changes pertaining to thin content and duplicate content as well. We noticed that sometimes, when launching a client’s new site, the number of pages the site had influenced its ranking.
One example of this was a local electrician. Most of his competitors’ websites were pretty old, with only a few pages – thin overall. We created his site with 8 pages of strong content (his wife is an English teacher). The site had information on a specific water heater that they sold. She made 1,000 pages of content as well as a post. The site was launched on a new URL and, in about 2 weeks, was on page 1 for the area, as well as for the brand of water heater.
In this test, we check whether a site with more indexed pages ranks better.
Three sites were set up – a 10-page control site, a 5-page test site, and a 100-page test site.
Each site had 1 page that was exactly the same, optimized for the target keyword.
A sitemap was created for all three sites.
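Generating a sitemap for a small test site like these is a short scripting job. Below is a minimal sketch using only Python’s standard library; the domain and page slugs are hypothetical, since the actual test URLs weren’t disclosed.

```python
# Minimal sitemap generator for a test site.
# The base URL and slugs below are hypothetical examples.
from xml.etree import ElementTree as ET

def build_sitemap(base_url, slugs):
    """Return sitemap XML (as a string) with one <url> entry per page slug."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for slug in slugs:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = f"{base_url}/{slug}"
    return ET.tostring(urlset, encoding="unicode")

# Example: a 5-page test site.
xml = build_sitemap("https://example-test-site.com",
                    [f"page-{i}" for i in range(1, 6)])
print(xml)
```

The same function covers the 10-page and 100-page sites by passing a longer slug list; the output just gets saved as `sitemap.xml` at the site root and submitted in Search Console.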
The control site was indexed first and only indexed 3 of the 10 pages during the 7-day span of this test, but remained in the #1 position.
The 100-page test site started out in the #3 position with 5 pages indexed, but moved to #2 once more pages were indexed – more, at that point, than the 5-page test site had. The number of pages Google indexed fluctuated as well. By the last day, 14 pages were indexed; we had submitted 15 to Google’s URL submitter.
The 5-page test site seemed to index all pages from the start. We only submitted the root URL in the beginning.
So, although the 10-page control site stayed in the #1 spot, the 100-page site, with more pages indexed, did outrank the smaller site.
Based on this test, it’s better to go big. Not all of the pages will be indexed at the rate you might want, but in the end, more pages are going to be better.
More pages = better ranking.
In this video, Clint discusses this test and its results.
Test 74 – Does The Amount Of Pages Indexed Affect Rankings?
This is a little test to see if larger sites outrank smaller sites. I would venture to say this was probably tested in the blackhat world, and that’s one of the reasons the concept of mass pages came out. It is also why everyone says – oh, we should do more pages, you need more content, you need more, you need more, you need more in order to rank in Google.
And my counterargument to that has always been: what if your growth rate exceeds your promotion rate? You’re writing all of this great and wonderful stuff, but no one knows about it. If no one knows about it, no one’s building links to it; and if no one’s building links to it, it’s harder for Google to find it; and if it’s harder for Google to find it, it’s less likely to rank.
Unless you’re looking at using a mass page site to, say, target or identify easy-to-rank-for keywords. In which case, that’s a whole other can of worms that results in things like blackhat mass page site builds – and I say blackhat because they’re against Terms of Service, not because they’re illegal or hacking. As a matter of fact, they’re not illegal and they’re not hacking, but they are against Google’s Terms of Service, and pretty much always, once identified, they will be marked as pure spam and taken out of the index.
Neither here nor there – they still work, and they’re still good testing tools, or poking tools, to help with your keyword research. The same goes for the old method of YouTube videos, particularly YouTube Live, and why a lot of that YouTube Live software came out, and why YouTube Live no longer has the power it used to have – because that software allowed people to upload 3, 4, 5, 1,000 videos at a time and “poke for easy rankings”. So this test kind of goes along with that, insomuch as: does a 100-page website outrank a 5-page website, or a 10-page website for that matter?
In this case, there were 3 sites used: one at 10 pages, one at 5 pages, and one at 100 pages. On each one of those sites, there was a page that was exactly the same – optimized for the test keyword. So there was one such page on each of these sites – the 5, the 10 and the 100. And the page on the site that had 100 pages won the day. Here’s where it gets a little bit tricky. What you’re looking at on screen is the write-up at the time. What is missing from it? Can anyone tell me what is missing from it? The key to the question is right here: each site has one page that is exactly the same. So if you’re thinking about “duplicate content” – more to the point, syndicated content – which site, or which page, got indexed first for that term? All of these are the exact same, and Google’s algorithm is taught to find duplicate content. If they’re all the exact same, did this site become the one that ranked first because it was the first one indexed? Were these set with a canonical? That’s not there. What order were they indexed in? That’s not there. And is that order of indexing reflected in the ranking? That’s not there.
So it’s a little bit hard to say that a 5-page website is outranked by a 100-page website purely based on its page count. This is something that definitely needs to be retested. It needs to be done so that every site having the exact same page is no longer the case. So we eliminate that, we set the canonicals, and we find a way to submit all 3 of those test pages – the page that we’re trying to rank on each site – to the index at the exact same time, using the exact same tools. Then we monitor: when do they get indexed, when are they ranked, and does their order of rank change based on when they were indexed? Then we’ll have a clearer answer to the idea of a 100-page site outranking a 5-page site.
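For reference, setting a canonical as described above is a one-line tag in each page’s `<head>`. This is a minimal sketch; the URL shown is hypothetical, since the actual test domains weren’t disclosed.

```html
<!-- Placed in the <head> of each duplicate test page, pointing at whichever
     copy should be treated as the original. The URL here is a hypothetical
     example, not one of the actual test sites. -->
<link rel="canonical" href="https://example-test-site.com/test-keyword-page/" />
```

With the canonical set explicitly, indexation order stops deciding which copy Google treats as the original, which removes the confound the retest is meant to eliminate.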