When pages go supplemental (into a search engine’s secondary index), all of the ranking factors associated with their on-page elements are nullified.
Using SEO to prevent your pages from going into the supplemental index requires that each page has something significant enough to distinguish it from other pages within your website.
Consider that storage costs money: cloud data (pulling segments of data from multiple computers) means thousands of computers consolidated into massive data centers, working around the clock to deliver relevant search results.
Search engines use both the reputation and the uniqueness of the content to assign a grade at the page level. In other words, the same old “cookie cutter” page is not enough to impress search engines.
Add the costs of electricity and server maintenance to the equation and, unless your content (1) makes the grade or (2) receives enough search volume to signify a degree of popularity, chances are it will go supplemental and reside in the secondary index (where only a long-tail variant will evoke its presence).
Why should you care?
If rankings are a concern, remember that they are earned page by page, not site-wide; and not every page is capable of achieving first-page search engine rankings on its own (without help from a second-level push).
This second level push implies unleashing the power of collective pages unified with intent to produce significant ranking factors which you can then channel to the appropriate area of your website.
You can consolidate themed pages, site architecture, titles and naming conventions, or virtual theming (a technique involving layers of keyword-laden internal links) to create viable shifts in rankings for specific pages and specific keywords, provided you have a tangible goal.
For example:
1st level
Navigation – using broad “more competitive” keywords in the primary navigation aids the process of creating enough term frequency to anchor the topical theme. The more a keyword appears across multiple pages, the more that keyword is considered crucial to the theme of the site.
In other words, you should chip away at the most competitive keywords by using them as keyword-rich naming conventions when possible. So, instead of “services”, you might use “[your main keyword] services” instead.
Do this across enough of the metrics that search engines use to assess relevance, and the collective signals will produce a higher degree of on-page continuity and boost your relevance score.
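As a sketch, a keyword-rich primary navigation might look like the following (the keyword “widget” and the URLs here are hypothetical placeholders for your own main keyword):

```html
<!-- Hypothetical primary navigation: each label carries the main keyword
     ("widget") instead of a generic term like "Services" or "Parts" -->
<ul id="primary-nav">
  <li><a href="/widget-services/">Widget Services</a></li>
  <li><a href="/widget-repair/">Widget Repair</a></li>
  <li><a href="/widget-parts/">Widget Parts</a></li>
</ul>
```

Because this navigation repeats on every page, each label contributes to the site-wide term frequency for that keyword.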
Secondary navigation – secondary navigation can funnel link flow to subfolders or hub pages, acting as a tiered structure that only links to relevant subpages based on where you are in the website.
For example, if I have 10 subfolders off the root folder targeting two-word competitive keywords (www.mywebsite.com/main-keyword1/), then the way you link back to the root folder will sculpt how that root folder is interpreted by search engines (since pages in the root, or one tier away, typically carry more weight for rankings).
By using a custom sidebar or block quotes (with links to relevant pages in that subfolder), you can use links from secondary navigation (like the buddy system) to elect and elevate critical landing pages (essentially making them as strong as your home page).
Each page in the first level should have a sitemap for that folder (so spiders don’t have to rely on primary navigation). You can also link to these sitemaps contextually within the document or, in a worst-case scenario, from the footer, to unify the collective power of that subfolder and its designated purpose in the SERPs (search engine results pages).
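A minimal sketch of such a section-specific sidebar, assuming a hypothetical /main-keyword1/ subfolder with a few subpages and its own sitemap page:

```html
<!-- Hypothetical secondary navigation for the /main-keyword1/ subfolder:
     it links only to pages within this section, plus the folder's own
     sitemap page, so link flow stays pooled inside the subfolder -->
<ul id="secondary-nav">
  <li><a href="/main-keyword1/sub-page-a/">Main Keyword1 Sub Page A</a></li>
  <li><a href="/main-keyword1/sub-page-b/">Main Keyword1 Sub Page B</a></li>
  <li><a href="/main-keyword1/sitemap.html">Main Keyword1 Sitemap</a></li>
</ul>
```

Every page inside the subfolder would carry this block, so the folder sitemap receives a link from each of them.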
2nd level
Breadcrumbs – breadcrumb navigation leaves a trail that signifies how far you are from the root. This is a great way to (a) aid navigation, so a visitor knows their bearings, and (b) help search engines catalog your pages in a systematic fashion.
Home > Main Keyword Subfolder1 > Product Page1
*An example of breadcrumb navigation.
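That trail could be marked up as keyword-rich internal links along these lines (a sketch with hypothetical URLs; only the current page is left unlinked):

```html
<!-- Breadcrumb trail: each step is an internal link whose anchor text
     reinforces the keyword of the page it points to -->
<p class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/main-keyword-subfolder1/">Main Keyword Subfolder1</a> &gt;
  Product Page1
</p>
```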
Any text used in an anchor (as a link) is a clear signal to search engines that for more information on this topic, “follow the link.” So, use consistency to reinforce more competitive keywords (by linking to a page with the keywords you want it to rank for).
Areas that have a tendency toward duplicate content (like a recurring sidebar, main navigation, etc.) do add a significant ranking factor, but they are weighted less than keyword-rich anchor text higher up in the body text, contextually surrounded by other related text about the topic contained in the link.
Search engines use proximity to other modifiers and various other metrics to interpret which pages, products or links are the most important, but by theming your links you give them a first impression (on page) that eventually aids in creating the appropriate synergy for off-page ranking factors.
Alt Attributes – images that occur throughout a website provide an opportunity to reinforce the generic (top-level) navigation. A linked image functions as a link, so images can be tactfully woven into a page without throwing off its topic, and the alt (alternative text) attribute is treated as anchor text.
If you have a shopping cart with 3,000 items and you are not using your alt attributes to consolidate ranking factors, you are omitting the opportunity to add a vital layer to your SEO.
Each linked image with an alt attribute is a link; if you use different modifiers in those alt attributes, you can create term weights that support anchor text, navigation, sitemaps and inbound links, sending a clear signal of that page’s ranking intent.
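A minimal sketch of this pattern, assuming a hypothetical product page and image (the path, filename and “precision widget gears” modifier are placeholders):

```html
<!-- A linked product image: because the image sits inside an anchor,
     its alt text acts like keyword-rich anchor text for the target page -->
<a href="/main-keyword1/widget-gears/">
  <img src="/images/widget-gears.jpg" alt="precision widget gears">
</a>
```

Varying the alt text across a catalog (different modifiers per product) builds up the term weights described above instead of repeating one phrase thousands of times.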
You can achieve a ranking for a page (depending on the authority of the site) from a title tag, meta tag or alt attribute alone. Each metric aids the next, so, use them wisely to consolidate a reputable and relevant internal site architecture.
Combine them with other SEO techniques (like sitemaps, navigation, contextual linking, RSS feeds to build backlinks, etc.) and you can push aside competitors in the SERPs who omitted adding layers to their SEO.
With over 200 metrics being assessed and cross-referenced, each layer you fine-tune can push your page or site over the tipping point when search engines tally the score.
3rd level
The first level (navigation and site architecture) and second-level tiered approach (breadcrumbs, alt attributes, sitemaps) are usually enough when coupled with proper titles, meta data and internal links. However, sometimes you need more to make an impact.
The third level push
Deep link percentages – deep links are links that point to a specific interior page (rather than the home page) and are the most significant way to alleviate the tendency for pages to go supplemental.
If a page receives more than 5 deep links from external websites, then it is typically considered “important enough” to appear in the top 1000 results as a contender for the context of the on page factors (what you say the page is about) and the off page factor (what others that link to you say about that page).
The first step is to get a page on the grid (in the top 1000), from there to break the top 100, and once it appears there, you can fine-tune the page (by adding content, refining the contextual linking from other pages, tweaking the titles or building additional inbound links) until it is strong enough to topple the keyword in question.
An extremely competitive term may require thousands of deep links to rank. Each page has its own unique threshold depending on the industry, the size of the index (of that keyword or related keywords) as well as what the competition is like for the top 1000 results.
Deep linking is the method of choice for larger websites that tend to exceed the ability to push link flow deeper into the site from the root. How you manage the percentages of inbound links to specific pages determines their pecking order in the hierarchy.
If you cannot get links to a page, then build links to your links or use internal links to funnel link flow to the sitemap in the subfolder (to feed all of the pages simultaneously).
This alone can give them enough “popularity” to keep them out of the supplemental pages in search engines and thereby keep all of the SEO ranking factors present on those pages working on your behalf.
Does Google’s supplemental index still exist? They used to label anything that went into the supplemental index to indicate in the search results that it was in there, but they haven’t done this for a long time now. Hasn’t the separate supplemental index been scrapped and rolled into the main index?