When it comes to SEO and search engines, cache is king. One of the simplest metrics for measuring the wealth and authority of a web property is how many pages are indexed and whether preferred landing pages are established.
However, before visitors can arrive, search engines need to find, spider, index and rank your pages for relevance. This means you should take every opportunity to increase indexation through creating viable link structures based on a hierarchy of importance for keywords or landing pages.
This is why it is important to match what people think is important with what search engines see as important. That alignment is accomplished through a structured approach to navigation and internal linking.
You can have thousands of pages in a website, but if they are not connected by a common thread or are not linked properly, search engines may never discover them.
Using the metric of deep links, approach SEO from the premise of inbound links arriving at a page and outbound links leaving it: what remains in between determines how much vital ranking equity that page has to pass along to other pages in the website.
Don’t squander links; make each link count by carefully mapping out a hierarchy of keywords you intend to create SERP positioning for, then performing an internal site audit to identify pages that could leverage more internal link flow for a preferred landing page.
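As a rough illustration of that kind of audit, here is a minimal Python sketch (the URLs and link graph are entirely hypothetical) that counts how many internal links each page receives, so you can spot the pages starving for link flow:

```python
from collections import Counter

# Hypothetical internal link graph: page -> pages it links to.
internal_links = {
    "/": ["/products", "/blog", "/about"],
    "/products": ["/products/widget-a", "/products/widget-b"],
    "/blog": ["/products/widget-a", "/blog/post-1"],
    "/blog/post-1": ["/products/widget-a"],
    "/about": ["/"],
    "/products/widget-a": [],
    "/products/widget-b": [],
}

# Count inbound internal links for every page in the site.
inbound = Counter(target for targets in internal_links.values()
                  for target in targets)

# Pages with few inbound internal links are candidates for more link flow.
for page in internal_links:
    print(page, inbound.get(page, 0))
```

In this toy graph, widget-a collects three internal links while widget-b gets only one, which tells you at a glance where the internal linking effort has (and has not) gone.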
For example, if I have a website based on a particular product or niche and the content on the pages is a little light, then I could always add a blog and leverage the blog by creating content-rich pages to facilitate internal links to the languishing pages suffering from link attrition.
We often use a 65% / 35% ratio for deep linking within a site to promote the preferred page hierarchy within a web property; however, what ratio you use is up to you. This means that 65% of the links from other websites should go to the homepage, so the homepage can act as a “catch all” to funnel link flow back to other vital areas of the website.
The remaining 35% should be deep links (links to pages other than the homepage) targeting specific pages with an array of anchor text (based on the ideal keywords those pages are to appear for).
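To make the math concrete, here is a quick Python sketch with a made-up backlink profile showing how the 65/35 split works out:

```python
# Hypothetical inbound (external) links: each entry is the page it points to.
backlinks = ["/"] * 65 + ["/products/widget-a"] * 20 + ["/blog/post-1"] * 15

homepage_links = backlinks.count("/")          # links aimed at the homepage
deep_links = len(backlinks) - homepage_links   # links aimed at inner pages

print(f"homepage: {homepage_links / len(backlinks):.0%}")  # homepage: 65%
print(f"deep:     {deep_links / len(backlinks):.0%}")      # deep:     35%
```

Run the same calculation against your real backlink data to see how far your current profile drifts from whatever ratio you decide to target.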
We refer to this as “keyword clusters” which are semantically aligned keywords based on a root phrase that have synonymous or semantic shingles (groups of words) that overlap (based on the root keyword or phrase).
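The shingle idea is easy to sketch in code. Assuming simple two-word shingles and a hypothetical root phrase, candidate keywords whose shingles overlap the root's form one cluster:

```python
def shingles(phrase, n=2):
    """Return the set of n-word shingles (contiguous word groups) in a phrase."""
    words = phrase.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

root = "organic dog food"  # hypothetical root keyword
candidates = ["best organic dog food", "organic dog food reviews",
              "cheap cat toys"]

# Keywords whose shingles overlap the root phrase belong to the cluster.
cluster = [kw for kw in candidates if shingles(kw) & shingles(root)]
print(cluster)  # ['best organic dog food', 'organic dog food reviews']
```

Real keyword clustering tools weigh semantics as well as literal overlap, but even this crude overlap test is enough to group phrase variants around a root keyword.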
Once you map out the ranking objective, you can determine the needs of a page and calculate the thresholds needed to produce buoyancy. The selected page must exhibit the proper proportion of internal links from other pages in the site to communicate to search engines that this page is important.
Without Googlebot, Slurp and Bing’s web crawler spidering your content, rankings are a moot point, unless you have other ways to promote those pages (such as using other websites with high traffic, rankings or authority).
The ratio of internal linking is a preference which depends on three things:
- Whether all of your pages are intended to be indexed.
- Whether you have established a primary and secondary landing page for your targeted keywords.
- How many pages you are willing to create, edit or optimize to produce the most conducive internal link ratio.
Going back to SEO basics, if I have a 500-page website and my main keyword is competitive, chances are there will be overlapping shingles (groups of words) representing link opportunities on 75% of the pages within the site.
This means I can carve out some internal link leverage to point at the page of choice, then carefully select how many outbound links that page has and how to spend that link equity on other pages, which in turn can rank additional pages, and so on.
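Here is one way that keyword-occurrence scan might look in Python; the page copy, keyword and preferred URL below are all hypothetical:

```python
# Hypothetical page copy keyed by URL; values are body text.
pages = {
    "/blog/post-1": "Our organic dog food guide covers every brand we tested.",
    "/blog/post-2": "Cat toys are a different market entirely.",
    "/about": "We started selling organic dog food in 2004.",
}

target_keyword = "organic dog food"
preferred_page = "/products/organic-dog-food"  # the page we want to rank

# Every page that mentions the keyword (and isn't the target itself)
# is an internal link opportunity pointing at the preferred page.
opportunities = [url for url, text in pages.items()
                 if target_keyword in text.lower() and url != preferred_page]
print(opportunities)  # ['/blog/post-1', '/about']
```

Each URL in the result is a spot where the keyword already occurs in the copy and can be turned into an internal link with that exact anchor text.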
After I leverage the internal links (linking from the keywords as they occur on other pages to the preferred page intended to rank for them), the target page is identified as the ideal source for search engines to serve to the masses for all things related to that keyword or keyword cluster.
It’s not rocket science, but it is very effective for producing rankings. Wikipedia is best known for this SEO technique: they use laser-like titles, flat site architecture and URL naming conventions (all based on a specific keyword), then consolidate all other occurrences of that keyword from other pages within the site into links to that page.
Add to the mix that other people constantly link to those pages (thanks to the authority pushing them to the top of the SERPs), and you have a beautiful, dynamic cycle: domain authority elevates the pages, which in turn attract deep links and cement their position in search engines.
This tactic, if implemented properly, results in a website replete with connectivity that passes along spiders, PageRank and visitors, and allows each page to “break out” of the mold and become a dynamic asset to your collective ranking strategy.
Even if you apply this moderately and audit a site specifically to cap frivolous outbound links, the link equity concentrates on the links that remain on the page. How thorough you are determines how much link flow gets passed along through each page’s outbound internal links.
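The arithmetic behind capping outbound links is simple. If we assume (simplistically) that a page splits its equity evenly across its outbound links, trimming the frivolous ones leaves more for each link that remains:

```python
page_equity = 1.0    # hypothetical ranking equity the page has to spend

# Before the audit: 20 outbound links dilute the equity.
outbound_links = 20
print(page_equity / outbound_links)  # 0.05 per link

# After capping frivolous links, 8 remain, each carrying more weight.
outbound_links = 8
print(page_equity / outbound_links)  # 0.125 per link
```

Real ranking models are far more nuanced than an even split, but the direction holds: fewer, more deliberate outbound links mean each surviving link passes along more.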
There is no point in creating content if it is orphaned (like an island with no bridges). Use links from other websites to create the appropriate referential integrity, but use internal links to (a) reinforce that integrity and (b) determine which pages are best suited to convert.
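Orphan detection is straightforward to sketch: crawl the internal link graph from the homepage the way a spider would, and any page left unvisited is an island. The graph below is hypothetical:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget-a"],
    "/blog": [],
    "/products/widget-a": [],
    "/old-landing-page": [],  # nothing links here
}

# Breadth-first crawl starting at the homepage, like a search engine spider.
seen, queue = {"/"}, deque(["/"])
while queue:
    page = queue.popleft()
    for nxt in links.get(page, []):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

# Any known page the crawl never reached is orphaned.
orphans = sorted(set(links) - seen)
print(orphans)  # ['/old-landing-page']
```

Run something like this against a real sitemap and crawl data, and every URL it flags is a page that needs a bridge built to it before it can pull its weight.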