Search engines view each page of your website as a unique micro-site. As a result, each page can rank of its own accord if the appropriate signals are created.
Building strong relationships from page to page within your site is what produces a dominant domain in search engines. However, before domain authority develops, each page must be optimized for a specific set of keywords and phrases if you want to pass that trait on to other relevant pages.
Once domain authority is established, it is possible to see multiple pages from one domain (or a domain and a sub-domain) ranking for multiple keywords, often with a double listing. The premise of SEO is to balance deep links with relevant content so that search engines rank specific pages higher as a by-product of that continuity.
Navigation is the primary means that leads to the discovery and indexing of your pages in search engines. Just as you must learn to crawl before you walk and walk before you run, taking your website through the necessary steps is crucial to developing true authority. This often involves building relevance for each page over time and then electing a champion page for search engines.
Deep Link to Avoid Starving Your Pages
Aside from preliminary links from navigation or sitemaps, the way you link from page to page within your site (known as internal linking) creates pools of relevance for the keywords you use to link those pages. This is where deep links come into play when optimizing each page for search engines.
Deep links can be either internal or external links distributed throughout your site to pages other than your home page. This is important because, unless you are employing flat site architecture (using just the root folder for pages), each subfolder you add receives significantly less link juice from the root folder.
To compensate for this, you can map out the internal links in a website to fortify pages with less link weight and distribution. Ideally, your main keywords should be targeted with your primary pages (the homepage, or specific product or service pages).
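Mapping internal links like this can start as a simple script. The sketch below uses a hypothetical, hard-coded link map (a real audit would crawl the site) and counts inbound internal links per page to flag "starved" pages that need more deep links:

```python
from collections import Counter

# Hypothetical internal-link map: each page lists the pages it links to.
internal_links = {
    "/": ["/services/", "/services/seo/", "/pricing/"],
    "/services/": ["/services/seo/", "/services/ppc/"],
    "/services/seo/": ["/services/", "/blog/deep-links/"],
    "/pricing/": ["/"],
    "/blog/deep-links/": ["/services/seo/"],
    "/services/ppc/": [],
}

# Count inbound internal links for every page in the site.
inbound = Counter()
for source, targets in internal_links.items():
    for target in targets:
        inbound[target] += 1

# Pages with few inbound links are "starved" and are candidates
# for deep links from topically related pages.
starved = [page for page in internal_links if inbound[page] < 2]
print(sorted(starved))
```

Here the deep page `/services/seo/` is well fed, while `/pricing/` and `/services/ppc/` surface as pages to fortify.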
In the past, footer links were sufficient to add enough context and weight to each page to aid search engine optimization. However, as search engine algorithms evolve to discount unnecessary relevance, link location, page focus, and anchor text choices now carry varying degrees of impact.
Create Topical Pools of Relevance
For example, a page about landing pages linking to another page about landing pages within your site carries more relevance than a link from, say, a page about pricing. The idea is to semantically weed out anything off topic and reinforce the pages and folders within your site that have topical relevance.
Once a sufficient threshold has been reached (for each page, from internal and external links), that page or folder can rank competitively in search engines. In addition, any further pages linked from those pages or within those folders also receive a significant boost of authority through link osmosis (ranking by affinity).
All it takes to appear competitively for a specific keyword or phrase is (a) a proportionately higher percentage of links to one page in a site from other pages or (b) enough internal and external links concentrating on a specific landing page. Here is an SEO tool we developed that identifies indexed pages in Google with continuity and keywords.
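As a rough illustration of point (a), a page's share of the site's internal links can be computed directly. The counts below are hypothetical; a real measurement would come from a crawl like the one described above:

```python
# Hypothetical counts of internal links pointing at each page; the share
# of links a page receives is a rough proxy for its internal link weight.
inbound_links = {"/landing/": 25, "/pricing/": 5, "/about/": 3, "/blog/": 7}

total = sum(inbound_links.values())
share = {page: count / total for page, count in inbound_links.items()}

# The elected landing page holds well over half the internal link weight.
print(round(share["/landing/"], 3))  # 0.625
```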
Don’t Cannibalize Keywords, Pick the Most Relevant Landing Pages
Although it is possible to cannibalize keywords (diffusing the focus of a keyword by having too many pages competing for the same topic in search engines), the idea is to elect a champion page or two, then support them with the subsequent pages.
For example, if I have 100 pages within my site, that provides 99 opportunities to choose a variant on a keyword for one specific page. I would not suggest using all of those pages to rank one keyword; typically 15-25 pages is enough. In this scenario, electing 25 pages for one keyword or a tight series of keywords and modifiers would be suitable. In addition to those 25 internal links, I would also consider building 25-50 links from other relevant websites to give that page the link weight needed to climb the search results.
So, with 100 pages to work with, I could rank roughly 4 competitive terms (25 internal links per keyword, to avoid keyword or content cannibalization) and roughly 10 secondary, less competitive keywords by editing those pages and adding links with the right keywords in the appropriate locations.
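The arithmetic above can be written as a quick capacity check. The 100-page site and 25-links-per-keyword figures are just the assumptions from this example, not fixed rules:

```python
# Capacity planning under the assumptions above: a hypothetical
# 100-page site, ~25 supporting internal links per competitive keyword.
total_pages = 100
links_per_competitive_keyword = 25
external_links_per_keyword = (25, 50)  # suggested range from other sites

# Each champion page consumes ~25 supporting pages' worth of links.
competitive_terms = total_pages // links_per_competitive_keyword
print(competitive_terms)  # 4 champion pages, one per competitive term
```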
Going back to the earlier point about the proximity and position of internal links: the higher up on the page the links occur, the more credibility and link weight they pass to the target pages.
What occurs next is the precursor to developing true website authority. In addition to the linked pages all sharing collective buoyancy for the keywords involved, the added strength of using the collective composite of your web pages to reinforce topical relevance can create massive surges in rankings for secondary and tertiary keywords.
Search Engines Can Read, So Don’t Overdo It
Anytime search engines detect a high co-occurrence of keywords or semantic phrases across multiple pages in your website, they naturally assume your site is targeting those phrases. All it takes to rank for a keyword is for it to appear in the title, the meta description, or the body text of the page enough times that the relationship signals to search engines that your page has relevance.
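A crude way to audit that signal yourself is to count whole-phrase occurrences of a keyword across a page's title, meta description, and body text. The page dictionary here is hypothetical; a real check would parse these fields out of the HTML:

```python
import re

# Hypothetical page fields; a real crawl would extract these from HTML.
page = {
    "title": "Landing Page Optimization Tips",
    "meta_description": "How to optimize a landing page for conversions.",
    "body": "A landing page works best when the landing page targets one offer.",
}

def keyword_occurrences(page, keyword):
    """Count case-insensitive, whole-phrase occurrences of a keyword
    in each field of the page."""
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return {field: len(pattern.findall(text)) for field, text in page.items()}

counts = keyword_occurrences(page, "landing page")
print(counts)
```

If the counts climb too high relative to the length of the text, you are drifting from relevance into overuse, which is exactly what evolving algorithms discount.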
The more concentrated the internal links, and the external links from other sites, are to that page (via deep links), the easier it is for that page to gain traction. This is also where the competition and their threshold of relevance and domain authority come into play.
Domain Age and Authority Equate to Rankings
Chronology (who started first) is weighted heavily in search engines, so a website that began optimizing for a series of phrases years in advance can stave off competitors for those phrases more easily than a newer website with less authority can acquire a high ranking for the same phrases.
The constant battle in search engines is not whether you can achieve a top ranking for a competitive keyword; it is how to acquire more authority for the market or niche than the competitor above you.
Even a website that already has authority (for other keywords and key phrases) will still have to develop the right signals for search engines to acquire enough relevance and trust to shift the balance in favor of the new keyword targets. This can be done through deep linking and segueing internal links to produce additional trust and subsequent rankings.