I recently wrote a post about what happens as a result of the SEO process when a website crosses the tipping point (where internal momentum is unleashed) and begins to stem and rank for multiple related keywords.
I would like to explore that process a bit more and elaborate on which strategy is more suitable for rankings: a niche site or an authority site. The right choice depends on the competition, the amount of time you have at your disposal to accomplish the goal, and the performance indicators you are measured against.
The Cycles of Momentum
When pages are continually referenced from (a) their own internal links or (b) known hubs or authorities, they become hubs or authorities themselves. The continuity between pages rich with keyword prominence and topical relevance is distilled from each page as it is sorted (like an index in a book) to deliver the most likely landing pages to visitors searching for solutions through search engines.
A strong polarizing effect occurs as sites grow in magnitude. Having more pages equates to a higher probability of ranking, multiplied by the number of possible long-tail keyword combinations each page can match.
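To make that multiplicative effect concrete, here is a minimal sketch (the term pool and phrase length are hypothetical, chosen only for illustration) of how many long-tail phrase permutations even a small set of page terms can produce:

```python
# Hypothetical illustration: each page contributes its own pool of terms,
# and the number of long-tail phrases a site can plausibly match grows
# multiplicatively with pages and terms.
from itertools import permutations

def long_tail_phrases(terms, phrase_length=3):
    """Enumerate ordered keyword combinations of a given length."""
    return [" ".join(p) for p in permutations(terms, phrase_length)]

page_terms = ["organic", "seo", "ranking", "strategy", "authority"]
phrases = long_tail_phrases(page_terms)
print(len(phrases))  # 5 * 4 * 3 = 60 possible three-word phrases from one page
```

Five terms on one page already yield 60 three-word orderings; multiply that across hundreds of pages and the long-tail surface area of a large site becomes enormous.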
The Metric of Popularity
Like a democracy, if you create enough references to a website from other sites, something phenomenal occurs: it ranks higher based on the merit of momentum. Through channeling link flow (the aggregate link potency of a site, past and present), vital cross-sections overlap, and through a process similar to cross-pollination between the websites' data / DNA, a hybrid is born.
Educational / Commerce Hybrids
A hybrid site could either be a powerful mini-site (20-50 pages) targeting a specific range of keywords, or it could take the authority approach and snatch up all of the low-hanging fruit across anything from 100 to 10,000,000 competing pages (the only real difference is approach).
Regardless of the strategy, it is common to reinforce the theme (related content) with frequent updates to stave off competition. Websites are constantly in a state of flux; the ebb and flow can wash new keywords up on the shore of your pages at any time.
That same ebb and flow can also wane and allow even your most secure keywords to diminish. Consider news sites and aggregators: their time-sensitive nature forces them to rise quickly in the SERPs to reach a larger audience.
Websites that update frequently take on the algorithmic profile of a news / syndication source and can balance multiple layers of keywords to target consumers. The message here is: don't get lazy at the top, or you won't be there long, because sites are closing the gap with frequent updates and expansive content.
Broadening the Funnel
Rankings are a metric; each keyword has a set value of traffic it can attract, either organically or through sponsored means (like SEM and PPC). Regardless of whether the traffic is broad match and “general” or exact match and “extremely relevant”, keywords matter, and since keywords are a product of content, your on-page strategy is the fulcrum.
Just like a cycle, the more pages that gain authority, the more they transfer that authority back to the domain to conquer even more competitive keywords. Wikipedia uses this method of self-referral and cross-linking to devour entire sections of competitive keywords. Who is to say you cannot use the same method on a smaller scale to capture significant money phrases?
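The mechanics behind this can be sketched with a simplified PageRank-style computation. This is an assumption about how link flow behaves in general, not the actual algorithm any search engine runs, and the site structure below is invented for illustration; it shows how internal cross-links concentrate authority on the page a site references most:

```python
# Minimal PageRank-style power iteration (a sketch, not Google's algorithm):
# pages that receive the most internal links accumulate the most rank.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical site: every supporting page links to the "money" page.
site = {
    "home": ["money", "post-a", "post-b"],
    "post-a": ["money"],
    "post-b": ["money"],
    "money": ["home"],
}
ranks = pagerank(site)
print(max(ranks, key=ranks.get))  # → money
```

The heavily cross-linked page wins, which is the small-scale version of what the Wikipedia pattern exploits: supporting pages deliberately funnel their authority toward the target page.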
Planning the Harvest
The analogy of planting seeds (keywords) for a future harvest (rankings) in organic SEO draws interesting parallels. The distinction in this instance: if your site bags a keyword that has real search volume, it is not uncommon for 25-50% of your website's traffic to result from that keyword alone.
It's better to sell traffic than to buy it, and if your site starts to devour market share, you have a bargaining chip for those who seek to purchase impressions through display advertising or no-follow links from relevant sources.
Depending on whether your website is an educational resource, a business, or a hybrid, your on-page and off-page SEO tactics must accommodate the process to keep new phrases stemming from the semantic root. Deep links from external sources (links pointing to pages other than the homepage), for example, cement those pages as landing-page destinations when reinforced by internal links from related pages.
Website Size Matters
So, do you build a site with thousands of pages to compete, or do you keep it manageable and create a high link density to offset the lack of range? The answer depends on what you are up against.
The quagmire is this: if you develop a content-rich site, you had better be prepared to feed it until it can support itself (6-8 months). On the other hand, loading a page up with a high saturation of links (using a wide array of anchor text) can also produce a high ranking position for a narrow stint of keywords.
Once again, which strategy is best? In reality, it all depends. If you have some form of dynamism you can employ, such as a shopping cart with various items on a pliable CMS that lets you incubate more powerhouse pages in 90 days, then you can unite those pages later for a common goal. Spreading the range like a flashlight, then refocusing your link flow like a laser beam, is possible with this strategy.
Or you could dig into a topic, approach it from overlapping angles (like layers of an onion over time), and etch your path into the very topic itself (this is our approach). Conversion can be fine-tuned, and the more pages and content you have to work with, the greater your options.
However, you also have to know when to cap the site's focus and when to recycle its authority to stay concentrated on a specific series of niche phrases that drive traffic. There is no easy answer, just equally compelling arguments for building an authority site with thousands of pages or a site with a specific ranking objective in mind.