in SEO by Jeffrey_Smith

With the sheer number of beta tests run on search algorithms within the Googleplex, search results can often seem like a “mixed fruit” or “tossed salad” to business owners trying to get a grip on where they rank and for which keywords.


Sifting Through Organic Search Engine Results

The purpose of this post is to reveal search engine insights through the eyes of an SEO and showcase a few examples of ranking signals and their respective metrics in the search engine result pages.

While most companies may assume that consumers are the only ones searching Google to determine search engine result page positioning, more often than not it’s the competitors or placeholders obsessively checking search results to see if they gained or slipped a spot or two in Google, Yahoo or Bing.

The Google Shuffle

While we used to call this “the Google dance”, ranking algorithms have taken an abrupt shift in what is considered relevant and authoritative (or not). For those who follow trends in search, the social graph is carrying more weight in the degree of ranking power a domain has.

For example, it used to be all about links, PageRank and domain age. Then exact match domains (like ranking for red widgets) were all the rage and difficult to eclipse. Then, for months on end, those who mastered on page optimization dominated while links were ignored, or vice versa, those with poor on page optimization triumphed just by using a bazillion links. In essence, Google’s organic ranking algorithms are a moving target that requires constant tweaking to stay planted without vacillation.

From observing the latest sweep while surveying the search landscape, our observations point to off page ranking factors and social signals carrying more weight than on page optimization; however, this can and most surely will change (as ranking algorithms are constantly being tested, altered and refined).

Balancing On Page and Off Page Optimization Ratios

If you were a computer program trying to make sense of human intent and all the nuances it represents, you would need input signals you could quantify as relevant indicators of significance. While Google does its best, to grossly oversimplify the process, relevance signals can be distilled down to the ratio of on page to off page SEO.

A site’s authoritative value is a by-product of trust: being grandfathered into search results when the index had only a limited seed set of data (remember, search engines consume and imitate the web), plus whatever off page semblance of peer review exists for a page or website, ultimately shifts the persona and influence of that page (or website).

In other words, data can be quantified, synthesized or graded. The ratios of the metrics used in those calculations (the algorithms) then determine the sorting order of the search engine result pages.
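As a loose sketch of that idea, blending an on page score and an off page score into one relevance signal might look like the following. The 0.6 weight is invented purely for illustration, echoing the post’s observation that off page signals currently carry more weight; Google’s real blend is unknown.

```python
def blended_signal(on_page_score, off_page_score, off_page_weight=0.6):
    """Toy relevance blend. Both scores are assumed to fall in [0, 1].

    off_page_weight=0.6 is a made-up value; it is NOT Google's ratio.
    """
    if not (0.0 <= on_page_score <= 1.0 and 0.0 <= off_page_score <= 1.0):
        raise ValueError("scores must be between 0 and 1")
    # Weighted average: off page signals count for more of the final score.
    return off_page_weight * off_page_score + (1 - off_page_weight) * on_page_score

# A page that is strong off page but weak on page still scores fairly well:
print(blended_signal(on_page_score=0.2, off_page_score=0.9))
```

Shifting the weight toward either side models the algorithm updates the post describes, where on page or off page factors dominate in different eras.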

A Hypothetical Formula / Relevance Signal

For example, if a page is less than 30 days old, has fewer than 5 internal links and fewer than 3 backlinks from other websites (to validate its worth), then the relevance signal is either amplified or dampened by this algorithmic process of elimination or characterization. That signal applies to (a) the page itself, (b) the content / shingles (groups of words) and on page considerations such as internal links, the h1 tag, other tags and the subject, and (c) the inbound links: the strength of those pages, their relevance to the subject keyword or key phrase, and their ability to influence the target page (as a link quality signal).

Under this assumption, the page would correlate to a weak relevance signal, say a score of 0.001, a.k.a. a newbie without a snowball’s chance of hitting page one for any of the keywords on the page or linking to it.

This type of calculation would instantly be aggregated as a fraction and placed into another array (which plays off other metrics that are calculated, parsed and represented for other types of data extraction). Nonetheless, it does create a correlation of gravity, page strength or worth for the users parsing that index with queries.

Any change in any of the metrics impacts the relevance score. For example, if the page ages another 60 days, gains 25 internal links and earns 50 links from other websites, the result is a much different score or ranking.
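To make the hypothetical concrete, here is a toy version of such a score. The caps (365 days, 30 internal links, 100 backlinks) and the equal averaging are all invented for illustration; this is not Google’s actual formula.

```python
def relevance_score(age_days, internal_links, backlinks):
    """Toy relevance score: normalize each signal against an arbitrary
    cap, then average the three. All thresholds are made up."""
    age_signal = min(age_days / 365, 1.0)          # older pages earn more trust
    internal_signal = min(internal_links / 30, 1.0)  # internal citation
    backlink_signal = min(backlinks / 100, 1.0)      # off page validation
    return round((age_signal + internal_signal + backlink_signal) / 3, 3)

# The post's weak page: 30 days old, 5 internal links, 3 backlinks.
weak = relevance_score(30, 5, 3)
# The same page 60 days later with 25 internal links and 50 backlinks.
stronger = relevance_score(90, 25, 50)
print(weak, stronger)  # the aged, better-linked page scores far higher
```

The point of the sketch is only that small changes in any one input shift the aggregate score, which is the behavior the paragraph above describes.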

This is only one example of how a single signal or metric, when combined with three others, impacts the overall relevance score of a page (which in turn impacts its ability to be returned as a relevant result for anything).

While these are constantly in flux, they do in fact leave a trail in their wake (through perturbation) which can be tracked, accounted for and anticipated for existing search engine optimization campaigns.

Through observation, testing and assessing the scope of these layers, organic ranking strategies are adjusted to provide maximum lift, or to create the necessary on page or off page signals required to cross the tipping point of relevance and gain “lift” or buoyancy in search results.

While the process sounds complex, skilled SEOs can intuitively take the various signals from an optimization campaign and determine which ingredient is lacking (link velocity, on page relevance, content development, trust signals missing for lack of citation, site architecture, etc.), then apply the appropriate remedy to correct granular inconsistencies.

Let’s run through a few typical SEO adjustments based on signals produced from the wake of search engine result page conditions:

Scenario A: A page does not move from links alone. Say you have built dozens of links to a page and see little to no improvement. This could indicate:

  • Link quality is a concern (a bad neighborhood, or the linking pages lack the relevance signals to pass along any substantial long-term value).
  • The target page lacks enough internal link citation (build more supporting content based on related keywords in titles, links and tags, then link to the landing page).
  • The market is competitive (you are trying to supplant a competitor that is part of the authority set, meaning they have to do less to defend their position than you have to do to acquire or surpass it).
  • Your website lacks the necessary link velocity (you are building links too fast and incurring a slight penalization, too slow compared to competitors, or simply not building enough quality links to break through their defense).
  • The on page optimization of your website could be subpar (multiple pages all fighting for the same ranking with no clear hierarchy in place, via canonicalization, internal links and content relevance, to determine which page should rank for the keyword in question).
  • There is too much noise in the template, creating duplicate pages across the site from internal pages lacking enough content or distinction (so you need to apply noindex, follow to less important pages).
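As a rough illustration, the checklist above could be sketched as a toy diagnostic function. Every threshold here is invented for the sake of the example; real diagnosis requires inspecting the actual pages and search results.

```python
def diagnose_stalled_page(links_built, rank_change, internal_links,
                          pages_targeting_keyword):
    """Toy Scenario A checklist: map observed metrics to likely causes.

    All cutoffs (12 links, 10 internal links) are illustrative guesses,
    not values any search engine publishes.
    """
    issues = []
    if links_built >= 12 and rank_change <= 0:
        # Plenty of links built, no movement: suspect quality or velocity.
        issues.append("check link quality / link velocity")
    if internal_links < 10:
        # Thin internal citation: build supporting content that links in.
        issues.append("add internal link citation from supporting pages")
    if pages_targeting_keyword > 1:
        # Several pages compete for one keyword: no clear hierarchy.
        issues.append("resolve page hierarchy (canonicalization, internal links)")
    return issues

for issue in diagnose_stalled_page(links_built=20, rank_change=0,
                                   internal_links=4, pages_targeting_keyword=3):
    print("-", issue)
```

A page with healthy metrics simply returns an empty list, i.e. links alone are probably not the bottleneck.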

First, add a set number of links to the page to determine where the point of buoyancy occurs (7 links, 12 links, 15 links) with a specific anchor text to see if you can “break it loose”. Then assess the on page internal support system and ensure there is enough “dedicated content” to prop up the competitive keyword variation.

For a second tier push, build links to the other internal pages as well (3-5 links) to give the weakest pages in the daisy chain a boost (as well as allow them to be “validated” as a worthy page). Then, as those pages age, they gain PageRank, Link-Equity and Trust and can silently rank dozens of other keywords and landing pages.

Look at the competition and see if (a) their landing page is loaded with inbound links, (b) it has zero links and domain trust is pushing the relevance factor, or (c) the page is full of duplicate content (same sidebar, same footer, same noise). If your own page suffers from the latter, reduce the clutter, add more content (below the fold if you must) and get 10 more internal links from relevant pages within the site.

This type of analysis and correction should “break the page loose” and make your optimization efforts far more effective and efficient as a result.

While you may never understand the intricacies of Google’s internal weighting mechanisms, or which metrics are dominant, suppressed or favorable, you can find the basic levers to push, tone down or pull that will reel in a stronger relevance signal for your web pages.

Trial and error, heuristic testing and feedback are all aspects of organic search. To be successful, SEO requires constant vigilance as well as observation based on best practices, past performance and real time feedback from competitors and search algorithms.


About Jeffrey_Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.
