By Jeffrey_Smith

With the sheer number of beta tests Google runs on its search algorithms within the Googleplex, search results can often seem like "mixed fruit" or a "tossed salad" to business owners trying to get a grip on where they rank and for which keywords.


Sifting Through Organic Search Engine Results

The purpose of this post is to reveal search engine insights through the eyes of an SEO and showcase a few examples of ranking signals and their respective metrics in the search engine result pages.

While most companies may assume that consumers are the only ones searching Google to determine search engine result page positioning, more often than not it's competitors or placeholders obsessively checking search results to see if they gained or slipped a spot or two in Google, Yahoo or Bing.

The Google Shuffle

While we used to call this "the Google dance," ranking algorithms have taken an abrupt shift in what is considered relevant and authoritative (or not). For those who follow trends in search, the social graph is carrying more weight in the degree of ranking power a domain has.

For example, it used to be all about links, PageRank and domain age. Then exact match domains (like one ranking for "red widgets") were all the rage and difficult to eclipse. Then, for months on end, those who mastered on page optimization dominated while links were ignored, or vice versa, those with poor on page optimization triumphed simply by amassing a bazillion links. In essence, Google's organic ranking algorithms are a moving target that requires constant tweaking to stay planted without vacillation.

From observing the latest sweep while surveying the search landscape, our observations point to off page ranking factors and social signals carrying more weight than on page optimization. However, this can and most surely will change (as ranking algorithms are constantly being tested, altered and refined).

Balancing On Page / Off Page Optimization Ratios

If you were a computer program trying to make sense of human intent and all the nuances it represents, you would need input signals you could quantify as relevant indicators of significance. While Google does its best, to grossly oversimplify the process, relevance signals can be distilled into the ratio of on page to off page SEO.

Consider that a site's authoritative value is a by-product of trust: pages get grandfathered into search results as the web imitates the index when there is only a limited seed set of data (remember, search engines consume and imitate the web), and whatever off page semblance of peer review exists for a page or website ultimately shifts the persona and influence of that page (or website).

In other words, data can be quantified, synthesized and graded. The ratios of the metrics used in those calculations (the algorithms) then determine the sorting order of the search engine result pages.
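To make the idea concrete, here is a minimal sketch of how a sorting order could fall out of such ratios. The weights and page data below are entirely invented for illustration; the post only suggests that off page signals currently carry more weight than on page signals.

```python
# Toy illustration: pages scored by a weighted on page / off page blend,
# then sorted into a result order. All numbers are hypothetical.

ON_PAGE_WEIGHT, OFF_PAGE_WEIGHT = 0.4, 0.6  # assumed: off page counts more

pages = [
    {"url": "/a", "on_page": 0.9, "off_page": 0.2},
    {"url": "/b", "on_page": 0.5, "off_page": 0.8},
    {"url": "/c", "on_page": 0.7, "off_page": 0.6},
]

def blended_score(page):
    # Combine the two signal families into a single sortable score.
    return ON_PAGE_WEIGHT * page["on_page"] + OFF_PAGE_WEIGHT * page["off_page"]

serp = sorted(pages, key=blended_score, reverse=True)
print([p["url"] for p in serp])
```

Note how `/a`, despite the strongest on page score, sorts last under these weights: changing the ratio changes the order, which is the article's point.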

A Hypothetical Formula / Relevance Signal

For example, if a page is less than 30 days old, has fewer than 5 internal links and fewer than 3 backlinks from other websites (to validate its worth), then the relevance signal for (a) the page itself, (b) the content / shingles (groups of words) and on page considerations such as internal links, the h1 tag, other tags and the subject, and (c) the inbound links, the strength of those linking pages, their relevance to the subject / keyword and their ability to influence the target page (as a link quality signal) is either amplified or dampened by this algorithmic process of elimination or characterization.

Under this assumption, the page would correlate to a weak relevance signal, a score of, say, 0.001, a.k.a. a newbie without a snowball's chance of hitting page one for any of the keywords on the page or linking to it.

This type of calculation would instantly be aggregated as a fraction and placed into another array (which plays off other metrics that are calculated, parsed and represented for other types of data extraction). Nonetheless, it does create a correlation of gravity, page strength or worth for the users parsing that index with queries.

Any change in any of the metrics impacts the relevance score; if the page ages another 60 days, gains 25 internal links and acquires 50 links from other websites, the result is a much different score and ranking.
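The hypothetical formula above can be sketched as a toy model. To be clear, every threshold and weight below is an assumption made for illustration; this is not Google's algorithm, only a demonstration of how crossing thresholds flips a dampened signal into an amplified one.

```python
# Toy model of the post's hypothetical relevance signal: a page younger
# than 30 days with < 5 internal links and < 3 backlinks is dampened to
# the "newbie" score; otherwise the three metrics blend into a score.
# Thresholds and weights are invented for illustration only.

def relevance_score(age_days: int, internal_links: int, backlinks: int) -> float:
    """Return a toy relevance score in (0, 1]."""
    # The dampened "newbie" case from the article's example.
    if age_days < 30 and internal_links < 5 and backlinks < 3:
        return 0.001
    # Each factor saturates at 1.0 once the metric matures.
    age_factor = min(age_days / 90, 1.0)
    internal_factor = min(internal_links / 25, 1.0)
    backlink_factor = min(backlinks / 50, 1.0)
    # Arbitrary weights; off page (backlinks) assumed heaviest.
    return 0.2 * age_factor + 0.3 * internal_factor + 0.5 * backlink_factor

print(relevance_score(10, 2, 1))    # brand-new, unreferenced page
print(relevance_score(90, 25, 50))  # the same page after maturing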

This is only one example of how a single signal or metric, when combined with three others, impacts the overall relevance score of a page (which in turn impacts its ability to be returned as a relevant result for anything).

While these signals are constantly in flux, they do in fact leave a trail in their wake (through perturbation) which can be tracked, accounted for and anticipated in existing search engine optimization campaigns.

As a result of observation, testing and assessing the scope of these layers, organic ranking strategies are adjusted to provide maximum lift, creating the on page or off page signals required to cross the tipping point of relevance and gain "lift" or buoyancy in search results.

While the process sounds complex, it becomes intuitive for more skilled SEOs to take the various signals from an optimization campaign, determine which ingredient is lacking (link velocity, on page relevance, content development, trust signals lost to lack of citation, site architecture, etc.) and apply the appropriate antithesis to correct granular inconsistencies.

Let’s run through a few typical SEO adjustments based on signals produced from the wake of search engine result page conditions:

Scenario A: a page does not move from links alone. You have built dozens of links to a page and see little to no improvement. This could indicate:

  • Link quality is a concern (a bad neighborhood, or the linking pages lack the relevance signals to pass along any substantial long-term value).
  • The target page lacks sufficient internal link citation (build more supporting content based on related keywords in titles, links and tags, then link to the landing page).
  • The market is competitive (you are trying to supplant a competitor that is part of the authority set, meaning they have to do less to defend their position than you have to do to surpass it).
  • Your website lacks the necessary link velocity (you are building links too fast and incurring a slight penalization, too slow compared to competitors, or simply not building enough quality links to break through their defense).
  • The on page optimization of your website could be subpar, with multiple pages all fighting for the same ranking and no clear hierarchy (canonicalization, internal links, content relevance) in place to determine which page should rank for the keyword in question.
  • There is too much noise in the template, creating duplicate pages from internal pages that lack sufficient content or distinction (so you need to apply noindex, follow to less important pages).
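The checklist above lends itself to a simple triage sketch. The metric names and thresholds below are hypothetical stand-ins for data you would pull from your own crawl and backlink tools, not any real tool's API.

```python
# Rough triage sketch for Scenario A: map hypothetical page metrics to
# the possible causes listed above. All field names and cutoffs are
# invented for illustration.

def diagnose_stuck_page(metrics: dict) -> list:
    """Return plausible reasons a page is not moving despite new links."""
    issues = []
    if metrics.get("avg_link_quality", 0.0) < 0.3:
        issues.append("link quality: inbound pages pass little value")
    if metrics.get("internal_links", 0) < 10:
        issues.append("internal citation: add supporting content and links")
    if metrics.get("competitor_authority", 0) > metrics.get("authority", 0):
        issues.append("competitive market: authority gap vs. ranking pages")
    if metrics.get("duplicate_templates", 0) > 0:
        issues.append("template noise: noindex,follow thin internal pages")
    return issues

report = diagnose_stuck_page({
    "avg_link_quality": 0.2,
    "internal_links": 4,
    "authority": 10,
    "competitor_authority": 40,
    "duplicate_templates": 3,
})
print(report)
```

A page tripping several of these checks at once is the usual "dozens of links, no movement" case: no single lever is broken, so fixing only one rarely breaks it loose.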

First, I would add a set number of links to the page to determine where the point of buoyancy occurs (7 links, 12 links, 15 links), with specific anchor text, to see if you can "break it loose." Then I would assess the on page internal support system and ensure there is enough dedicated content to prop up the competitive keyword variation.

For a second tier push, build links to the other internal pages as well (3-5 links each) to give the weakest pages in the daisy chain a boost (and allow them to be validated as worthy pages). Then, as those pages age, they gain PageRank, link equity and trust, and can quietly rank for dozens of other keywords and landing pages.

Look at the competition and see whether (a) their landing page is loaded with inbound links, (b) it has zero links and domain trust is pushing the relevance factor, or (c) the page is full of duplicate content (same sidebar, same footer, same noise). Then reduce the clutter on your own page, add more content (below the fold if you must) and get 10 more internal links from relevant pages within the site.

This type of analysis and correction should "break the page loose" and make your optimization efforts far more effective and efficient as a result.

While you may never understand the intricacies of Google's internal weighting mechanisms, or which metrics are dominant, suppressed or favored, you can find the basic levers to push, tone down or pull that will reel in a stronger relevance signal for your web pages.

Trial and error, heuristic testing and feedback are all aspects of organic search. To be successful, SEO requires constant vigilance, along with observations based on best practices, past performance and real time feedback from competitors and search algorithms.


About Jeffrey_Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (An SEO Provider who now develops SEO Software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.
