in SEO by Jeffrey Smith

With the massive number of beta tests executed on search algorithms within the Googleplex, search results can often seem like a “mixed fruit” or “tossed salad” for business owners trying to get a grip on where they rank and for which keywords.


Sifting Through Organic Search Engine Results

The purpose of this post is to reveal search engine insights through the eyes of an SEO and to showcase a few examples of ranking signals and their respective metrics in the search engine result pages.

While most companies may assume that consumers are the only ones searching on Google to determine search engine result page positioning, more often than not it’s the competitors or current position holders obsessively checking search results to see if they gained or slipped a spot or two in Google, Yahoo or Bing.

The Google Shuffle

While we used to call this “the Google dance,” ranking algorithms have taken an abrupt shift in what is considered relevant or authoritative. For those who follow trends in search, the social graph is carrying more weight in determining the degree of ranking power a domain has.

For example, it used to be all about links, PageRank and domain age. Then exact-match domains (like ranking for “red widgets”) were the rage and difficult to eclipse. Then, for months on end, those who mastered on-page optimization dominated while links were ignored, or vice versa: those with poor on-page optimization triumphed just by using a bazillion links. In essence, Google’s organic ranking algorithms are a moving target that requires constant tweaking to stay planted without vacillation.

From observing the latest sweep while surveying the search landscape, our observations point to off-page ranking factors and social signals carrying more weight than on-page optimization. However, this can and most surely will change, as ranking algorithms are constantly being tested, altered and refined.

Balancing On Page Off Page Optimization Ratios

If you were a computer program trying to make sense of human intent and all the nuances it represents, you would need input signals that you could quantify as relevant indicators of significance. While Google does its best, to grossly oversimplify the process, relevance signals can be distilled into the ratio of on-page to off-page SEO.

Consider that a site’s authoritative value is a by-product of trust: pages get grandfathered into search results as the index matures from a limited seed set of data (remember, search engines consume and imitate the web), and whatever off-page semblance of peer review exists for a page or website ultimately shifts the persona and influence of that page (or website).

In other words, data can be quantized, synthesized or graded, and the ratios of the metrics used in those calculations (the algorithms) determine the sorting order of the search engine result pages.
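To make the ratio idea concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration: the scores, the names and the idea of a single blended ratio are invented, since Google’s actual signals and weights are not public.

```python
# Purely illustrative: express what fraction of a page's total relevance
# signal comes from on-page factors versus off-page factors.
# All scores and the ratio itself are hypothetical.

def on_page_ratio(on_page_score: float, off_page_score: float) -> float:
    """Return the fraction of the combined signal attributable to on-page SEO."""
    total = on_page_score + off_page_score
    if total == 0:
        return 0.0  # no signal at all: avoid dividing by zero
    return on_page_score / total

# Example: a page with strong links but thin content is dominated
# by off-page signals (hypothetical numbers).
print(on_page_ratio(on_page_score=20.0, off_page_score=80.0))  # 0.2
```

Under this sketch, a shift in either score moves the ratio, which mirrors the article’s point that the sorting order is impacted by the ratios themselves rather than any single metric.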

A Hypothetical Formula / Relevance Signal

For example, if a page is less than 30 days old, has fewer than 5 internal links and fewer than 3 backlinks from other websites (to validate its worth), then the relevance signal for (a) the page itself, (b) the content / shingles (groups of words) and on-page considerations such as internal links, the h1 tag, other tags and subject, and (c) the inbound links, the strength of those pages, their relevance to the subject / keyword or key phrase, and their ability to influence the target page (as a link-quality signal) is either amplified or dampened by this algorithmic process of elimination or characterization.

Under this assumption, the page would correlate to a weak relevance signal or score of 0.001, a.k.a. a newbie without a snowball’s chance of hitting page one for any of the keywords on the page or linking to it.

This type of calculation would instantly be aggregated as a fraction and placed into another array (which plays off other metrics that are calculated, parsed and represented for other types of data extraction). Nonetheless, it does create a correlation of gravity, page strength or worth for the users parsing that index with queries.

Any change in any of the metrics impacts the relevance score; i.e. if the page ages another 60 days, gets 25 internal links and gains 50 links from other websites, the result would be a much different score or ranking.
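The hypothetical formula above can be sketched as a small function. To be clear: the thresholds (30 days, 5 internal links, 3 backlinks) and the 0.001 “newbie” score come from the text, while the blend of weights for pages that clear those thresholds is entirely made up for illustration.

```python
# Hypothetical relevance signal, following the article's thought experiment.
# The thresholds and the 0.001 floor are from the text; the weighted blend
# below them is an invented stand-in, not any real ranking formula.

def relevance_score(age_days: int, internal_links: int, backlinks: int) -> float:
    if age_days < 30 and internal_links < 5 and backlinks < 3:
        return 0.001  # weak signal: no realistic chance at page one
    # Made-up blend: cap each metric, normalize, and weight it.
    score = (min(age_days, 365) / 365) * 0.3 \
          + (min(internal_links, 50) / 50) * 0.3 \
          + (min(backlinks, 100) / 100) * 0.4
    return round(min(score, 1.0), 3)

print(relevance_score(age_days=10, internal_links=2, backlinks=1))    # 0.001
print(relevance_score(age_days=70, internal_links=25, backlinks=50))  # 0.408
```

As the article notes, any change in any one metric moves the score: ageing the page, adding internal links or earning backlinks each shifts the output, which is what makes the sorting order a moving target.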

This is only one example of how a signal or metric, when combined with three others, impacts the overall relevance score of a page (which impacts its ability to be returned as a relevant result for anything).

While these signals are constantly in flux, they do in fact leave a trail in their wake (through perturbation) which can be tracked, accounted for and anticipated in existing search engine optimization campaigns.

As a result of observing, testing and assessing the scope of these layers, organic ranking strategies are adjusted to provide maximum lift, or to create the necessary on-page or off-page signals required to cross the tipping point of relevance and gain “lift” or buoyancy in search results.

While the process sounds complex, it becomes intuitive for more skilled SEOs to take the various signals from an optimization campaign, determine which ingredient is lacking (link velocity, on-page relevance, content development, trust signals from lack of citation, site architecture, etc.) and correct it with its appropriate antithesis to resolve granular inconsistencies.

Let’s run through a few typical SEO adjustments based on signals produced in the wake of search engine result page conditions:

Scenario A: A page does not move from links alone; you have built dozens of links to a page and there is little to no improvement. This could indicate one of the following:

  • Link quality is a concern (a bad neighborhood, or the linking pages lack relevance signals to pass along any substantial long-term value).
  • The target page could lack enough internal link citation (build more supporting content based on related keywords in titles, links and tags, then link to the landing page).
  • The market is competitive (you are trying to supplant a competitor that is part of the authority set, meaning they have to do less to defend their position than you have to do to acquire or surpass it).
  • Your website lacks the necessary link velocity (you are building links too fast and incurring a slight penalization, too slow compared to competitors, or simply not building enough quality links to stave off their defense).
  • The on-page optimization of your website could be subpar, with multiple pages all fighting for the same ranking and no clear hierarchy (canonicalization, internal links, content relevance) in place to determine which page should rank for the keyword in question.
  • There is too much noise in the template, creating duplicate pages across the site from internal pages lacking enough content or distinction (so you need to apply noindex,follow to less important pages).
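The diagnoses above each pair with a corrective action, the “appropriate antithesis” mentioned earlier. A simple lookup sketch makes the pairing explicit; the keys and remedies just paraphrase the bullet list and are not an official methodology.

```python
# Illustrative mapping of each diagnosis from the list above to its
# corrective action. The wording paraphrases the article's own bullets;
# nothing here is a documented algorithm or tool.

diagnosis_to_fix = {
    "low link quality": "prune bad-neighborhood links; acquire topically relevant ones",
    "weak internal citation": "build supporting content and link it to the landing page",
    "competitive market": "expect a longer campaign; raise overall site authority",
    "wrong link velocity": "match a natural pace; favor quality over raw volume",
    "unclear page hierarchy": "consolidate with canonicalization and internal links",
    "template noise / duplication": "apply noindex,follow to thin pages; add unique content",
}

for symptom, fix in diagnosis_to_fix.items():
    print(f"{symptom}: {fix}")
```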

First, I would add a set number of links to the page to determine where the point of buoyancy occurs (7 links, 12 links, 15 links) with specific anchor text to see if you can “break it loose”. Then I would assess the on-page internal support system and ensure there is enough dedicated content to prop up the competitive keyword variation.
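That incremental link test can be sketched as a simple experiment log: build links in batches, record the observed ranking after each batch, and look for the step with the biggest jump. The batch sizes echo the article; the ranking positions are fabricated for illustration.

```python
# Sketch of the incremental "buoyancy" test described above: build links
# in set batches and track where the page breaks loose. The ranking
# positions here are invented data, not real measurements.

link_batches = [7, 12, 15]          # cumulative links built at each step
observed_positions = [48, 31, 9]    # hypothetical ranking after each step

for links, position in zip(link_batches, observed_positions):
    print(f"{links} links -> ranking position {position}")

# The step with the largest improvement suggests the point of buoyancy.
jumps = [prev - cur for prev, cur in zip(observed_positions, observed_positions[1:])]
buoyancy_batch = link_batches[1:][jumps.index(max(jumps))]
print(f"largest improvement occurred at {buoyancy_batch} links")
```

In practice you would hold anchor text constant across batches, as the article suggests, so that the only variable changing between measurements is link count.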

For a second-tier push, build links to the other internal pages as well (3-5 links) to give the weakest pages in the daisy chain a boost (and to allow them to be “validated” as worthy pages). Then, as those pages age, they gain PageRank, link equity and trust and can silently rank for dozens of other keywords and landing pages.

Look at the competition and see (a) whether their landing page is loaded with inbound links, (b) whether it has zero links and domain trust is pushing the relevance factor, or (c) whether the page is full of duplicate content (same sidebar, same footer, same noise). Then reduce the clutter, add more content (below the fold if you must) and get 10 more internal links from relevant pages within the site.

This type of analysis and correction should “break the page loose” and make your optimization efforts far more effective and efficient as a result.

While you may never understand the intricacies of Google’s internal weighting mechanisms (which metrics are dominant, suppressed or favorable), you can find the basic levers you can push, tone down or pull to reel in a stronger relevance signal for your web pages.

Trial and error, heuristic testing and feedback are all aspects of organic search. To be successful, SEO requires constant vigilance as well as observation grounded in best practices, past performance and real-time feedback from competitors and search algorithms.


About Jeffrey Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.
