Google’s Love Affair With Authority Sites

In short, Google’s love affair with authority sites is based on the premise of reciprocity, relevance and commercial viability. It’s no secret that Google’s search algorithms are stacked in favor of quality content and authoritative websites.

Protecting the integrity of that love affair can have harsh consequences for those operating in gray areas when Google issues updates or corrections to its index.

Consider this: the more selective, even downright finicky, Google becomes about which types of websites are algorithmically endowed or human-rater approved, the higher the echelon of quality presiding in its index, and the more relevant and commercially viable Google’s dominion becomes when it’s time to monetize that traffic.

This post covers two glaring and mutually important topics: first, how search engine algorithms can dramatically impact your business through “corrections” designed to improve quality control and relevance; second, what you can do to insulate yourself from algorithm changes by creating an authoritative online presence.

What Constitutes Search Engine Quality Control?

While quality control may be an elusive and subjective qualification, there is a need to identify and corral how websites rank in the index. Statistics indicate that for every 10 people who conduct a search in a search engine, 2 click paid “sponsored” results.

Since Google is compensated for those sponsored results, this represents a multi-billion-dollar industry. So, if the organic/natural results suffer, Google suffers, as users might opt for another search engine in an attempt to find relevant results.

This is the premise behind algorithm changes: (a) level the playing field and (b) close any long-term trend or loophole that could allow people to game the algorithm indefinitely, insert subpar content or spam, and further erode quality, i.e. the user experience.

The problem for Google stems from its own popularity, i.e. its search engine market share. Anyone involved in online business knows the importance of ranking in the top 3 results in Google for any keyword with a relevant baseline of traffic.

That is especially true if the ranking is monetized properly through advertisements, AdSense, or a strong call to action or value proposition tied to a CPA (cost per acquisition) offer, physical product or service. If a conversion is attached, more traffic means more potential sales, and this is the cheese for most marketers (particularly those who know the distinction between commercial and educational queries, which separates the buyers from the browsers).

As long as the value proposition (what the user gets) equals or exceeds their expectations of commercial intent (what they searched for in the first place), everyone wins. This also makes Google the number one target for spam and subpar content angling to procure organic rankings for landing pages, which leads to the next point.

Action / Reaction: When Push Comes to Shove

When people push the boundaries of content relevance in pursuit of commercial gain, it’s only a matter of time before search engines push back. Earlier this month the web (at least what is indexed by Google) experienced such a pushback, as a sweeping round of ranking shifts arrived with the Google Panda/Farmer update, which targeted websites that churned out lackluster or duplicate content.

These websites were normalized, i.e. suppressed in their ability to rank for lucrative, mid-tail, traffic-bearing keywords. Most would not notice (unless their websites were caught in the mix), but consider “the correction” the latest response to the old yet transparent “if you look good, we look good” paradigm for Google, while businesses that thrived on natural search engine positioning face deficits as a result of these shifts.

Insulating Your Website from Algorithm Shifts

The only way to insulate your website from vacillations in the search engine result pages is to create an authoritative presence; insulation takes on new meaning in this context.

Those who have invested the time, energy and effort to craft stellar content experience the rewards of mid-tail, long-tail and exact-match traffic for keywords that lesser sites envy. One indication of the potency or authority of your domain is the absence of an SEO ceiling: whether a new page can appear in the results within minutes or hours of creation, without backlinks.

As an example, a new website should target keywords with fewer than 50,000 competing pages in phrase match (check how competitive your key phrase is by placing it “in quotes” in a Google search and noting the estimated number of competing pages). Weave a tight ring of relevant content around a topic (3-5 posts), then consolidate title tags and internal links to get those pages ranked with minimal effort.

Repeat this process with enough frequency, integrating additional keywords from a research-based cluster, and your website can strategically devour and rank for the entire cluster of keywords.
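The cluster-building step above can be sketched as a simple filter. The phrase list and competing-page counts below are hypothetical; in practice you would gather each count manually by searching the phrase “in quotes” in Google.

```python
# Hypothetical phrase list with estimated competing pages in "phrase match".
# Each count would come from searching the phrase "in quotes" in Google
# and noting the estimated number of results.
candidates = {
    "puppy training tips": 38_000,
    "dog training": 4_200_000,
    "crate training a rescue puppy": 12_500,
    "house training older dogs": 47_000,
}

MAX_COMPETING_PAGES = 50_000  # suggested ceiling for a brand-new website

# Keep only low-competition phrases, easiest first: these become the
# 3-5 post cluster to weave around the parent topic.
cluster = sorted(
    (phrase for phrase, pages in candidates.items() if pages <= MAX_COMPETING_PAGES),
    key=candidates.get,
)
```

Here the broad head term “dog training” is filtered out, leaving three attainable phrases to build the initial ring of content around.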

As a website scales from 10 pages to 50, 50 to 200, 200 to 500, and 500 to 1,000 pages, it starts developing the traits that distinguish it from other websites. These are the trappings of authority, which are the ultimate objective of SEO.

Once developed, these characteristics create a number of noticeable benefits. One benefit of a robust website with expertly written content is rapid indexation: the ability to appear directly in search engines as if the filters waved you through with VIP access to the top 10 in minutes, versus the days, weeks or (for poorly crafted websites) months it takes otherwise.

While the benefits of indexation may not seem significant initially, if you are targeting a more competitive keyword, it is this process of layering that produces a compound effect and eventually tips the scales of relevance in your favor. As your website spawns more instances of semantically related keywords, each one is a piece that unlocks the ranking puzzle, adding more “term weights” to the website’s relevance vector.
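The idea of “term weights” in a relevance vector can be illustrated with classic TF-IDF from information retrieval. This is a rough stand-in, not Google’s actual weighting, and the three “pages” below are made up for the example:

```python
import math

# Toy site: each "page" is a snippet of semantically related content.
pages = [
    "puppies for sale from a local animal shelter",
    "dog giveaways and free puppies at the pound",
    "how to adopt a free dog from a shelter",
]
docs = [p.split() for p in pages]

def tf_idf(term, doc, all_docs):
    """Term weight: frequency within the page, scaled by rarity across the site."""
    tf = doc.count(term) / len(doc)
    df = sum(1 for d in all_docs if term in d)
    return tf * math.log(len(all_docs) / df) if df else 0.0

# Relevance "vector" for the first page: one weight per distinct term.
vector = {t: tf_idf(t, docs[0], docs) for t in set(docs[0])}
```

Note how the distinctive term “sale” ends up weighted more heavily than “shelter,” which also appears on another page; each new page of related content reshapes these weights across the whole site.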

As simple as it sounds, rankings are produced just by adding structured, relevant content systematically over time, in tandem with dripping deep links (links from other websites to interior pages).

The more “natural” the links, i.e. from related sites in related niches with PageRank, or from pages that themselves rank for some variation of the keyword in Google, Yahoo or Bing, the more algorithmic trust (an equally important signal) they pass to the target pages, propping them up and creating buoyancy in the search engines, allowing pages to float to the top with ease.

When you consider the model that search engines are predicated on, the more relevant and useful the websites that rank at the top, the more relevance and regard the search engine earns for delivering a visitor to that website.

This algorithmic “it factor,” which in this case is based on a corpus of documents, often translates into a positive subjective experience for users in search of solutions, attracting links and creating more citation (links, mentions, shares), brand loyalty and conversions.

This is the reason behind the simple rule of thumb: the better the search results are, the more users will continue to use that search engine, and the more opportunity that search engine (Google, Yahoo, Bing) has to track, monitor and monetize those users.

This is why developing an authoritative presence in Google and other search engines is one of the most economical investments you can make. All that is required is proper selection of keywords, creative integration of those keywords into titles and articles, and internal links from those articles to relevant pages designed for conversion.

So, now that we have made the argument for why authority matters, what can you do to make sure your website falls on the right side of the algorithmic “it” factor?

Scale Your Website Using Natural Language Processing

Google pays attention to themed relationships: relationships of language, nuance and vectors that pair synonyms, polysemes, keywords and modifiers which, when combined on a page or occurring frequently across a website, create relevance.

For example, a page that talks about animals, puppies, pets, dogs, pounds or shelters, sprinkled with words like giveaway or for sale, can rank for:

  • Puppies for sale
  • Dog giveaways
  • Get a free dog
  • Animal shelter free puppies

and all manner of stemmed keyword variations. This is the parsing effect: content is collected and placed in a bin (like a tumbler), weighted for relevance by the search engine’s internal algorithms based on frequency, occurrence, proximity, prominence, date/query freshness and dozens of other calculations (implemented on the fly when someone queries the search engine), which then sort the tumbler and select the most relevant result.
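A toy version of that query-time sorting, using just two of the signals mentioned (term frequency and proximity), might look like the sketch below; a real engine combines dozens of such calculations, and the documents here are invented for illustration:

```python
def score(query, doc):
    """Toy query-time score: raw term frequency plus a proximity bonus
    when the query terms sit close together in the document."""
    terms = query.lower().split()
    tokens = doc.lower().split()
    freq = sum(tokens.count(t) for t in terms)
    hits = [i for i, tok in enumerate(tokens) if tok in terms]
    if hits and set(terms) <= set(tokens):
        span = max(hits) - min(hits) + 1  # width of the window holding all hits
        freq += len(terms) / span         # tighter window -> bigger bonus
    return freq

docs = [
    "free puppies at the animal shelter",
    "this shelter page mentions puppies once and mentions free much later",
]
ranked = sorted(docs, key=lambda d: score("free puppies", d), reverse=True)
```

Both documents contain both query terms, but the first keeps “free puppies” adjacent, so the proximity bonus lifts it to the top, mirroring how prominence and proximity separate otherwise similar pages.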

This happens billions of times each day as people ping search engines with queries, and it is no secret that ranking in the top 3 results for a query can produce a high click-through and conversion rate for those who understand this intersection of relevance and appeal.

This relevance (created from a body of documents) translates to raw ranking potential and can be sculpted to mold multiple keyword variations seamlessly through a structured content development strategy.

The end result: your website ranks for multiple variations of primary, secondary and tertiary stemmed key phrases based on the parent theme (topic), all from understanding the goal and filling in the blanks with keywords, modifiers and key phrases joined by the underpinnings of natural language processing.

The takeaway is clear: create websites Google will fall in love with (authority sites) by keeping your content unique, consistent and frequent, and use a cascading effect with shingles (groups of words or topics) to strengthen the natural-language foundation for your primary groups of keywords.
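Shingles, as used above, are contiguous groups of words. A minimal sketch of shingling two hypothetical pages and measuring how much phrasing they share:

```python
def shingles(text, k=3):
    """Contiguous k-word shingles (groups of words) from a page of text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

a = shingles("adopt a free puppy from your local animal shelter")
b = shingles("visit your local animal shelter to adopt a free puppy")

# Jaccard overlap of the two shingle sets: the fraction of phrasing shared.
overlap = len(a & b) / len(a | b)
```

This same technique underlies near-duplicate detection, which is one reason churned-out duplicate content is easy for an engine to spot; unique content keeps your shingle sets distinct while still reinforcing the parent theme.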

While there are no guarantees in search, you have to do everything within your power to protect your business against shifts in algorithms that could potentially close your doors.

Playing it safe with a content development strategy, with posts aging daily and gaining PageRank and rankings to reinforce your position, is a stable long-term strategy that continues to stand the test of time.

To summarize the 1,700+ words in this post: quality is your ally; embrace it or potentially face the consequences.


9 Comments

  1. DJ Morris
    Posted March 14, 2011 at 7:19 am | Permalink

    It’s amazing what properly placed keyword anchor text on high ranking websites has done to my sites ranking for those keywords without being spammy!

  2. Danny-Home Brewing Supplies
    Posted March 14, 2011 at 7:48 am | Permalink

    Glad you are feeling better.

  3. Chad Nicely
    Posted March 15, 2011 at 11:34 pm | Permalink

    The answer does seem to be long-tail keywords nowadays. You know the other thing I am noticing? Google not only loves authority sites, but even more so authority blogs. So you’re there already!
    Cheers – Chad

  4. James G
    Posted March 16, 2011 at 5:46 am | Permalink

    I fully agree with you, Google really loves authority sites and blogs, and your review is really very interesting.
    Thanks for sharing

  5. Jeffrey Smith
    Posted March 16, 2011 at 11:37 pm | Permalink

    @DJ:

    No doubt, there is nothing like a quality link. You need a few instead of a few hundred.

    @Danny:

    Thanks, feeling much better now.

    @Chad:

    You just caught our best kept secret, perpetual rankings without requiring backlinks (the site can rank for up to 2MM competing pages in phrase match from a title tag and no links), authority works!!!

    @James:

    Glad you enjoyed the read. All the best!

  6. Adrian
    Posted March 17, 2011 at 12:22 am | Permalink

    What would you say for a quality site that vanished in rankings? I have had zero traffic starting around March 10th…. hopefully just shifting around in the “dance”.

  7. Adrian Lee
    Posted March 17, 2011 at 5:49 am | Permalink

    Once you build a website that has more than 200 pages, it’s pretty stable in the search engines and human traffic. I just love building sites that are modeled on authority sites. It takes longer but will also last longer.

  8. Something Sublime
    Posted March 28, 2011 at 4:26 am | Permalink

    I agree with you, Google really loves authority sites and blogs, and your review is really very interesting…

  9. Cullen Power
    Posted August 14, 2012 at 8:07 pm | Permalink

    makes sense to me. just make a good site right! nothing else matters.

