in Search Engine Optimization by Jeffrey Smith

It is no secret to those involved in SEO that Google has recently rolled out, or is testing, enhanced features (for usability and design) as well as tweaks to the ranking algorithm.


Google Algorithm Changes and SERP Watching

The first wave, felt weeks ago, has now been dubbed “The Mayday Update”; you can also refer to the recent Whiteboard Friday by SEOmoz for their two cents on it. To summarize the Mayday Update: many websites that had top-ranking pages have found themselves on page 3, 4 and even 5, in many instances with no significant change identified that would explain such a violent shift in search engine result page position.

Another side effect is that long-tail and mid-tail keywords (phrases of three or more words) are passing through more stringent semantic and ranking filters; the array of broad-match keywords a page could potentially rank for has been tightened up or reduced to increase relevance. Many have reported seeing up to two-thirds of their long-tail and mid-tail traffic disappear during this transition.
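
If you want to put a rough number on that kind of drop, one minimal sketch is to bucket an exported keyword report by query length and compare a pre-update export with a post-update one. The CSV file names and the keyword/visits column names below are hypothetical placeholders for whatever your actual export uses.

```python
import csv
from collections import defaultdict

def bucket(keyword):
    """Classify a query by word count: 1-2 words = head, 3+ = mid/long tail."""
    return "head" if len(keyword.split()) <= 2 else "mid/long tail"

def tail_share(csv_path):
    """Sum visits per bucket from a keyword report exported as CSV."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        # Hypothetical column names; adjust to match your actual export.
        for row in csv.DictReader(f):
            totals[bucket(row["keyword"])] += int(row["visits"])
    return dict(totals)

# Compare an export from before the update with one from after to see how
# much of the mid/long-tail traffic actually disappeared.
print(tail_share("keywords_april.csv"))
print(tail_share("keywords_june.csv"))
```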

For example, if I had a page that ranked for multiple keyword variations of a semantic theme such as kittens, kitty and cats, and a corresponding modifier (like food) was present in tandem with the query, such as kitty food, cat food and kitten food, those queries are now more than likely going to return a different set of results (as opposed to the same page), since the apparent PaIR (phrase-based indexing and retrieval) algorithm is tuned closer to precise semantic sets.
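
To illustrate the idea (a toy model, not Google's actual algorithm), the sketch below contrasts a loose matcher that collapses kitten, kitty and cats into one concept, so a single page can soak up every variant, with a stricter matcher that keeps each variant in its own semantic set, so each query lands on a different page. The page names and synonym groupings are invented for the example.

```python
# Toy contrast between loose and strict semantic matching; page names and
# synonym groupings are hypothetical, and this is not Google's algorithm.
SYNONYMS = {"kitten": "cat", "kitty": "cat", "cats": "cat"}

PAGES = {
    "cat-food-guide": {"cat", "food"},
    "kitten-food-guide": {"kitten", "food"},
    "kitty-food-reviews": {"kitty", "food"},
}

def normalize(terms, strict):
    # Loose matching collapses kitten/kitty/cats into the single concept "cat";
    # strict matching keeps each variant as its own semantic set.
    return {t if strict else SYNONYMS.get(t, t) for t in terms}

def top_result(query, strict):
    q = normalize(query.lower().split(), strict)
    # Score each page by how many normalized query terms it covers.
    scored = [(len(q & normalize(terms, strict)), page) for page, terms in PAGES.items()]
    return max(scored)[1]

for q in ("kitty food", "cat food", "kitten food"):
    print(f"{q!r}: loose -> {top_result(q, False)}, strict -> {top_result(q, True)}")
```

Run loosely, all three variants resolve to the same page; run strictly, each variant pulls a different document, which is the behavior many sites reported after the update.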

If pages rank for fewer variations, then the broader, less descriptive keywords in a query could be devalued as a ranking signal on a signal-to-noise basis (resulting in less varied, more specific results).
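
One way to picture that signal-to-noise devaluation is classic inverse document frequency (IDF) weighting, where a term that appears in a huge share of documents contributes almost no relevance signal while a rarer, more descriptive term carries most of it. The index size and document counts below are made up for illustration; this is the textbook concept, not Google's formula.

```python
import math

TOTAL_DOCS = 1_000_000  # hypothetical index size

# Hypothetical document frequencies: broad terms appear almost everywhere.
DOC_FREQ = {"food": 400_000, "cat": 120_000, "kitten": 15_000, "hypoallergenic": 2_000}

def idf(term):
    """Inverse document frequency: common (broad) terms score close to zero."""
    return math.log(TOTAL_DOCS / DOC_FREQ.get(term, 1))

for term in ("food", "cat", "kitten", "hypoallergenic"):
    print(f"{term:16} idf = {idf(term):.2f}")
```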

In theory, even a slight change in the relevance weight a semantic phrase carries when combined with other ranking signals could send ripples through the web, as queries no longer pack their flashlight-like breadth of scope and are now more laser-like in finding a corresponding set of documents.

It is uncertain whether this directly correlates to (a) how data is being crawled, (b) a higher or more specific relevance threshold, meaning the ranking metrics have been altered, (c) a filter on the repository or on delivery from the bin (the place where search results are pulled from), or (d) some facet of all of the above.

What is certain is that for many, their main keywords are in a blender, with new competitors taking new positions as a result of the recent SERP (search engine result page) shakedown. Typically, when things of this magnitude are rolled out, tested or tweaked, it is not uncommon to see multiple data sets of the SERPs (like cached copies or ghosts in the machine) from the various data centers flipping back and forth.

When normalization occurs and the new algorithms settle, we may see many of the churned pages reclaim their previous positions. In the meantime, however, the best thing to do is stick to SEO basics: keep producing relevant content, use intelligent internal linking to preferred landing pages with preferred anchor text, and acquire quality inbound links to offset any depreciation your website could be experiencing from the virtual rug being pulled out from under the web as we knew it (as indexed by Google).

About Jeffrey Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.

7 thoughts on “Google Algorithm Changes and SERP Watching”
  1. SlimJim says:

    I have a short-tail keyword that I’ve ranked #1 for in Google SERPs for over a year. I’m still #1, but I’ve noticed that the total results number keeps bouncing back and forth between 24 million and 14 million results.

    Each time the total results number bounces, I’m still #1 in the SERPs for that short-tail keyword.

    It’s just kinda odd how the SERPs’ total results fluctuate back and forth each day.

  2. Hey SlimJim:

    I think with the vastness of content entering the SERPs daily, one minuscule adjustment to any fraction of the ranking algorithm sends the strangest ripples and anomalies through the results.

    At this point, I don’t even think they know the extent of the normalization or variances due to the layers. Guess we just have to stay tuned and hope things bounce back or they finally tap out and hit the reset button (to put things back in perspective).

    If this keeps up, it’s the best advertising Bing could ever get, with their steady, relevant algorithm that keeps gaining market share daily.

    Also, keep in mind that later this year Bing results will power Yahoo as well, so all the eggs will not be in Google’s basket if user satisfaction wavers.

    Not so much B2C, but the real dollars are in B2B for people who spend $$$ on PPC. If traffic volume goes down (from a butchered organic algorithm playing flip-flop daily), then people may spend more on Yahoo and Bing as well, since the ads are cheaper and the audience is growing daily.

    I hope they get it together at the Plex, not just for SEO’s sake, but I have seen some real garbage in the SERPs gaining more weight than it should for a number of searches regardless of industry, keyword or otherwise.

  3. Like SlimJim said, I have experienced the same thing, but with a really competitive keyword that we’re in the top 3 for (bouncing between 304 million and 254 million results). Luckily, so far we have held stable in our top 3 spot, and I hope that will continue.

    I really have to give Google credit, though… their job is seriously one of the most difficult in the world. They are blazing the trail, trying to come up with new ways to combat link spam, unethical SEO and manipulation, and so far they seem to be doing a pretty good job, actually.

    I honestly think they still need to give more credit to unique content, though. I’ve seen so many great sites with a handful of links but incredible content that don’t get any search engine love. Hopefully Google will continue to update their algorithm this year and give those sites more credit.

    Best of luck all!

  4. No doubt, it’s hard to separate the good from the bad using an algorithm alone. You can only program intent to a certain extent, and you are bound to have exceptions. Your site seems to be grandfathered in, so it’s not so much of a concern for you; just focus on trusted sources for citation.

    For example: skim Google Trends and then write content that is timely (so you can get picked up on a news feed, etc.) to acquire some trusted link flow. Even putting out a press release with a unique angle can help get some traffic and trusted links…

    Aside from that, keep dripping content to reinforce your mid-tail and primary keywords and you have nothing to worry about.

    Enjoy!

Comments are closed.