Many websites are reeling from what feels like a Googlebot sweep, with their main keywords dipping south in the search engine results.
While the verdict is not in on what could be causing such violent shifts in the SERPs, there is speculation that heightened trust filters and site speed (a.k.a. page load time) may have something to do with it. While some vacillation is normal within the framework of SEO, pages churning by the hour for specific keywords have become commonplace across multiple industries.
What is this New Metric?
Only those on the inside of Google can say for sure; however, Google recently announced that site speed is a new metric in the Google Caffeine algorithm and that it would impact only 1% of the current SERPs (search engine result pages) in how it ranks websites in its index.
Under normal circumstances, 1% would represent an infinitesimal variable hardly worth consideration; however, 1% of Google SERPs still represents millions of keyword queries, with business models hanging in the balance daily.
SERP Sensitivity
We know that Google is constantly evolving; in fact, based on the patterns we have been tracking, we think we may know which 1% of websites are affected (those ranking for the most lucrative, traffic-driving keywords). In our estimation, it appears that a new trust metric is cross-referencing sites and eliminating those that lack the appropriate credentials for this newly formed (and unknown) threshold.
Those that fail to make the grade are relegated to the tumbler, where they bounce around daily, if not hourly, trying to regain their bearings in the search engine result pages.
We have also noted an increased sensitivity in Google’s algorithm lately to:
- Duplicate content within a website
- Page / ranking penalization for excessive inbound link velocity
- Site-wide links losing relevance in the ranking algorithm, and
- Drops in indexation in the new caffeinated index
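To make the link velocity point concrete, here is a minimal sketch of how a filter might flag an unnatural spike in new inbound links. The daily counts and the spike threshold are hypothetical; in practice these numbers would come from a backlink-monitoring tool.

```python
from datetime import date

# Hypothetical record of newly discovered backlinks per day,
# as might be exported from a backlink-monitoring tool.
new_links_per_day = {
    date(2010, 4, 1): 12,
    date(2010, 4, 2): 15,
    date(2010, 4, 3): 240,  # sudden, unnatural spike
    date(2010, 4, 4): 230,
}

def link_velocity_spike(history, factor=5.0):
    """Flag days where new-link volume exceeds `factor` times
    the average of all preceding days (a crude spike detector)."""
    days = sorted(history)
    flagged = []
    for i, day in enumerate(days[1:], start=1):
        baseline = sum(history[d] for d in days[:i]) / i
        if history[day] > factor * baseline:
            flagged.append(day)
    return flagged

print(link_velocity_spike(new_links_per_day))  # [datetime.date(2010, 4, 3)]
```

A real filter would obviously be far more nuanced, but the point stands: a sudden multiple-of-baseline jump is trivial to detect algorithmically.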
Going back to the 1% site speed conundrum: which keywords represent the purest form of market dominance for any website? The keywords most likely to be pursued through questionable practices: the root phrases.
This is why it makes perfect sense to monitor those root phrases first (from the standpoint of an algorithmic filter) with the highest degree of scrutiny to determine if the pages that occupy those positions are valid.
The keywords with the highest search volume are typically the one-word / root vertical keywords in a semantic keyword cluster. While you may not see any significant change for your second-tier and third-tier phrases, chances are your main keyword has taken a hit if you are caught in this filter.
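A back-of-the-napkin way to spot this pattern in your own rank tracking is to compare the root phrase against its tier-two and tier-three variations and flag outsized drops. The keywords, rank positions, and threshold below are made up for illustration:

```python
# Hypothetical daily rank snapshots for a semantic keyword cluster:
# the one-word root phrase vs. second- and third-tier variations.
ranks_before = {"widgets": 3, "blue widgets": 5, "buy blue widgets online": 7}
ranks_after  = {"widgets": 58, "blue widgets": 6, "buy blue widgets online": 7}

def filtered_keywords(before, after, threshold=20):
    """Return keywords whose rank dropped by more than `threshold`
    positions -- the signature of a filter hitting the root phrase
    while the long tail holds steady."""
    return [kw for kw in before
            if after.get(kw, 100) - before[kw] > threshold]

print(filtered_keywords(ranks_before, ranks_after))  # ['widgets']
```

If only the root phrase trips the threshold while the tail phrases hold, that matches the filter behavior described above.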
While it is too early to tell without conclusive testing under controlled circumstances, it does not take a genius to figure out that something under the hood is raising a red flag for many websites that may have pushed the envelope too far.
If you have experienced anything like this, feel free to leave a comment below about your experience with the new hypersensitive ranking algorithms. Are these dynamic results a symptom triggered by a backlash against load time, or is Google’s index simply becoming more selective about the type and volume of content it allows for a given website?
Share your thoughts if you have noticed any ranking anomalies as of late.
I just checked my GWT (Google Webmaster Tools) and it says I have zero pages indexed on most of my sites.
When I do my own search (site:site_name.com), all my pages are still showing in the SERPs.
Something is going on… ;)
Hola Amigo.
It’s fascinating to watch the churn. I mean, we know it used to churn, but this is surreal.
Haven’t managed to isolate or conclude much as yet.
You?
kevsta
@SlimJim:
I have seen the same instance of pages falling out of the index on multiple sites…
We lost 3800 pages or so, but only one keyword is affected (the long tail is still strong).
If you ask me, I think someone broke something and got fired and what we see is the mess a bad test left behind.
I am all for under the hood improvements, but Damn, this extended weekend index is killing me.
Will the real Google search results please stand up?
@Kev:
Good to see you, long time amigo…
The deep crawl is typical this time of year, but this is ridiculous.
I have also seen delayed indexation of posts (which are usually live and ranking within 7 minutes), as well as delayed, bipolar (now you see me, now you don’t) ghost rankings appearing and disappearing more than I care to mention.
I would have to say that this is a first for this type of behavior; we may have to put our heads together on this one and run some tests to get a grip on it.
Hope all is well in Ibiza, and thanks for the comment in the forum the other day (about the technical prowess and all).
Hi Jeffrey,
Great post, and I have definitely been noticing some things happening with clients’ websites. However, I am not sure I agree with the link velocity theory… how can we control who links to us? I recently had a client get some major PR and a huge volume of links in a very short period of time; if anything, their rankings have increased.
Excellent blog, by the way; I read it every day.
yep, seen / seeing all those things :)
It’s like there are two (sometimes more) sets of results for all queries, and they both oscillate by +/- 5 positions every time you search.
I’m starting to think this is likely to be the norm from here on out, like G decided to follow through on “rankings are dead”, from an SEO perspective anyway.
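One crude way to confirm the two-sets-of-results effect: log the top results on repeated checks of the same query and count the distinct orderings. The snapshots below are made up; a stable index would yield one ordering, a flip-flopping index two or more.

```python
# Hypothetical top-5 results captured on six consecutive checks
# of the same query -- two distinct result sets alternating.
snapshots = [
    ("a.com", "b.com", "c.com", "d.com", "e.com"),
    ("c.com", "a.com", "f.com", "b.com", "g.com"),
    ("a.com", "b.com", "c.com", "d.com", "e.com"),
    ("c.com", "a.com", "f.com", "b.com", "g.com"),
    ("a.com", "b.com", "c.com", "d.com", "e.com"),
    ("c.com", "a.com", "f.com", "b.com", "g.com"),
]

def distinct_result_sets(snaps):
    """Count distinct result orderings across repeated checks.
    A stable index returns 1; an A/B-flipping index returns 2+."""
    return len(set(snaps))

print(distinct_result_sets(snapshots))  # 2
```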
And if you fancy hooking up somewhere for some in-depth note-swapping at some point, I’d be up for it.
And yes, thanks, all good here. Any chance you’re visiting us this year after all?
kev
Thanks for the info Jeffrey. In the past 24 hours, I and many webmasters I know have been seeing XML sitemaps in Google Webmaster Tools go from thousands of indexed URLs to 0. I think Google either changed something yesterday or they are doing some kind of maintenance on their index. Or it could be a bug, like you said. Traffic seems the same, which is good… for now. Any thoughts on why so many people’s sitemaps are going to 0 at the same time in 1 day?
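While the GWT counts look broken, one sanity check is to count the URLs your sitemap actually declares, which is the number the "indexed URLs" figure should roughly be compared against. A minimal sketch using only the standard library; the sitemap XML here is a stand-in for one fetched from your own site:

```python
import xml.etree.ElementTree as ET

# A minimal XML sitemap, standing in for one downloaded from your
# own site (e.g. http://example.com/sitemap.xml).
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
  <url><loc>http://example.com/contact</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_url_count(xml_text):
    """Count the <url> entries a sitemap declares."""
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url", NS))

print(sitemap_url_count(SITEMAP))  # 3
```

If the sitemap parses cleanly and declares the URLs you expect, a sudden 0 in GWT points at Google's side, not yours.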
Found the issue! Google messed up and they admitted it!
http://webmaster-forum-announcements.blogspot.com/2010/04/known-issue-sitemaps-indexed-url-count.html
Aaron.
Good find, & thanks for posting the link.
I knew my sitemaps were all fine; I checked each one in my browser, and if I can view a sitemap, then Google or anyone else can see it too.
I’ve read that a bunch of sites/webmasters keep re-submitting their Google Sitemaps.
Trigger Happy, LOL! :)
All my SERPs are still looking good…
@Guy:
Thanks for visiting…
I should have been more specific about my generalization on the link velocity. I should have stated “for keywords with the same anchor”, as that’s what I was tracking.
@Aaron:
Knew it… someone got fired and threw a bug in there, or is on their way out now as a result of letting that one slip :D
@Kev:
I think we have to dub this “The A/B SERP Split Test” in addition to “The Weekend Index”…
Good one, but yes, I have seen the ridiculous contradictions of Google lacking the ability to “pick one” and stick with it.
:D
Well, all my GWT G-Sitemaps are back on track, like nothing ever happened. ;)
Actually, one of my sitemaps (WordPress) is getting better crawls than before G started this big mess…
Hello,
We are experiencing poor rankings for the majority of our keywords after enjoying top rankings for many years. Not sure what has gone wrong. Can this be a penalty for aggressive link building on specific keywords? If so, why have rankings for other keywords gone down as well?
After the arrival of Google Caffeine, my Google results changed. Now they include a Maps result alongside the organic pages, so my result has gone down.
I’m glad that I came across your site. I have read a number of your articles and they were all an excellent read and very informative. Thanks :)