Many websites are reeling from what feels like a Googlebot sweep, watching their main keywords dip sharply in the search engine results.
While the results are not in on what is causing such violent shifts in the SERPs, there is speculation that heightened trust filters and site speed (a.k.a. page load time) may have something to do with it. Some fluctuation is normal within the framework of SEO, but pages churning by the hour for specific keywords have become commonplace across multiple industries.
What is this New Metric?
Only those inside Google can say for sure. However, Google recently announced that site speed is a new metric in the Google Caffeine algorithm and that it would impact only about 1% of the current SERPs (search engine result pages) in how websites rank in the index.
Under normal circumstances 1% would be an infinitesimal variable hardly worth consideration; however, 1% of Google SERPs still represents millions of keyword queries, with business models hanging in the balance daily.
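If you want a rough, first-order read on your own pages before digging into full waterfall analysis, a simple timing script can flag obvious slowness. Below is a minimal sketch in Python; the URLs are placeholders, and note that this measures server response and HTML transfer only, not the full browser-rendered load time that includes scripts, styles, and images.

```python
import time
import urllib.request

# Hypothetical list of pages to spot-check; replace with your own URLs.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products",
]

def fetch_time(url, timeout=10):
    """Return seconds taken to fetch the full HTML response for url."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # read the whole body so we time the full transfer
    return time.perf_counter() - start

for url in PAGES:
    print(f"{url}: {fetch_time(url):.2f}s")
```

Browser developer tools or a full-page testing service will give a more complete picture; this is only a quick way to catch pages that are obviously dragging.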
We know Google is constantly evolving. In fact, based on the patterns we have been tracking, we think we may know which 1% are affected: the most lucrative keywords that drive traffic to any website. In our estimation, a new trust metric appears to be cross-referencing sites and eliminating those that do not have the appropriate credentials to clear this newly formed (and unknown) threshold.
Those who fail to make the grade are relegated to the tumbler, where their site bounces around daily, if not hourly, trying to regain its bearings in the search engine result pages.
We have also noted an increased sensitivity in Google’s algorithm lately to:
- Duplicate content within a website (see the sketch after this list)
- Page / ranking penalization for excessive inbound link velocity
- Site-wide links losing relevance in the ranking algorithm
- Drops in indexation in the new caffeinated index
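On the duplicate content point, a quick way to spot exact duplicates on your own site is to hash the normalized body text of each page and look for collisions. This is a minimal sketch, assuming you already have a list of your own URLs (pulled from a sitemap, for example); near-duplicates would need a technique like shingling, which this does not attempt.

```python
import hashlib
import re
import urllib.request
from collections import defaultdict

# Hypothetical URL list; in practice, pull this from your sitemap.
URLS = [
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]

def normalized_text(url):
    """Fetch a page, strip tags and whitespace, and return lowercase text."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    text = re.sub(r"<[^>]+>", " ", html)   # crude tag removal
    return re.sub(r"\s+", " ", text).strip().lower()

groups = defaultdict(list)
for url in URLS:
    digest = hashlib.sha1(normalized_text(url).encode("utf-8")).hexdigest()
    groups[digest].append(url)

for digest, urls in groups.items():
    if len(urls) > 1:
        print("Possible duplicates:", urls)
```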
Going back to the 1% site speed conundrum: which keywords represent the purest form of market dominance for any website? The ones most likely to be chased through questionable practices, the root phrases.
This is why it makes perfect sense to monitor those root phrases first (from the standpoint of an algorithmic filter) with the highest degree of scrutiny to determine if the pages that occupy those positions are valid.
The keywords with the highest search volume are typically the one-word / root vertical keywords in a semantic keyword cluster. While you may not see any significant change for your second-tier and third-tier phrases, chances are your main keyword has taken a hit if you are caught in this filter.
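One practical way to keep watch is to log your domain's position for each root phrase over time, using whatever ranking data you already export (scraping Google directly is against its terms, so the ordered result list here is assumed to come from a rank-tracking tool). A minimal sketch:

```python
from urllib.parse import urlparse

def domain_position(results, domain):
    """Return the 1-based position of domain in an ordered list of result
    URLs, or None if it does not appear."""
    for i, url in enumerate(results, start=1):
        if urlparse(url).netloc.endswith(domain):
            return i
    return None

# Hypothetical example: results exported from a rank-tracking tool
# for the root phrase "widgets".
serp = [
    "https://competitor-one.com/widgets",
    "https://www.example.com/widgets",
    "https://competitor-two.com/buy-widgets",
]
print(domain_position(serp, "example.com"))  # -> 2
```

Logging this daily for your root phrases makes it obvious whether the volatility is hitting your head terms, your long-tail terms, or both.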
While it is too early to tell without conclusive testing under controlled circumstances, it does not take a genius to figure out that something under the hood is raising a red flag for many websites that may have pushed the envelope too far.
If you have experienced anything like this, feel free to leave a comment below about your experience with the new hypersensitive ranking algorithm. Are these dynamic results a symptom of a backlash against slow load times, or is Google's index simply becoming more selective about the type and volume of content it allows for a given website?
Share your thoughts if you have noticed any ranking anomalies as of late.