All humor aside, to say that the Google SERPs (search engine result pages) pale in comparison to their former stability, at least for those who were comfortable at the top of the top 10 for their most prized keywords, is an understatement.
If you are involved in SEO or have watched the search engine result pages religiously for the past 13 years, then you already know that either (1) something is brewing that will keep many on the edge of their seats or (2) something broke and someone (who was in charge of tweaking the algo) is now looking for a new job.
If I had to speculate, I would suggest that the “term weights” in the current index are assigning a sporadic array of relevance thresholds (nearly by the hour) to pages from the 2nd and 3rd pages (almost as if to test them and see what happens when the top-ranking results are removed) and then, for some odd reason, dismissing them as quickly as they appeared. Other changes were briefly outlined in this post.
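For readers unfamiliar with the jargon, "term weights" refers to the scores an index assigns to a word within a page. Google's actual formula is unknown; the following is only a toy sketch of the classic TF-IDF weighting from information retrieval, with a made-up three-page corpus, to illustrate how a small shift in how terms are weighted can reorder (or drop) results entirely:

```python
import math
from collections import Counter

# Hypothetical corpus: three "pages", each a bag of words.
docs = {
    "page_a": "seo services and seo consulting for search rankings".split(),
    "page_b": "dhtml menus and web design widgets".split(),
    "page_c": "search engine optimization seo tips".split(),
}

def tf_idf(term, doc_id):
    """Classic TF-IDF term weight: frequency of the term in the
    document, discounted by how many documents contain it."""
    tf = Counter(docs[doc_id])[term] / len(docs[doc_id])
    df = sum(1 for words in docs.values() if term in words)
    if df == 0:
        return 0.0
    idf = math.log(len(docs) / df)
    return tf * idf

# Rank the pages for the query term "seo".
ranked = sorted(docs, key=lambda d: tf_idf("seo", d), reverse=True)
print(ranked)  # page_a mentions "seo" most densely, so it leads
```

Nudge the thresholds or swap in a different weighting scheme and the ordering changes, which is roughly what the hour-by-hour churn in the SERPs looks like from the outside.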
Although only a handful of Googlers really know what is going on, the rest of us just have to wait and see what settles as the search results calm down and return to some form of consistency.
If you have seen vast fluctuations such as these before (the Google Dance), you could easily attribute this pattern of SERP anomalies to either (1) a deep crawl to reindex the existing sites (much like a 20-point inspection / shakedown) or (2) penalties in the link graph creating ripples to offset a new variable or a new algorithm.
For those who witnessed the major overhaul of the search algorithm with the Florida update, this latest snapshot of the SERPs hits too close to home for comfort. Websites that were steadily ranking in the top 10 results for eons have now taken a hiatus to parts unknown, and new entrants have been given an opportunity to shine and upstage their vertically challenged opponents.
There is nothing like taking a bow gracefully, but to see such vast disparities in data centers and fluctuations of this magnitude will have the chat rooms and forums full of speculation in an attempt to make sense of the new shift in the relevance model.
In case you have not been affected: for the past few weeks, Google has been displaying a parody of relevance for a number of root phrases and keywords. For many less significant keywords, nothing has changed; if anything, a boost in relevance or traffic for secondary keywords should be expected. For most, however, the trophy phrases were a definition of status and niche domination.
Now, with those search results removed or “pending further manual or algorithmic review,” you have two options: (1) pull your hair out in an attempt to second-guess some of the brightest information retrieval experts on the planet, or (2) look for signs, patterns and methods to stabilize primary root phrases without panicking in despair.
For example, Opencube.com is now showing up in the #6 position for SEO (at least at the time of this post). Keep in mind that this is a phrase that occurs only 8 times in their entire website. Perhaps I am missing something here. Nothing against their company (hey, we even use their DHTML menus), but I beg to differ about how relevant a result their website is, since they only have the keyword on essentially one page, sparsely mentioned throughout the rest of their content.
Is that the most relevant result for someone searching for information on the topic of SEO or seeking SEO services? I would hardly think so. Many other long-term search results have also been displaced; for how long, we don’t really know.
Sure, there is nothing like a good challenge for SEOs, but SERP (search engine result page) fluctuations of this magnitude leave many involved in search engine optimization ready to throw in the towel and fold up shop.
While the elite SEOs either wait things out (before jumping in with both feet) or stand armed with tools and tactics in an attempt to reverse engineer the model and apply countermeasures, the update/anomaly only appears to be affecting root-level phrases at present.
The only thing you can do is (a) look for stability (b) try to assess which factors and metrics still remain relevant (c) remove obstructions or track your recent changes and (d) don’t panic because it could all roll back in the morning as if nothing occurred.
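Option (a), looking for stability, can be done mechanically rather than by nervously refreshing the SERPs. This is only a sketch, assuming you already log daily rank observations for your keywords somewhere (the rank checking itself is out of scope, and the keywords and numbers below are hypothetical): flag a keyword as settled once its recent rankings stop swinging.

```python
from statistics import pstdev

# Hypothetical daily rank observations per keyword (1 = top result).
# In practice these would come from your own rank-tracking logs.
rank_history = {
    "seo services": [4, 18, 2, 31, 6, 27, 9],   # still churning
    "dhtml menus":  [3, 3, 4, 3, 3, 4, 3],      # settled
}

def is_stable(ranks, window=5, max_spread=2.0):
    """Consider a keyword stable when the population standard
    deviation of its most recent rankings stays within a small
    spread; window and max_spread are arbitrary defaults."""
    recent = ranks[-window:]
    return pstdev(recent) <= max_spread

for kw, ranks in rank_history.items():
    print(kw, "stable" if is_stable(ranks) else "fluctuating")
```

Until a keyword reads as stable for several consecutive days, treat any single day's position as noise, which dovetails with option (d): don't panic over a snapshot.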
At present, I opt for (d) until more information suggests that this is something more than a relevance test just to see what happens when variables x, y and z are provided with different defaults for ascertaining authority, relevance or which position is most suitable for each page’s respective relevance score.
If you ask me, this is just a deep crawl where it appears something broke and they are going back to a previous snapshot to rebuild the index. This is not uncommon and in most instances is a good thing, although the rapid fluctuations are enough to make anyone question every action preceding the loss in rankings.
If you must panic, here are some resources you could use to see that (a) it’s not the first time and (b) Google knows what they are doing.
Despite the fact that from one perspective it seems odd to purge results on such a grand scale, relevance exists at a price, and that price means that from time to time things as we know them could change.
In closing, you may wish to read Search Engine Amnesia, and the Process of Rediscovery (a previous rant on the topic of the Everflux) and Cache Relapse: Did Google Rebuild the Index Right Under our Nose? (another example of a deep crawl wreaking havoc on the SERPs).
Sit tight, if all goes well, someone will come up with a cute name for the update and more information will be provided on the topic, or it will just resolve itself…
In most instances, things revert to normal within 72 hours as the link graph and algorithmic term weights work out the kinks in real time. During the interim, try not to shift your site around too much or make abrupt changes, as you could make it unrecognizable when the crawlers make their way back through to validate their tests.
Trust plays a large role in SERP stability, so the last thing you want to do is undergo a facelift or an invasive site-architecture makeover if it was just an algorithmic hiccup, which is often the case with a new subroutine’s beta test.