If you closely monitor your SEO rankings, from time to time a ripple enters the SERPs (search engine result pages) and shakes things up a bit in the index. You need to minimize SERP volatility and assess whether the shift was self-induced, algorithmic, or simply your competitors challenging you for your most coveted keyword space.
Don’t panic if your search engine rankings fall. Fluctuations in rankings are often dubbed the Google dance, and there are a number of reasons why this type of phenomenon can occur.
The main consideration is that search engine algorithms are constantly being revised to stay ahead of things like SEO automation, spam abuse and other variables known for obstructing the search engine’s ability to parse, assess and grade pages with the highest degree of accuracy.
When this occurs, many of the variables your rankings are predicated upon can get caught within this whirlwind of change, and fluctuations may follow for a variety of reasons.
Here are some of the most common occurrences and what you can do to insulate yourself.
The Deep Crawl: Depending on the growth rate of indexed content or the competition for specific keywords, a surge of information, posts, pages and videos may inundate the SERPs. As a result, Google will parse this information, often cross-referencing it with data from the past.
In house we dubbed this the search engine shakedown. Think of the outward manifestations of this type of deep crawl (where trust and other variables also play a big role in the vacillations) as a litmus test pitting the new SERPs against the seasoned SERPs that have been in place for months, if not years.
This deep crawl can occur once per month or once every two to three months. When it happens, you may see half of your pages fall out of the index or a larger percentage of content take a dip in rankings. Consider those pages on the fence: they need additional metrics or references from other sources (either from within your website or from external sites) to regain the regard they had from search engines.
Typically rankings and results will reappear where they were, but the main thing is to consider what changes were made to the site. If anything drastic was done, document it and/or return to a previous iteration if the situation does not remedy itself within 7-10 days.
Every so often, pages higher up in the food chain lose relevance or forget that the vertical they belong to is competitive. The more they rest, the more likely their results are to be overthrown by a hungry competitor who wants to occupy their spot.
Getting too comfortable at the top is a mistake that anyone enjoying a top 10 or top 5 search engine position for a competitive keyword should avoid. Remember, post frequency, peer review and internal on-page relevance all play a role in keeping results buoyant; you cannot simply assume that because you rank there today, nothing can happen and the rules of relevance cannot change.
The Weekend Index: Although this is confined to our own research, we have discovered that search engines sometimes serve different SERPs (search engine results pages) from Friday to mid-Monday. We speculate they do this to compile data and shake out flimsy results by running a battery of filters to parse questionable content.
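One practical way to track this for yourself is to snapshot your rankings on Friday and again on Monday and diff the two. The sketch below is a minimal, hypothetical example; the keywords and rank numbers are made up, and in practice you would populate the snapshot dictionaries from your own rank-tracking tool’s export.

```python
# Minimal sketch: diff two rank snapshots (e.g. Friday vs. Monday) to spot
# weekend volatility. Snapshot data here is hypothetical placeholder data.

def rank_deltas(friday, monday):
    """Return {keyword: (old_rank, new_rank, change)} for every keyword
    in either snapshot; None means the keyword was not ranking."""
    deltas = {}
    for kw in set(friday) | set(monday):
        old, new = friday.get(kw), monday.get(kw)
        # Positive change = moved up; None = entered or fell out entirely.
        change = (old - new) if (old is not None and new is not None) else None
        deltas[kw] = (old, new, change)
    return deltas

friday = {"blue widgets": 4, "widget reviews": 9, "cheap widgets": 15}
monday = {"blue widgets": 7, "widget reviews": 9}  # "cheap widgets" fell out

for kw, (old, new, change) in sorted(rank_deltas(friday, monday).items()):
    print(f"{kw}: {old} -> {new}")
```

Keeping a few weeks of these diffs makes it much easier to tell a recurring Friday-to-Monday wobble apart from a genuine penalty.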
Consider this a type of amnesia: the data center that crawled your page and the other data centers, which may or may not have that data populated on their servers, can create a “now you see me, now you don’t” topsy-turvy effect on individual pages and their ability to aid other pages’ rankings in the index.
If page C falls out for some reason, and page C was responsible for ranking page B and page A, then page B and page A may take a temporary dip until the daisy chain is repaired and the continuity and link flow are restored through indexation. The main thing is NOT to panic; look for what could have reduced the weight of page C (try getting more deep links to the page to remedy link flow).
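The daisy-chain effect can be illustrated with a toy PageRank-style calculation. This is a simplified sketch, not Google’s actual model: the three-page graph and the 0.85 damping factor are illustrative assumptions. It shows that when page C stops passing weight, both B and A downstream lose score.

```python
# Toy PageRank sketch of the daisy chain: C links to B, B links to A.
# If C stops passing link weight, B and A both lose rank. The graph and
# damping factor are illustrative; this is not Google's real algorithm.

def pagerank(links, damping=0.85, iterations=50):
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
        rank = new  # dangling pages simply leak rank in this toy version
    return rank

healthy = pagerank({"C": ["B"], "B": ["A"], "A": []})
broken = pagerank({"C": [], "B": ["A"], "A": []})  # C no longer passes weight

print(healthy["A"], broken["A"])
```

Comparing the two runs, page A’s score drops even though nothing on page A changed; only the upstream link flow did, which is exactly the pattern described above.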
Deindexation: I have recently compiled a few posts on the topic of over-optimization and deindexation and ways to stave off this type of penalty or defer it entirely. This is a matter of duplicate content: duplicated or redundant content segments occurring across multiple areas within a website.
As the filters get more intelligent, templates need diversity to produce a unique and distinguishable footprint, with their shingles (groups of words) corresponding directly to their titles, URL naming conventions, H1 tags and internal links from other pages.
Any variance that lacks relevance, is akin to keyword stuffing, or relies on too many common elements across multiple pages is a demotion just waiting to happen (especially if someone else holds the least imperfect score for that keyword).
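Shingle overlap is easy to check for yourself. Here is a minimal sketch using 3-word shingles and Jaccard similarity to flag two pages whose text is nearly identical, as happens with templated pages where only a keyword is swapped. The shingle size, the sample text, and any similarity threshold you choose are illustrative assumptions.

```python
# Minimal sketch: w-shingling plus Jaccard similarity to flag templated
# near-duplicate pages. The 3-word shingle size and sample text are
# illustrative, not a known search engine parameter.

def shingles(text, w=3):
    """Return the set of w-word shingles (groups of words) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Overlap of two pages' shingle sets: 0.0 = disjoint, 1.0 = identical."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_one = "affordable blue widgets shipped fast from our warehouse"
page_two = "affordable red widgets shipped fast from our warehouse"

print(round(jaccard(page_one, page_two), 2))  # → 0.5
```

A single swapped word still leaves half the shingles shared in this example; across a large template with only the keyword changed, the overlap would be far higher, which is precisely the footprint the duplicate-content filters look for.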
Unique content is the cornerstone of search engine positioning. The less you have, or the less frequently it is updated, the less likely you are to defend against a savvier or more determined competitor who is producing quality content, earning peer review or acquiring quality inbound links.
Quality Raters: These are people whose primary job is to assess the quality of a web page and/or website and how it correlates to the SERPs it currently thrives in. From time to time these people take pen and pad and make assessments, which can be added to the repository used to filter results (like a slight penalty) for individual pages if the Google guidelines are not being adhered to.
You never know which of the variations above has occurred (changes to a site, server problems, algorithm changes that devalue your pages, a quality rater or competition). What you can try to find out is whether the penalty was associated with or confined to one keyword, a group of related keywords, or has no particular rhyme or reason.
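One way to run that diagnosis is to group your rank changes by keyword theme and compare the averages: a drop concentrated in one cluster points to a targeted issue, while a uniform drop suggests something sitewide or algorithmic. The keyword-to-theme mapping and the drop figures below are hypothetical placeholders.

```python
# Sketch: average rank change per keyword theme, to see whether a drop is
# confined to one cluster or spread sitewide. All data here is hypothetical.
from collections import defaultdict

def drops_by_theme(changes, themes):
    """changes: {keyword: rank_change} (negative = dropped);
    themes: {keyword: theme}. Returns average change per theme."""
    grouped = defaultdict(list)
    for kw, change in changes.items():
        grouped[themes.get(kw, "other")].append(change)
    return {theme: sum(vals) / len(vals) for theme, vals in grouped.items()}

changes = {"blue widgets": -12, "widget reviews": -9, "buy gadgets": -1}
themes = {"blue widgets": "widgets", "widget reviews": "widgets",
          "buy gadgets": "gadgets"}

print(drops_by_theme(changes, themes))
# "widgets" averaged a much larger drop than "gadgets",
# so the penalty looks confined to that keyword group.
```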
Introduction of New Ranking or Rating Metrics: Things like blended search, personalization and many other less-publicized changes occur on a regular basis. In 2009 Google made well over 400 algorithm updates pertaining to ranking and re-ranking the SERPs; who is to say that pace of revision will slow down in 2010?
We have only scratched the surface of how or why incidents like a loss in search engine position can occur. Usually, however, it can be tracked to a correlating event at the admin level within a website, as opposed to a mere shift in search engine ranking factors. Otherwise, you have to let the results play out a bit until you can find a trail of data to support your theory and solidify a method for reconstructing the relevance a page or series of pages once had.
Was it excessive footer links, too many anchors overlapping the same keywords, the fact that you changed your title and meta tags, a bad link, or a competitor performing negative SEO on your site? You have to let things settle before jumping to conclusions.
More often than not, results return and it was just the engineers testing a new formula, but you still need to hold on to your hat and resist jumping to conclusions or changing random elements in an attempt to unwind the penalty or loss in the SERPs. Stay cool, put your thinking cap on and find the questionable construct, or take the opportunity to create new content and provide vibrant internal links to slouching segments of a site (if you want those pages to stay indexed, healthy and buoyant, with the ability to pass ranking credits along to other pages).