
If you closely monitor your SEO rankings, from time to time a ripple enters the SERPs (search engine result pages) and shakes things up a bit in the index. You need to minimize SERP volatility and assess whether the disturbance was self-induced, algorithmic, or simply your competitors challenging you for your most coveted keyword space.

Rankings Slip? Assess Metrics to Minimize SERP Volatility

Don’t panic if your search engine rankings fall. Fluctuations in rankings are often dubbed the Google dance, and there are a number of reasons why this type of phenomenon can occur.

The main consideration is that search engine algorithms are constantly undergoing revisions to stay ahead of things like SEO automation, spam abuse and other variables known for obstructing a search engine’s ability to parse, assess and grade pages with the highest degree of accuracy.

When this occurs, many of the variables your rankings are predicated upon can get caught in that whirlwind of change, and fluctuations may follow for a variety of reasons.

Here are some of the most common occurrences and what you can do to insulate yourself.

The Deep Crawl: Depending on the growth rate of indexed content or the competition for specific keywords, a surge of information, posts, pages and videos may inundate the SERPs. As a result, Google parses this new information, often cross-referencing it with data from the past.

In house we dubbed this the search engine shakedown. Think of the outward manifestation of this type of deep crawl (where trust and other variables also play a big role in the vacillations) as a litmus test pitting the new SERPs against the seasoned SERPs that have been in place for months, if not years.

This deep crawl can occur once per month or once every two to three months. When it happens, you may see half of your pages fall out of the index or a larger percentage of content take a dip in rankings. Consider those pages on the fence: they need additional metrics or references from other sources (either from within your website or from external sites) to regain the regard they had from search engines.

Typically rankings and results will reappear where they were, but the main thing is to consider what changes were made to the site. If anything drastic was done, document it, and/or roll back to a previous iteration if the situation does not remedy itself within 7-10 days.
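As a practical aid for that kind of record keeping, here is a minimal sketch in Python. The file names, the hand-fed rank numbers and the drop threshold are illustrative assumptions; the positions themselves would come from whatever rank-tracking export you already trust.

```python
# rank_log.py - minimal sketch: keep daily rank snapshots next to a log of
# site changes so a drop can be correlated with what was actually edited.
# File names and sample data are hypothetical.
import csv
from datetime import date
from pathlib import Path

RANKS_FILE = Path("ranks.csv")      # columns: date, keyword, position
CHANGES_FILE = Path("changes.csv")  # columns: date, note


def _append(path, header, row):
    """Append a row to a CSV file, writing the header on first use."""
    new_file = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(header)
        writer.writerow(row)


def log_rank(keyword, position, day=None):
    """Record one observed ranking position for a keyword."""
    day = (day or date.today()).isoformat()
    _append(RANKS_FILE, ["date", "keyword", "position"], [day, keyword, position])


def log_change(note, day=None):
    """Record a site change (template edit, title rewrite, new plugin, etc.)."""
    day = (day or date.today()).isoformat()
    _append(CHANGES_FILE, ["date", "note"], [day, note])


def flag_drops(threshold=5):
    """List keywords whose position worsened by more than `threshold`
    places between the first and most recent snapshots."""
    history = {}
    with RANKS_FILE.open() as f:
        for row in csv.DictReader(f):
            history.setdefault(row["keyword"], []).append(int(row["position"]))
    return [(kw, pos[0], pos[-1]) for kw, pos in history.items()
            if pos[-1] - pos[0] > threshold]


if __name__ == "__main__":
    log_change("rewrote homepage title tag")
    log_rank("seo services", 8)
    print(flag_drops())
```

If rankings have not recovered after the 7-10 day window mentioned above, the change log gives you a short list of suspects to roll back.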

Every so often some pages higher up in the food chain lose relevance or forget that the vertical they belong to is competitive. Hence, the more they rest, the more likely their results are to be overthrown by a hungry competitor who wants to occupy their spot.

Getting too comfortable at the top is a mistake that anyone enjoying a top 10 or top 5 search engine position for a competitive keyword should avoid. Remember, post frequency, peer review and internal on-page relevance all play a role in keeping results buoyant, and you cannot simply assume that because you rank there today nothing can happen; the rules of relevance can change.

The Weekend Index: Although this is based solely on our own research, we have discovered that search engines sometimes serve different SERPs (search engine results pages) from Friday through mid-Monday. We speculate they do this to compile data and shake out flimsy results by running a battery of filters to parse questionable content.

Consider it a type of amnesia: the data center that crawled your page and the other data centers, which may or may not have that data populated on their servers, can create a “now you see me, now you don’t” topsy-turvy effect on individual pages and on their ability to aid other pages’ rankings in the index.

If Page C falls out for some reason and Page C was responsible for ranking Page B and Page A, then Page B and Page A may take a temporary dip until the daisy chain is repaired and the continuity and link flow are restored through indexation. The main thing is NOT to panic; look for what could have created less weight for Page C (try getting more deep links to that page to remedy link flow).
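To make the daisy-chain effect concrete, here is a toy sketch in Python. The three-page graph, the damping value and the scoring rule are illustrative assumptions only; this is a simplified link-flow model, not Google’s actual algorithm.

```python
# link_flow.py - toy illustration of the Page A / B / C daisy chain.
# Every indexed page starts with a base score of 1 and passes a damped share
# of its score across its outbound links. Purely illustrative; not Google's
# actual ranking formula.

def link_flow(links, damping=0.5, iters=20):
    """links: dict mapping each page to the pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    score = {p: 1.0 for p in pages}
    for _ in range(iters):
        new = {p: 1.0 for p in pages}
        for page, targets in links.items():
            for t in targets:
                new[t] += damping * score[page] / len(targets)
        score = new
    return score

# Page C links to Page B, which links to Page A.
print(link_flow({"C": ["B"], "B": ["A"], "A": []}))
# Page C falls out of the index: B loses its inbound weight and passes less to A.
print(link_flow({"B": ["A"], "A": []}))
```

In the first run Page B and Page A sit above the base score because Page C feeds the chain; once C drops out, both scores fall, which is the temporary dip described above.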

Deindexation: I have recently compiled a few posts on the topic of over-optimization and de-indexation and ways to stave off this type of penalty or defer it entirely. This is a matter of duplicate content: redundancy of content segments occurring across multiple areas within a website.

As the filters get more intelligent, templates need diversity to produce a unique and distinguishable footprint for their shingles (groups of words), which should correspond directly to each page’s title, URL naming conventions, H1 tags and internal links from other pages.

Any variance that lacks relevance, is akin to keyword stuffing, or shares too many common elements across multiple pages is a demotion just waiting to happen (if someone else has the least imperfect score for that keyword).
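One way to spot that kind of template overlap is to compare pages by their shingles. The sample page copy and the four-word shingle size below are assumptions for illustration; in practice you would run this over the rendered copy of your own templates.

```python
# shingle_overlap.py - minimal sketch of shingle-based duplicate detection.
import re

def shingles(text, k=4):
    """Return the set of k-word shingles (groups of words) in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets; 1.0 means an exact copy."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical near-duplicate template copy that differs by one word.
page_1 = ("Affordable SEO services for small businesses in Chicago. "
          "Call today for a free quote.")
page_2 = ("Affordable SEO services for small businesses in Denver. "
          "Call today for a free quote.")

overlap = jaccard(shingles(page_1), shingles(page_2))
print(f"shingle overlap: {overlap:.0%}")  # higher overlap = thinner, more duplicated pages
```

Pages whose shingle sets overlap heavily while only the titles and URLs differ are exactly the common-element pattern described above.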

Unique content is the cornerstone of search engine positioning. The less you have, or the less frequently it is updated, the less able you are to defend against a savvier or more determined competitor who is producing quality content, earning peer review or acquiring quality inbound links.

Quality Raters: These are people whose primary job is to assess the quality of a web page and/or website and how it correlates to the SERPs it currently thrives in. From time to time these raters take pen and pad and make assessments, which can be added to a repository used to filter results (like a slight penalty) for individual pages if the Google guidelines are not being adhered to.

You never know which of the variations above has occurred (changes to a site, server problems, algorithm changes that devalue your pages, a quality rater or the competition). What you can try to find out is whether the penalty was confined to one keyword or a group of related keywords, or whether it has no particular rhyme or reason.
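A quick way to answer that question is to group your keywords by theme and compare how each group moved. The keyword clusters and the before/after positions below are hypothetical sample data; substitute whatever groupings you already track.

```python
# drop_scope.py - minimal sketch: is the drop confined to one keyword theme
# or spread across the board? Sample data is hypothetical.
from collections import defaultdict

# keyword -> (position last week, position today); lower is better
ranks = {
    "seo services": (4, 18),
    "seo company": (6, 21),
    "seo consultant": (9, 25),
    "link building tips": (12, 11),
    "website architecture": (15, 16),
}

# hand-assigned topical groups
groups = {
    "seo services": "core-seo",
    "seo company": "core-seo",
    "seo consultant": "core-seo",
    "link building tips": "links",
    "website architecture": "architecture",
}

deltas_by_group = defaultdict(list)
for kw, (before, after) in ranks.items():
    deltas_by_group[groups[kw]].append(after - before)  # positive = lost positions

for group, deltas in deltas_by_group.items():
    avg = sum(deltas) / len(deltas)
    print(f"{group:>12}: {avg:+.1f} positions on average across {len(deltas)} keywords")
```

If only one group shows a large loss, the problem is likely confined to the pages and anchors serving that theme; if every group slid, an algorithmic or site-wide cause is more plausible.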

Introduction of New Ranking or Rating Metrics: Things like blended search, personalization and many other less publicized changes occur on a regular basis. In 2009 Google made well over 400 algorithm updates pertaining to ranking and re-ranking the SERPs; who is to say that pace of revision will slow down in 2010?

We have only scratched the surface of how or why incidents like a loss of search engine position can occur. Usually, however, it can be traced to a correlating event made within the website at the admin level, as opposed to a mere shift in search engine ranking factors. Failing that, you have to let the results play out a bit until you can find a trail of data to support your theory and solidify a method for reconstructing the relevance a page or series of pages once had.

Was it excessive footer links, too many anchors overlapping on the same keywords, the fact that you changed your title and meta tags, a bad link, or a competitor performing negative SEO on your site? You have to let things settle before jumping to conclusions; the sketch below shows one way to check the anchor question while you wait.
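For the anchor-overlap question in particular, a crawl export of your internal links makes the pattern easy to see. The (source, anchor, target) triples below are hypothetical sample data standing in for such an export.

```python
# anchor_overlap.py - minimal sketch: count how often the same anchor text is
# reused across internal links. Sample data is hypothetical.
from collections import Counter

internal_links = [
    ("/blog/post-1", "seo services", "/seo-services"),
    ("/blog/post-2", "seo services", "/seo-services"),
    ("/blog/post-3", "seo services", "/seo-services"),
    ("/about", "seo services", "/seo-services"),
    ("/blog/post-4", "link building guide", "/link-building"),
]

anchor_counts = Counter(anchor for _, anchor, _ in internal_links)
total = len(internal_links)

for anchor, count in anchor_counts.most_common():
    share = count / total
    flag = "  <-- heavily repeated" if share > 0.5 else ""
    print(f"{anchor!r}: {count}/{total} internal links ({share:.0%}){flag}")
```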

More often than not, results return and it was just the engineers testing a new formula. Still, hold on to your hat: don’t jump to too many conclusions or start changing random elements in an attempt to unwind the penalty or loss in the SERPs. Stay cool, put your thinking cap on and find the questionable construct, or take the opportunity to create new content and provide vibrant internal links to slouching segments of the site (if you want those pages to stay indexed, healthy and buoyant, with the ability to pass along ranking credit to other pages).


About Jeffrey Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.

14 thoughts on “My Rankings Fell: Dealing with a Slip in the SERPs”
  1. kev grant says:

    Hi Jeffrey, how’s life treating you?

    Still watching that weekend index then? It’s interesting this weekend; I can’t work out whether it’s our new site coming online massively strongly or G’s weekend antics.

    Hope it’s the first…

    keep well my friend ;)

  2. Hey Kev:

    All is well here… Just noticed a slight dip for one of our most prominent keywords and thought I would write a post about it…

    Nothing like a good challenge eh? Seems the weekend index here is rotating a few new guys through our space and all the while indexation is slightly down a bit.

    Things should bounce back, but I have been a bit lax on the post frequency due to being so busy.

    Hope to catch up soon and all the best…

  3. I think that Google is changing the algorithm very frequently. A few days ago my blog had a SERP rank, but now my blog has lost the rank.

  4. machim says:

    I enjoyed the article and thanks for this information.

  5. Dee says:

    I have a site that made changes, and for the most part rankings improved for the interior landing pages.

    The PROBLEM is that the 3 most valuable keywords all dropped significantly because they were all relevant for the homepage and not the landing pages.

    Should I change the title tag and H1 back on the homepage or give it more time? Why does Google not see the interior landing pages as relevant for these keywords like it does with the other keywords?

  6. Justin says:

    Hi Jeffrey,

    Let’s say a site has had a decrease in rankings because of a quality rater, a bad link or something else. If changes are then made to the page/site to correct the problem, do the algorithms notice the correction (or quality raters return to notice the change) so that the site can rank where it once did? Or is it difficult to undo a penalty?

    Thanks for the helpful post.

    Justin

  7. Christa says:

    Your blog is extremely addictive; I’m getting to learn so much I never knew before. When I first launched my blog I started getting some traffic in 2-3 weeks and I was ranking for certain keywords, but suddenly one day my rankings went down. I was on the 10th-12th page of Google search for my keywords. I panicked and thought my site was in the Google sandbox, but to my surprise I got the rankings back in around 15 days, and since then I’m doing quite well.

  8. Writeonbro says:

    You hit the nail on the head! It’s just G’s way of doing a little dance. Things do always seem to level off.

  9. Jenn says:

    I realize the Google Dance happens all the time and it has moved my website all over in the SERPs, but I was ranking on page 5 of Google for a very competitive keyword and I unfortunately disappeared to not even being ranked in the top 1000. I am going on the 7th week of my website disappearing. Do you think this is still the Google Dance or something else?

  10. Hard to say, so many variables. If it does not return in 2 weeks, then consider it another type of filter (or perhaps the links linking to you were devalued). If you can’t get that page ranked in your website, you can always boost another page from the same site. Rankings are by the page, and sometimes if a page is penalized, it does not stop other pages from the same site from ranking.

  11. Pranjal says:

    That was really helpful, but it’s not what I was actually looking for!

Comments are closed.