In Search Engine Optimization Myths by Jeffrey_Smith

Have you ever strived to reach the pinnacle of a search term, only to watch your ranking disappear for no apparent reason? This is commonly known as “Everflux” or “the Google Dance” in SEO. Although it can be a source of stress for many (as their rankings momentarily dip south), why does it really happen, and how necessary is this function for maintaining quality in the Google search index? Here is my take.
The Google Weekend Index

It is no secret that search engine indexes churn and qualify content and links, but have you ever considered that indexes are often rebuilt (right under your eyes) while other searches are being executed?

Out to Lunch:

Does the search index take a break on Friday to make room for new entrants on specific keywords? It would appear so. With millions of new sites created and added to the crawl each week across the web, the ebb and flow needs to make room for change, yet maintain order through redundancy. Think of it as a running backup that feeds into the real-time/main index, “like an ocean flowing into a lake” which eventually feeds back into the ocean.
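To make the analogy concrete, here is a minimal sketch (in Python) of how such a dual-index arrangement could work. The class, the toy crawl feed and the swap are purely my own invention for illustration; Google has published nothing about this machinery.

```python
# Purely illustrative sketch of a staging index absorbing the weekly
# crawl while a serving index keeps answering live queries, followed
# by a one-shot swap at the weekend. None of these names are Google's.

class Index:
    def __init__(self):
        self.docs = {}  # url -> set of terms extracted from the page

    def add(self, url, terms):
        self.docs[url] = set(terms)

    def search(self, term):
        return [url for url, terms in self.docs.items() if term in terms]

def crawl_batch():
    # Stub crawl feed standing in for a week of spidering.
    yield ("example.com/a", ["seo", "company"])
    yield ("example.com/b", ["seo", "services"])

serving = Index()  # the "lake": answers live queries
staging = Index()  # fed by the "ocean" of the ongoing crawl

for url, terms in crawl_batch():
    staging.add(url, terms)   # searchers never see this half-built copy

# Weekend cutover: promote staging in a single swap and start fresh.
serving, staging = staging, Index()
print(serving.search("seo"))  # ['example.com/a', 'example.com/b']
```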

Search engines must constantly crawl and update their index to maintain relevance, or the user experience suffers. Without spiders investigating sites and updating the index, we would constantly be presented with stale data.

Case in point: my theory is that, until the algorithm shakes out the new entries and cross-references them against those with immunity from the possibility of extinction, each competitive search term must face another round of due diligence in an alternative testing ground to maintain its visibility and longevity in the main search index (until it reaches a state of authority/immunity).

I am open to a more detailed explanation of this algorithmic function if someone elects to share their grasp of the phenomenon, such as a search engine engineer who can describe this compiling/decompiling, inclusion/elimination function. I welcome and encourage any feedback on the topic and how it pertains to the search index.

The reason this post was written about amnesia and threshold redundancy is that, after spending countless hours each day analyzing traffic patterns and search engine optimization results, you start to gain a sense of when you last saw certain rankings and patterns appear in the search index across different data centers.

I recall a time last year when the bottom fell out of various search results, and it seems that the new ranking algorithm (from what trails it leaves behind) is continually in a state of self-absorption, alignment and self-adjustment. The bottom line of this function is clear: quality control and relevance.

For example, the word “company” was not valued the same as the word “services” as a modifier when this occurred. Seeing queries that once returned hundreds of millions of competing results reduced to 1 million was, needless to say, alarming to most.

The idea that each word has a unique threshold, even though both imply similar connotations, shed even more light on how popular search terms have their own unique rules and how search engines interpret and display results.

For example, a keyword used in conjunction with the word “company” scored lower for competing pages and showed a few hundred thousand search results instead of a few million. Essentially, the index was compressed and rebuilt over the course of weeks. This was also concurrent with universal search being injected and released, so one can only speculate as to the true cause behind such effects.

Recently there was another anomaly: case-sensitive search made its debut in Google, where a query with the first letter of each word capitalized returned one set of results and the same query in lower case produced a distinct set. Variations of this new algorithm are still running as we speak, to a slighter, more manageable extent for searchers to ingest.

I am going out on a limb here, which is why this is categorized in the search engine optimization myths section of our blog. Yes, this is a facet of the Google Dance, but catching it mid-phase, I must say, was like reading Monday’s headlines on Sunday afternoon (a glimpse of things to come).

Since I am not a computer science major, nor a search engine engineer, this is probably just another day at the office to them, complete with a succinct description; from the eyes of an observer, I need to build my case and elaborate on my findings. I was, however, able to find a very informative piece on this phenomenon and have included the link to Markus Sobek’s expert opinion regarding “the Google Dance,” which is an absolute must-read for further elaboration.

So here it is. My hypothesis is that on Friday, several of the major Google data centers punch their ticket for the week and essentially put up a secondary index so they can tally the results of new data gleaned from new sites and assess their quality score.

During the next 48-96 hours (according to this theory), the main index runs a battery of tests on pages that rank for competitive phrases (in essence, cross-referencing and testing their links and various on- and off-page factors against other data centers). This could be done using spiders armed with different algorithms and crawl rates to create a site profile.

Then, after integrating all of the new sites and the information crawled over the week, the real-time index combines this data with the older snapshot and creates a blend of the previous and current data to pool search results (typically over about 48-96 hours). It is almost as if the index virtually deconstructs and reassembles itself to see which results have authority and remain intact.
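As a thought experiment only, the merge might look something like the sketch below. The pooling rule, the grading function and the 0.5 threshold are all invented for illustration and are in no way Google’s actual tests.

```python
# Invented sketch of the weekend merge: pool the previous snapshot with
# the fresh crawl, re-grade every page, and keep only those that clear
# an "authority" bar; the rest fall into a supplemental index.

def rebuild(previous_snapshot, fresh_crawl, authority_score, threshold=0.5):
    pooled = {**previous_snapshot, **fresh_crawl}  # newer data wins ties
    survivors, supplemental = {}, {}
    for url, page in pooled.items():
        if authority_score(page) >= threshold:
            survivors[url] = page     # stays in the main index
        else:
            supplemental[url] = page  # held until it can prove value
    return survivors, supplemental

# Toy grading function standing in for the battery of link/on-page tests.
def score(page):
    return 0.7 * page["link_trust"] + 0.3 * page["on_page"]

old = {"a.com": {"link_trust": 0.9, "on_page": 0.8}}
new = {"b.com": {"link_trust": 0.2, "on_page": 0.4}}
main, extra = rebuild(old, new, score)
print(sorted(main))   # ['a.com'] survives the shake-out
print(sorted(extra))  # ['b.com'] is relegated
```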

The last such cycle for the keyword I was studying, “SEO,” lasted 14 days; then, in the middle of the night (last Sunday, June 15th), the switch flipped, and on Monday the choice ranking was gone. Is this a feature whereby, through the process of elimination, only the most relevant results survive? It is almost like a trust-rank/relevance bot that passes or flunks each page it grades and determines whether the page graduates to a more stable parameter or is purged from the index into a secondary or supplemental index (in safe keeping until it can prove value).

From my experience, if your site emerges from the gauntlet unscathed, the results return to the position they held before the fall. It is nearly Wednesday (roughly 70+ hours from the time they disappeared) and the search results are stabilizing for the target phrase “SEO,” with an additional 2 million results that made the cut (from the inflated surplus of sites targeting the keyword).

To reiterate, this is merely my hypothesis on one facet of the search algorithm. Why would I suspect such a thing? Because I took a snapshot while one such operation was compiling: the term “SEO,” which usually has 255,000,000 competing results, shifted to over 1 billion (see attached photo).
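I took my snapshots by hand, but if you want to automate that kind of logging, a minimal sketch like the one below would do. It assumes access to Google’s Custom Search JSON API with your own credentials (the two placeholder constants); note that totalResults is only Google’s estimate, which is precisely the number we want to watch.

```python
# Minimal sketch: log a keyword's estimated result count over time via
# Google's Custom Search JSON API. API_KEY and ENGINE_ID are placeholders
# you must supply; "totalResults" is Google's own estimate, so swings like
# 255,000,000 -> 1,000,000,000+ are exactly the signal being observed.
import time
import requests

API_KEY = "YOUR_API_KEY"      # placeholder
ENGINE_ID = "YOUR_ENGINE_ID"  # placeholder

def result_count(query):
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": ENGINE_ID, "q": query},
        timeout=10,
    )
    resp.raise_for_status()
    return int(resp.json()["searchInformation"]["totalResults"])

while True:  # one snapshot per hour
    print(time.strftime("%Y-%m-%d %H:%M"), result_count("SEO"))
    time.sleep(3600)
```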
Or Just Business as Usual in the Google Algorithm?
Is this search engine amnesia and the process of redundancy or rediscovery? Could this be the effect of multiple data centers comparing results? Or is it just business as usual in the Google search algorithm? At this juncture, it is too early to tell.

This one facet of search (the Google Dance on a grand scale with a few new twists) may in fact be what is at work when search results plummet. Instead of looking at it as a loss of relevance, think of it as a safety precaution of due diligence for Google to maintain the highest quality possible in their index.

If your site makes the grade and you can keep up with this algorithmic heavyweight, then theoretically your results will return when the integration period stabilizes. According to the study now in progress, one may be able to predict the churn rate of particular keywords and insulate a site so that it is less susceptible.
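As a crude way to put a number on that churn, assuming you have been logging a keyword’s daily rank (the names and figures below are hypothetical), you could measure how often the page vanishes and how far its rank swings:

```python
# Hypothetical churn metric built from a logged rank history for one
# keyword. None marks a day the page was absent from the results.
from statistics import pstdev

def churn(rank_history):
    """Return (share of days missing, volatility of observed ranks)."""
    present = [r for r in rank_history if r is not None]
    missing_share = 1 - len(present) / len(rank_history)
    volatility = pstdev(present) if len(present) > 1 else 0.0
    return missing_share, volatility

# Two weeks of ranks for "SEO": steady, then the bottom falls out,
# then the position returns once the integration period stabilizes.
history = [3, 3, 4, 3, 3, None, None, None, 42, 40, 7, 4, 3, 3]
missing, swing = churn(history)
print(f"missing {missing:.0%} of days, rank volatility {swing:.1f}")
```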

However, just observing the phenomenon is enough to make any SEO nervous, as you watch your most competitive keyword take a hiatus and go south in the rankings while it is being interrogated for quality.

This survival-of-the-fittest, artificial-intelligence-like inclusion and exclusion feature of the search engine algorithm is there to police relevance and is a constant work in progress. Just think of it as the agents going after Neo in the Warner Bros. movie The Matrix.

Add to that the fact that each data center has a unique snapshot of where your site ranks and can spider your site using a wide array of bots to test the structure of your content, links and authority (in an instant). If your site checks out, you are granted readmission to the search engine VIP club.
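To see how far two data centers disagree at a given moment, a quick diff of their top results is enough. In the sketch below the two lists are made-up stand-ins for snapshots pulled from two different data-center addresses.

```python
# Sketch: diff two data centers' ranked snapshots for the same phrase.
# The URL lists are invented stand-ins for real snapshots.

def compare_snapshots(dc_a, dc_b):
    dropped = [u for u in dc_a if u not in dc_b]  # gone at the second DC
    added = [u for u in dc_b if u not in dc_a]    # new entrants under test
    moved = {u: (dc_a.index(u) + 1, dc_b.index(u) + 1)
             for u in dc_a if u in dc_b and dc_a.index(u) != dc_b.index(u)}
    return dropped, added, moved

dc1 = ["a.com", "b.com", "c.com", "d.com"]
dc2 = ["b.com", "a.com", "c.com", "e.com"]
dropped, added, moved = compare_snapshots(dc1, dc2)
print("dropped:", dropped)  # ['d.com']
print("added:", added)      # ['e.com']
print("moved:", moved)      # {'a.com': (1, 2), 'b.com': (2, 1)}
```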

This illustrates why “the Google Dance” is just one of many facets that organic SEO specialists have to contend with. I can only speak from my own conclusions; however, I have noticed this phenomenon on two occasions within a month and am studying which terms are affected by it.

This article falls into the category of theoretical SEO and search engine optimization myths; however, if the conclusion ultimately sheds light on a particular aspect of the search algorithm, it could provide more adept SEOs with the capability to study and share nuances on how to insulate your pages from the scramble. It is too early to tell, but it is always nice to share a glimpse of something gleaned quite by accident while conducting routine research in one’s daily tasks.

Another thing that appears to have occurred is that the order of pages returned as relevant search results has shifted (instead of page A, page B now enters the rankings with a different relevance score). But that is another post in its entirety; stay tuned for more from SEO Design Solutions.


About Jeffrey_Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.

32 thoughts on “Search Engine Amnesia, Redundancy and Rediscovery”
  1. Danny says:

    Man, you said a mouthful. I believe you are correct. I certainly don’t understand what you’re talking about as well as you do, but I understand in a weird way. Very detailed language, using lots of anchor text with keywords in it. How long did it take to write this, Jeffrey?

  2. Hey Danny:

    This post took about 2 hours to make coherent; it is still part of an overall theory of information architecture. Some aspects are spot on, others are a bit dodgy. The real point, in layman’s terms, is: don’t panic if you slip; chances are your rankings will be right back when the search engine index is refreshed.

  3. dude, if i only had half of your brain..sheesh..

    i dont even know where to begin in responding to this but it’s interesting and i’m taking notes.

    maybe they should post an ‘out to lunch’ sign for that 48-96 hrs ey?

    nah, that would mean they’d be letting you know you’re right and they can’t have that..then you’d have an advantage.

    haha, ‘agents going after NEO’

  4. You never know, Spostareduro. I “could be” an agent from the matrix sent here to keep you all asleep and under the impression that Google is just a company, when they are in fact an extension of “The Architect”! Muah, Muah, Muah!

  5. and i thought you were just a brilliant guy sent here to save us. boo on you!

  6. Thanks for giving this post legs on Mixx, Stumble and the rest. It is appreciated, Kimberly.

  7. you’re welcome. :-)

  8. Bill says:

    Fun post, Jeffrey.

    There are a lot of potential things going on at the Googleplex that could cause a flux with what we see in search results.

    A recent interview with one of the higher level Googlers revealed that they were averaging at least 9 changes a week to the algorithms that rank pages.

    A new compression and decompression approach to Google’s index seems to have been installed earlier this year that allows for better searching of phrases. With the “case sensitive” results, it’s possible that Google has gotten better at understanding named entities – specific people, places, and things that are unique and have names associated with them.

    It’s quite possible that when you receive search results, the results you see might be a blended set from more than one algorithm, and user data collected from the display of those results might be considered to decide which algorithm works better.

    It’s also possible that different data centers may be used to test and stage the introduction of new algorithms.

    The term “everflux” from 2003 referred to the idea that in some instances Google was introducing some pages into results on an ongoing basis rather than as an update that happened every five weeks or so.

    It’s hard to tell from the outside what’s happening with Google sometimes, but there have been a lot of changes in recent years. Guess that’s part of what makes doing SEO interesting. :)

  9. Thanks Bill:

    I had to categorize it as myth, since you really cannot use the words “Google Algorithm” and speak from a level other than assumption, or at least from what we know about it through the breadcrumbs it leaves behind.

    I am still patiently waiting to see the results of this theory when our chosen keyword “SEO” returns to the Top 10. Until then, it is my devout purpose to understand and insulate the pages we optimize as much as possible from losing relevance score when this topsy-turvy monthly episode of the index makes its presence known, adding strategic link weight here and there to weather the storm. New rankings are particularly exposed to this phenomenon, and since I only held the ranking for 14 days, I hope the memory is enough to make it around the second time when the sift and sort is done churning.

    Thanks for visiting and good to hear your viewpoint as always.

  10. kev grant says:

    Hi Jeffrey.

    I just found this one from Twitter :) ..as others have said on here I couldn’t even begin to think about it at the level or detail you’re hypothesising at, but from countless hours watching SERPS at all times of the day & night across hundreds of phrases & sites I absolutely concur that there is a weekend “testing” of new data / index / results.

    have seen it time and time again where sites are showing at higher levels than in the week, or vanish from the results completely at the weekend, only to return to “normal” again late Sunday or early Monday, with the occasional new entrant there too.

    when you see a site temporarily promoted at the weekend, about 80% of the time it seems that will be its true (weekday) position in 2-3 weeks’ time.

    it seems to me almost as if they test the new results at weekends for a couple of weeks, before putting them live for the big gig a little further down the line. This is of course a gross over simplification, but I always like to try and simplify as much as possible for my poor little brain :)

    I often read “experts” saying not to watch the Google flux at all, concentrate on traffic / conversions instead, and yes of course, but I also find that watching the results in detail across multiple sites & keywords gives an intrinsic “feel” for the way things are going, and (very) advance signals or warnings even, of reactions to changes made on sites or coming from G themselves.

    am glad am not the only one who does this and has seen this effect, as I sometimes wonder if staring at the “matrix” for long hours trying to make sense of its seeming chaos is also a fast shortcut to insanity?

    respect my friend ;)

    kevsta

  11. Hey Kev:

    According to the latest changes I have been tracking, they are combining an effect like the Florida update with a new de-indexing-like effect for sites not measuring up to the new standard. It seems like G needs to make room to keep relevance par excellence. I had no intention of going too far with this one, but I will create a follow-up post when I get something more conclusive. The weekend index is most definitely in effect; sometimes it stays stale until Wednesday and then shows the real-time results. This may in fact become a thorn in the side of SEOs, who will have to get ahead of the curve and augment their pages with authority to weather the fluctuations before they occur.

    I just witnessed our site lose 200 pages in the index and then build them back within an hour. To say that there is an interrogation occurring (a measuring stick, if you will) is a fair assessment. No pages, no rankings; it makes perfect sense from the standpoint of quality management. Much respect to you as well, Kev. All the best.

  12. Danny says:

    I still can’t get over this article.

  13. Sgt. Mark Jennings says:

    When it comes to CAPITAL letters versus lowercase in the search engines, think of it in terms of channeling, or channels.

    Building on the analogy of demonstrating data flow as an ocean flowing into a lake and subsequently lake water eventually returning to the ocean, the issue of what form a symbol is etched onto the search hardware is similar to a river with trillions of tributaries.

    Let’s use the cached copy of a site in Google’s search hardware as the example. Just as you have the link leading straight to the site, the cached copy creates a side-band effect, i.e. no trace of your visit will be left on the site itself and instead you are only visiting Google’s database. So in effect, you still have access to at least some of the information broadcast from the website, but if you click on one of the links in the cached copy, you then are taken to the hardware used to broadcast the site.

    Therefore, capital letters act as channeling devices for a programmer, in which each individual hash mark contains 2 possible paths: the lowercase version and the uppercase version. If one travels the lowercase path, one set of results will be returned, and vice versa, thereby creating an A-B switching element, or the ability to create an alternate universe, if I may.

    So because lowercase and uppercase letters have different pixel counts and positioning (let alone the additional matrixes of using different fonts!), the Google algorithm alone cannot discern the rhyme and reason behind the publisher’s use of capital letters versus lowercase text.

    Therefore, on one hand, capitalization does matter when it comes to search hardware and their ranking algorithms. But what purpose such slight and subtle alterations are used for is what white-hat SEO is all about, right?

    Nice article!

  14. I am aware that the search results in Google started paying more attention to lower- vs. upper-case letters recently, displaying unique search results for title-case vs. lower-case queries.

    Bill Slawski from http://www.seobythesea.com did suggest that the search algorithm is becoming more sophisticated, encompassing other forms of retrieval to assess contextual data. I know it is not necessarily block segment analysis, but what it does mean is another layer of anchor text revisions for most who want to gain absolute domination of a phrase in its canonicalized state. Thanks for visiting, Sgt.
