The one thing about SEO that stays the same is that it's always changing. In the past few weeks I have noticed everything from local search results appearing without any local keyword modifiers added (which implies geo-targeted IP mapping), site-wide links under specific listings in normal SERP results, tag pages from authority sites skewing competitive search phrases, and a number of other unique patterns in the search engine result pages.
What does this imply? That Google is testing new formats to roll into the next batch of updates. Last year, over 400 updates were administered to the SERP recipe alone. During such periods it is normal to see cached data from weeks or months prior displayed, without any "back in 72 hours" sign, just so you can still search and retrieve documents.
All the while, the engineers are testing new patterns in specific data centers to see if they play well with previous settings. If the user experience is tainted, it is almost as if someone hits a reset button: the results tend to normalize over a few days, with some semblance of the past and present rankings fusing in a more formal capacity.
At this point, it is too early to tell what the hybrid effect will produce. However, if I had to speculate right now, I would say that the subtle relationships with the least common denominator are making their way to the forefront. In essence, a website with a solid base / theme will be able to rank for a broader range of variations without having to build links to gain outward SERP expression. Just think of it as Google getting a new microscope that can see deeper into the matrix than ever before.
For example, I have been analyzing a number of long-tail traffic patterns of late, and noting that Google has just added a plethora of "more descriptive suggestions" to its related and suggested searches. The nuances that predominate at present are all about finesse and the elements of a refined search.
In other words, pages that ranked for very specific queries via the "term frequency" component of the algorithm are now being given a larger broadcast range. If you expand the parameters of what is considered relevant, and then give a less-represented term the ability to rank higher as a result of co-occurrence, you would see exactly this effect.
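To make that concrete, here is a minimal sketch of the idea, assuming a purely hypothetical scorer (this is my toy model, not Google's actual algorithm): a term that appears only sparsely on a page can still lift that page's relevance when it co-occurs with the page's dominant theme terms.

```python
from collections import Counter

def term_frequency(text):
    """Raw term frequency for each word in a page's text."""
    words = text.lower().split()
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.items()}

def relevance(page_text, query_term, theme_terms):
    """Base TF score for the query term, plus a co-occurrence bonus
    whenever the (possibly sparse) query term appears alongside the
    page's theme terms. The 0.5 damping factor is an assumption."""
    tf = term_frequency(page_text)
    base = tf.get(query_term, 0.0)
    # Co-occurrence bonus: credit the page for how strongly its theme
    # terms are represented, even if the query term itself is sparse.
    bonus = sum(tf.get(t, 0.0) for t in theme_terms) if base > 0 else 0.0
    return base + 0.5 * bonus

page = "seo consulting seo services seo strategy with consulting rates listed once"
print(relevance(page, "rates", theme_terms=["seo", "consulting"]))
```

Here "rates" appears only once, yet the page scores well above its raw term frequency for that word, because the strongly-themed terms it co-occurs with lend it weight.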
So now, instead of your main page alone ranking for something like SEO rates: if you have another page on your website that ranks for SEO consulting, and someone types "SEO Consulting Rates" into Google, the cells of the search parameter are extended to include both pages, which essentially "borrow certain elements" or ranking factors from one another to augment and lend themselves to each other. So, even if the word "consulting" appears only sparsely on the rates page, that is enough to be considered a broad match, yet specific enough for the secondary page to be awarded as a relevant search result, much as singular value decomposition surfaces latent relationships between terms.
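The borrowing effect described above can be sketched the same way, again as a speculative toy model rather than Google's actual implementation: when a query spans two pages on the same site, pad each page's score with a damped share of the sibling page's term frequencies for the terms it barely covers.

```python
from collections import Counter

def tf(text):
    """Term frequencies for a page's text."""
    words = text.lower().split()
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.items()}

def solo_score(page_tf, query_terms):
    """A page's score for a query using only its own terms."""
    return sum(page_tf.get(t, 0.0) for t in query_terms)

def borrowed_score(page_tf, sibling_tf, query_terms, damping=0.5):
    """Score a page, borrowing (damped) weight from a sibling page on
    the same site for any query term the page itself barely covers.
    The damping value is an assumption of this sketch."""
    score = 0.0
    for t in query_terms:
        own = page_tf.get(t, 0.0)
        # Borrow from the sibling only where this page is weak.
        score += max(own, damping * sibling_tf.get(t, 0.0))
    return score

rates_page = tf("seo rates and pricing for seo packages")
consulting_page = tf("seo consulting services expert seo consulting advice")
query = ["seo", "consulting", "rates"]

print(solo_score(rates_page, query))
print(borrowed_score(rates_page, consulting_page, query))
```

The rates page never mentions "consulting", yet its borrowed score for the combined query beats its solo score, because the consulting page lends it the missing term.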
With these parameters extending, hypothetically, that would account for all of the anomalies in the search engine result pages. Consider it a matter of sensitivity: given the scope of Google's ability to parse even further into the top-ranking documents, less co-occurrence could be enough to toggle a relevant result.
In other words, pages could share their ranking factors, making it easier to rank for related queries across the space of your entire site. This could be dubbed "chameleon rank" in theory, but let's just see what transpires in the next few weeks before stepping out on a limb.
Just consider it a bigger microscope for Google to mix and match keyword DNA in their never-ending quest to take semantic search to the next level, which borders on artificial intelligence. As if it's not already hard enough to keep up, whenever they introduce a new twist it's back to the drawing board to look for patterns, similarities among search behaviors, or heuristic conclusions on which to base an assessment.
Stay tuned for more SEO tips, tactics and techniques from SEO Design Solutions.