Google is not making SEO an easy endeavor for search engine optimization professionals, who, beyond contending with competitors, have to wade through the waves, ripples, and tides of Google’s increasingly stringent filters and potential algorithmic penalties.
Such pitfalls (vacillations in search engine rankings) are to be expected, as SEO rests on constantly changing, interdependent layers and metrics purposefully hidden from plain view.
With one change to its algorithm, Google could put traffic-dependent websites out of business if they are too invested in that single traffic source and have no backup or contingency plan should their rankings ever disappear.
Why Rules Exist
While guidelines are essential (to level the playing field), the interpretation of those guidelines, and how far individuals or companies will go to bend them, creates an online arena similar to the Wild West, where the faster, nimbler competitors, or those with bigger guns, reap the spoils of conquest.
The goal is to rank in the top three spots (known as the traffic band) for the most coveted industry-specific keywords in your primary market (with one or more websites). This strategy allows the victor to funnel conversions to landing pages of their choice.
Google is a medium, a conduit seeking transparency “like a glass that can contain any liquid”, which makes it dependent on that liquid to define itself. Yet, despite its best intentions, making people compete for pole position has an undeniable undercurrent that breeds rivalry (everyone wants to be #1).
That competition brings out the competitor in those who seek the number one position for relevant keywords. In a way, it appears that the strategy is to keep people preoccupied and fighting amongst themselves (for rankings) while the search engines decide how to grade the content they borrowed from the web and serve it back to the masses.
Determining which ranking metrics are prominent at present requires two things: (1) an understanding of the granular layers that sculpt relevance, and (2) past performance or proof of concept (as a basis for statistical confidence).
The Core Metric of SEO
In a world of cause and effect, it’s better to focus on the core or nucleus, the veritable hub around which the spokes revolve. For SEO, that core is “language”. Language is the predecessor of all algorithms that search engines use to assess relevance.
Hence, if you master the nodes of relevance with language and create a broad enough semantic set of related keywords (at both the document level and the global level of the website), something unique occurs: page-level and domain-level authority.
Creating authority sites is a timeless SEO tactic that translates into higher rankings at both the page level and the domain level. Keep in mind that information retrieval, as vast a field as it is, ultimately models relevance through language, i.e., the various relationships of words in the search engine’s index.
The idea is to deliberately create authority sites, or authoritative clusters of language-specific terminology (by implementing hooks for natural language processing), within a website that represent the scope or terrain of the keyword collection’s aggregate context.
This in turn allows each page to rise above less relevant page-level or global-level competitors who fail to grasp the power of correlation, co-occurrence, relevant cohesion, or the importance of semantic structure within a website.
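The co-occurrence idea above can be sketched in a few lines. This is an illustrative toy, not any search engine’s actual method; the page paths and texts are invented for the example:

```python
# Toy sketch: count which term pairs co-occur across a site's pages.
# Pairs that recur on multiple pages hint at a semantic cluster.
from collections import Counter
from itertools import combinations

# Hypothetical site content, one text blob per URL.
pages = {
    "/widgets": "blue widgets durable widgets widget reviews",
    "/widgets/blue": "blue widgets best blue widget prices",
    "/gadgets": "gadget news gadget reviews",
}

cooccurrence = Counter()
for text in pages.values():
    terms = set(text.split())  # unique terms on this page
    for a, b in combinations(sorted(terms), 2):
        cooccurrence[(a, b)] += 1

# Term pairs that appear together on more than one page.
clusters = [pair for pair, n in cooccurrence.items() if n > 1]
```

A real system would tokenize and weight far more carefully, but the principle is the same: related terms that repeatedly appear together reinforce a theme.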
The second metric is based upon practicality, citation, and pull/popularity. Essentially: how important is this document to others, and how is it linked to, promoted, or served? The combination of on-page and off-page metrics, i.e., topical on-page relevance plus links/votes/citations from other websites, produces a relevance score.
That score determines where pages rank by weighing aggregate ranking factors to find an ideal match for users based on intent, or the keywords used in a search.
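As a toy illustration of how such a score might blend the two metric families, here is a minimal sketch; the function name, weights, and formula are assumptions for demonstration, not Google’s actual algorithm:

```python
# Hypothetical blend of on-page relevance and off-page popularity.
def relevance_score(term_frequency, total_terms, inbound_links,
                    on_page_weight=0.6, off_page_weight=0.4):
    """Combine on-page topical relevance with link popularity."""
    on_page = term_frequency / total_terms           # crude topical relevance
    off_page = inbound_links / (inbound_links + 10)  # diminishing returns on links
    return on_page_weight * on_page + off_page_weight * off_page

# A page mentioning the query 12 times in 300 words, with 40 inbound links:
score = relevance_score(term_frequency=12, total_terms=300, inbound_links=40)
```

The point is the shape, not the numbers: on-page relevance and citations each contribute, and neither alone decides the ranking.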
The future of SEO, and the potential stranglehold search engine algorithms possess to phase out promotional tactics such as link building, social media exposure, and citation, is secondary to the primary metric of language as a means to sculpt relevance, which in turn determines rankings.
What makes search predictable?
While keywords are hit or miss, the intention behind them remains intact. It is this intention that you must optimize for. Keywords are a bridge between your content and those seeking to find it. What matters is understanding that there are a hundred ways to say the same thing; in other words, you should optimize relevant clusters of keywords rather than trying to zero in on a few.
By optimizing your website for multiple clusters of keywords (based on a theme), you hedge your SEO efforts. You can plan for total web domination and get only a trickle of traffic from the keywords you anticipated, or, on the contrary, be pleasantly surprised when an unforeseen keyword appears on the radar and starts delivering traffic that converts. Beyond these fluctuations, try anchoring your landing pages or content to the needs of your prospective audience.
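A minimal sketch of this cluster-first approach, grouping query variations under shared themes (the keyword list and theme names are invented for illustration):

```python
# Group keyword variations into theme clusters instead of
# optimizing for single phrases in isolation.
from collections import defaultdict

keywords = [
    "cheap running shoes", "best running shoes 2024",
    "running shoes for flat feet", "trail running tips",
    "marathon training plan", "marathon training schedule",
]

themes = ("running shoes", "marathon training")
clusters = defaultdict(list)
for kw in keywords:
    for theme in themes:
        if theme in kw:
            clusters[theme].append(kw)
            break  # assign each keyword to at most one theme
```

Each cluster then maps one landing page to many query variations, so traffic from any phrasing in the cluster lands on content built for that intent.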
There is a fine line between:
- emotion
- problem solving
- genuine solutions
- perceived value
- and timing
Many of these variables are conditional metrics, i.e., based on where your website is at the time someone searches. The searcher’s mood, what prompted them to search (or spin off on a click tangent), and the mindset they are in when presented with your offer all determine what happens next (based on whether or not the information produced delivers what the query intended).
So, what’s next?
Past performance does not indicate future success, nor does it guarantee that the status quo or present conditions will not change or evolve. However, the prime directive of search engine algorithms remains intact: to serve the best document for each query.
A search is merely a query to a database/repository to find the most relevant page exhibiting the characteristics of the query. Since this will remain a constant, it, together with the hinge point of language being the track the search engine train rides on, takes us back to the prime directive of search engines: “to find the most relevant or popular document housing the subject of the query”.
Remember this and focus your SEO efforts on quality content, using language as a base, to captivate search engines and users alike. This layered theming and siloing of site structure, content, and off-page optimization transcends traditional SEO; it requires immense planning and depth to grasp the premise, concept, and execution of crafting hundreds to thousands of pages based on semantic values, then internally linking them appropriately to augment a common goal.
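The siloing idea can be sketched as a simple data structure: pages grouped into topical silos, with internal links kept within each silo so relevance pools around a theme. The paths and silo names here are hypothetical:

```python
# Hypothetical silo map: each theme owns a set of pages.
silos = {
    "seo": ["/seo/", "/seo/on-page", "/seo/link-building"],
    "ppc": ["/ppc/", "/ppc/quality-score"],
}

def internal_links(page, silos):
    """Link each page to every other page in its own silo only."""
    for pages in silos.values():
        if page in pages:
            return [p for p in pages if p != page]
    return []  # page not in any silo: no in-silo links
```

Restricting cross-linking to within a silo is a simplification; in practice silos usually link upward to a hub page as well, but the sketch captures the core constraint.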
As stated above, creating robust authority sites is one tactic that will keep your business safe from shifts in algorithms. The more relevant, expertly written content your website has, the stronger its defense against major or minor algorithm shifts.
I love how this post alone is a road map of keyword links to relevant content in other areas :).
@DJ:
We always have to drop the deep links to other morsels for those that would follow the trail.
Thanks for visiting and commenting…
Very nice and informative blog..
Can anyone please tell me what Google’s Panda algorithm is?
Conduct a search; there is enough chatter on the subject now. But essentially, sites with low-quality content got chop-blocked from the index to make room for authority sites…
I like your SEO tips, but I often wonder whether building your silo can be simpler, or if it is a complex process.
@Nigel:
It’s simple after you do it a few times. It’s really more of a holistic practice: you start with research, then build the silos and supporting content, then map your internal linking based on content development, and eventually start dripping backlinks.
Thanks for reading and commenting.
Jeffrey