Inspired by Adam Berger's complex yet fascinating 2001 document on machine learning and information retrieval (PDF), I couldn't resist simplifying the first concept he introduces in the preface: word relatedness, one of the underlying principles of SEO as we know it.
An advanced SEO tactic is to identify topical semantic clusters of keywords and integrate them into a themed site structure, creating overlaps through lexical relationships and the connectedness of words, site structure, navigation schema, and natural-language content.
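To make "topical semantic clusters" more concrete, here is a minimal sketch that groups keywords by lexical overlap using Jaccard similarity of their word sets. The keyword list and similarity threshold are illustrative assumptions, not from the article; real semantic clustering would also use synonyms and co-occurrence data, not just shared words.

```python
# Hypothetical illustration: cluster keywords by shared words.
# Threshold and keyword list are made up for demonstration.

def jaccard(a: set, b: set) -> float:
    """Ratio of shared words to total distinct words."""
    return len(a & b) / len(a | b)

def cluster_keywords(keywords, threshold=0.25):
    """Greedily group keywords whose word sets overlap with
    the first keyword (seed) of an existing cluster."""
    clusters = []
    for kw in keywords:
        words = set(kw.lower().split())
        for cluster in clusters:
            seed = set(cluster[0].lower().split())
            if jaccard(words, seed) >= threshold:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters

keywords = [
    "running shoes", "trail running shoes", "running shoes for women",
    "yoga mat", "non slip yoga mat",
]
print(cluster_keywords(keywords))
# → [['running shoes', 'trail running shoes', 'running shoes for women'],
#    ['yoga mat', 'non slip yoga mat']]
```

The greedy seed-matching approach is deliberately simple; it shows how lexical relationships alone can already separate keywords into themed groups that could each map to a section of a site.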
For the record, just as you don't have to know how an engine works to drive a car, you don't need to know how information retrieval works to understand search. You should, however, know enough to understand the premise of natural language processing.
If NLP and keyword occurrence are executed properly, the crowning achievement of leveraging semantic structures is strong page and domain authority, which can be funneled across the entire array of related keywords by mirroring correlations among the first occurrence of titles, URL structure, navigation, and internal linking.
SEO is all about layers, but the base starts with language, and language can be broken down into words and their various relationships. The video simply shows what emerges after that distillation process, revealing the tip of the semantic iceberg through Google's related searches and Wonder Wheel to peek behind the parametric framework.