in Search Engine Optimization by Jeffrey_Smith

Today (with slight hesitation, for fear of giving away too much) I am electing to share an effective SEO method that incorporates sitemaps, subdomains and site architecture. With it, you will have the capacity to develop robust websites of colossal proportions, using a coherent site and link architecture to zero in on competitive rankings and long-tail keywords alike.


This involves the use of subfolder naming conventions, SEO-friendly titles, relevant semantic descriptions, pretty URLs, subdomains and sitemaps.
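To illustrate the pretty-URL naming convention, here is a minimal sketch in Python. The helper names (`slugify`, `pretty_url`) and the example domain are my own illustrative assumptions, not from the article:

```python
import re

def slugify(title):
    """Convert a page title into an SEO-friendly URL slug:
    lowercase, alphanumerics only, hyphens as word separators."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse punctuation and spaces
    return slug.strip("-")

def pretty_url(subdomain, domain, folder, title):
    """Assemble a keyword-rich URL: subdomain + subfolder + slug."""
    return "http://%s.%s/%s/%s" % (subdomain, domain, folder, slugify(title))

print(pretty_url("en", "example.com", "wiki", "Search Engine Optimization"))
# http://en.example.com/wiki/search-engine-optimization
```

The point is that the keyword survives intact in every layer of the URL string, which is exactly the concentration effect described below.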

This strategy is similar to targeting the roots of a tree (the keywords stemming from a topic) to reach the leaves (top 10 rankings): you give each keyword a value (a page), then implement an internal link / irrigation system capable of producing its own secondary and tertiary ranking factors through link cultivation.

Sitemaps do not have to play a passive (just-for-crawling) role in SEO. In fact, think of a sitemap as a two-way street. On one hand, you can use sitemaps to increase crawl frequency and get more pages into a search engine’s index. On another level, you can use sitemaps as a ranking tool designed to “reverse funnel” ranking factors to the pages that need added link weight to hone in on competitive rankings (much like a powerful pipeline).

To take this tool, long considered passive, and turn it into a very powerful internal link sculpting tool, you only need to apply a few fundamental protocols.

When you look at a Wikipedia ranking, try looking beyond the topical layer and observe the infrastructure of why and how it got there. The topical layer (the landing page) represents a relevant triangulation of on-page relevance: a title in which the keyword / search term is prominent and first, a brief descriptor, and a site referral loop (“Wikipedia, the free encyclopedia”) to round off the title tag / naming convention.

In addition, the keyword is also translated into a URL string on a subdomain to truly concentrate the ranking factors. The tactful use of subdomains is one SEO method to expand the exact-match domain / URL to encroach on a more relevant keyword, making a domain more specific to a topic.

There is virtually no limit to this on-page SEO tactic, as you can expand the focus of any website to broaden the relevance funnel using the subdomain tactic. With the right amount of links and content, you can scale the content and links pointing to each page in a website so that it functions as the preferred landing page, by consolidating internal and external links. This is known as the threshold or barrier to entry for that keyword; each keyword has a unique tipping point before it gains momentum and ranking power.

An example of how Wikipedia employs site architecture for optimal SEO value is:

  • The root domain as the base – which will require a sufficient amount of links to stem from.
  • The subfolder (the wiki folder is where the magic happens).
  • The Topic1 keyword becomes the first shingle in the title tag.
  • The Topic1 keyword becomes the H1 header tag to emphasize relevance.
  • Topic1 anchor text from other pages all links to the topic’s landing page.
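The triangulated naming convention above (keyword first, then the site referral loop) can be sketched roughly like this; the function names and example strings are illustrative assumptions, not Wikipedia’s actual templating:

```python
def title_tag(keyword, referral_loop):
    """Keyword / search term comes first (the first 'shingle'),
    then the site referral loop rounds off the title tag."""
    return "%s - %s" % (keyword, referral_loop)

def h1_tag(keyword):
    """The same keyword is repeated as the H1 to emphasize relevance."""
    return "<h1>%s</h1>" % keyword

print(title_tag("Search engine optimization", "Wikipedia, the free encyclopedia"))
# Search engine optimization - Wikipedia, the free encyclopedia
print(h1_tag("Search engine optimization"))
# <h1>Search engine optimization</h1>
```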

Yet there is a hidden layer of SEO (the wiki folder) that most do not witness, and it is responsible for the prominent rankings the site’s architecture produces.

What I am referring to is the other pages in the subfolder and the non-indexed pages responsible for shifting ranking factors, which allow the webmaster to add one more layer of relevance by controlling the anchor text that feeds the main silo / subfolder or landing page.

Naturally, this can be implemented on semantics alone, or a simple PHP script will suffice to concentrate ranking factors in your website’s content management system. The only thing you need to maintain buoyancy for hundreds or thousands of pages is a pipeline capable of shifting link weight from page to page; in this instance, the subfolders within the subdomains become the preferred landing pages.
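The pipeline idea — supporting pages shifting their link weight to a preferred landing page — can be sketched with a simplified PageRank-style iteration. This is a toy model under my own assumptions (uniform damping, a five-page silo), not the author’s actual script:

```python
def link_weight(graph, iterations=20, damping=0.85):
    """Toy PageRank-style iteration: `graph` maps each page to the
    pages it links to; returns an approximate weight per page."""
    pages = list(graph)
    weight = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = damping * weight[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        weight = new
    return weight

# Four supporting pages in a silo all feed the preferred landing page;
# the landing page links back to one supporter to keep weight circulating.
silo = {
    "landing": ["support1"],
    "support1": ["landing"],
    "support2": ["landing"],
    "support3": ["landing"],
    "support4": ["landing"],
}
w = link_weight(silo)
assert w["landing"] == max(w.values())  # weight concentrates on the landing page
```

The internal link structure alone is enough to decide which page in the silo accumulates the most weight, which is the whole point of sculpting.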

In this instance, using the subdomain for English (as an example) provides the ability to funnel ranking factors from page to page, yet still keep the English version separate from the Spanish version, and so on and so forth.

In the past, the downside of this strategy was that each subdomain is considered its own site. Now, this becomes an asset, as you can essentially determine how you feed your pages and subsections (all based on a keyword) from one to the next. Also, the type of anchor text you use to feed specific landing pages will determine how they fare in the search engine result pages.

For example, by using custom sitemaps (based on semantic clusters) you can funnel specific anchor text to specific pages to elevate prominence and relevance. All pages corresponding to a particular keyword could be fed with a second / alternative modifier or qualifying term to promote keyword stemming.
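A custom sitemap built from semantic clusters might be modeled like this: group keyword variants under a shared head term, then feed each page with modifier variants as anchor text. The cluster data and helper names are illustrative assumptions:

```python
from collections import defaultdict

def cluster_keywords(keywords):
    """Group keyword phrases by their shared head term (first word)
    to form semantic clusters for a custom sitemap."""
    clusters = defaultdict(list)
    for phrase in keywords:
        head = phrase.split()[0].lower()
        clusters[head].append(phrase)
    return dict(clusters)

def anchor_variants(phrase, modifiers):
    """Feed a page with alternative qualifying terms to promote stemming."""
    return [phrase] + ["%s %s" % (phrase, m) for m in modifiers]

kws = ["SEO consulting", "SEO consultant", "SEO consulting services",
       "link building", "link sculpting"]
clusters = cluster_keywords(kws)
# clusters -> {"seo": [...3 phrases...], "link": [...2 phrases...]}
```

Each cluster then becomes one miniature sitemap, with `anchor_variants` supplying the rotated anchor text that feeds its pages.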

The site: search operator combined with a keyword can provide ideas for semantically themed pages that correspond to a virtual site architecture within a website.

Once you have a list of semantically coherent pages (based on keyword research), you can nurture them in one place by implementing a primary point of convergence: the sitemap or hub page.

By using robots.txt or the noindex, follow meta tag, you can build sitemaps and landing pages designed to group clusters of concepts, keywords, other landing pages or subjects in one central place, where you can feed multiple pages from one entry point.
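A hub page of this kind is just an HTML sitemap carrying a noindex, follow robots meta tag, so it passes link weight without ranking on its own. A minimal generator sketch (the page title and URLs are made up):

```python
def hub_page(title, links):
    """Render an HTML sitemap hub: noindex keeps the hub itself out
    of the index; follow lets its links still pass weight."""
    items = "\n".join(
        '  <li><a href="%s">%s</a></li>' % (url, anchor)
        for anchor, url in links
    )
    return ("<html><head>\n"
            '<meta name="robots" content="noindex, follow">\n'
            "<title>%s</title>\n"
            "</head><body>\n<ul>\n%s\n</ul>\n</body></html>" % (title, items))

page = hub_page("Consulting Hub", [
    ("SEO consulting", "/consulting/seo-consulting/"),
    ("SEO consultant", "/consulting/seo-consultant/"),
])
assert 'content="noindex, follow"' in page
```

Every supporting page links to this one entry point, and the hub fans the weight back out with controlled anchor text.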

By managing the supporting pages (which all link up to the top-level landing page to transfer their authority), you can sculpt up to 70% of the ranking factors for any given keyword. As a result, the surplus ranking factors begin to spill over and strengthen the domain they are hosted on (which in turn feeds more pages, which rank higher, and so on).

Eventually you have dozens, hundreds or thousands of pages in a site that all have PageRank, or the ability to pass ranking factors from one page to the next. By their very nature, the individual pages are optimized from the onset, and when combined they represent a ranking juggernaut as each page develops trust rank and authority.

The aggregate ranking factors for each page begin to stem and expand, which means a page can be found for any two- or three-word combination it contains when a related search query is executed in search engines.

What you have at that point is a website capable of ranking for multiple keywords simultaneously, showcasing the tip of the iceberg (the ideal landing page) built specifically as a consolidation of the keyword / topic, capable of ranking on a fraction of the links required by a website that does not employ superior / coherent site architecture.

To summarize, internal links fueled by external links to one concentrated point, and then augmented by deep links to the top-level landing page, have the ability to rank on fumes compared to a website that employs less efficient site architecture.

Which means that (a) the more topical information you have on a topic the better, (b) you can elect which pages are SEO savvy and appear (as a result of internal linking), and (c) there is virtually no limit to the size or reach of the website’s semantic theme.

Obviously, the more concentrated it is, the better (as it will require fewer links to cross the tipping point). Understand that content and external links both produce ranking factors, so it is possible for a website such as this to produce 60-80% of its own ranking factors by default (with more pages gaining strength, PageRank and stemming daily).

So, the takeaway for the tactic is:

1)      Group landing pages or themes virtually by using sitemaps to stand in as pipelines to funnel link flow.

2)      Build it properly from the onset, or consider mapping out a more conducive site architecture and 301 redirecting legacy pages to the new themed and siloed pages.

3)      Group primary keywords along with secondary supporting pages in miniature sitemaps to concentrate on a core group of key-phrase shingles.
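The 301-redirect step in point 2 could be expressed as a small map from legacy URLs to the new siloed URLs, emitted as Apache mod_alias rules; the paths here are hypothetical:

```python
def redirect_rules(legacy_to_new):
    """Emit Apache mod_alias 'Redirect 301' lines mapping legacy
    pages to their new themed / siloed locations."""
    return "\n".join(
        "Redirect 301 %s %s" % (old, new)
        for old, new in sorted(legacy_to_new.items())
    )

rules = redirect_rules({
    "/old-consulting.html": "/consulting/seo-consulting/",
    "/services.php": "/consulting/seo-consulting-services/",
})
print(rules)
```

A permanent (301) redirect forwards most of the legacy page’s accumulated link weight to its replacement in the new silo, so old inbound links keep feeding the architecture.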

Taking point 3 from above, you could take keywords like SEO consulting, SEO consultant and SEO consulting services and feed them via a virtual sitemap linking them together (regardless of their location in the site architecture).

However, if the pages were in a subfolder (consulting) within a subdomain, for example, and used internal links which all point to the sitemap and the sitemap to them, each page would share a percentage of total link flow for that topical range of key phrases and modifiers, essentially exceeding the threshold of relevance on all layers.
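The reciprocal linking just described (each page to the sitemap, and the sitemap back to each page) can be modeled as a simple link graph, regardless of where the pages live in the architecture; the URLs are hypothetical:

```python
def virtual_sitemap(sitemap_url, page_urls):
    """Build the reciprocal link pairs: every page links to the
    sitemap, and the sitemap links back to every page."""
    links = [(page, sitemap_url) for page in page_urls]   # pages -> sitemap
    links += [(sitemap_url, page) for page in page_urls]  # sitemap -> pages
    return links

pages = ["/consulting/seo-consulting/",
         "/consulting/seo-consultant/",
         "/consulting/seo-consulting-services/"]
links = virtual_sitemap("/consulting/sitemap/", pages)
assert len(links) == 2 * len(pages)  # one pair of links per page
```

Because the hub is the only shared node, every page in the cluster is at most two clicks from every other, which is what lets them share link flow for the whole topical range.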

Then add deep links (links to each respective landing page) and you have the ability to catapult them all to the top of their respective shingles using a fraction of the links your competition is using.

And how do we know this, you might ask? Let’s just say we have done this before, “with stellar results”… The next layer would be to implement a series of RSS feeds based on the same type of infrastructure, publishing related content whenever new information is added, linked to or layered, to promote buoyancy for laser-focused keywords and key phrases – but that is another post in itself…


About Jeffrey_Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (An SEO Provider who now develops SEO Software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.

31 thoughts on “SEO, Subdomains, Site Architecture and Sitemaps”
  1. This tactic was famously used to pollute Google’s index with subdomain spam years ago, iirc.

  2. Sure thing, I wasn’t implying you should use it to stray off topic and spam irrelevant themes; it’s more of a sculpting tool to implement if you need the juice.

    Thanks for dropping in Danger…

  3. One last thing Danger, the real value in this technique is how you use sitemaps to funnel link flow back into key landing pages.

    I have had sitemaps outranking landing pages from the on page coherence of the semantic theme.

    A sitewide link to a sitemap with a noindex, follow will do to build dynamic ranking factors across the board.


  4. Indeed, I didn’t mean to imply that the technique was purely spam oriented. Clearly good spam has its roots in good SEO ;-) There are lessons to be learned all over the board.

    One quick note, just in case some of your readers are fairly new to the game – in this instance we’re talking about HTML sitemaps rather than those that use the XML protocol.

    With regard to noindex’ing and following said sitemaps, have you noticed anything in your research to suggest that pages that are noindexed carry less worth? This could be a logical assumption on Google’s part.

    Great post as usual :)

  5. I have not found that they pass less value. I would recommend allowing 30-45 days for the effect to take hold in full. The reason for the noindex, follow is that, without it, your sitemap steals the show and ranks on its own.

    The more concentrated the cluster, the better; and naturally, cap it at 40-50 links to avoid theme dilution or cannibalization of the effect.

    As you know, the only way to determine what the implications are is to test, test and test it again. When I used this tactic, it was very effective, granted the internal links were meticulously mapped out as well.

  6. Do what Jeffrey says, folks. See where he is ranked in Google for “SEO”?

    I’m signing my wife up to your blog, Jeffrey; she needs the vibe, and as most wives go, she won’t listen to me!! lol..


  7. SEO India says:

    Great post! Thanks for giving such useful information…

  8. Using semantic silos with close topic sharing will give good rankings.

  9. Good tips! I love thinking semantically; this is the next generation in search.

  10. Andy Hall says:

    I know I’m a little late to the party (almost fashionably) but I’d like to clarify something re: these HTML sitemaps.

    Let’s say I have the following structures:


    Would I use two separate sitemaps like so:

    And I take it each sitemap only links to the pages within its own “directory” (so to speak).

    Would the sitemaps link to each other too?

    And would it work as well if I didn’t bother with subdomains?

    Cheers, hope I can resurrect this incredibly interesting post

  11. Marcus says:

    What worries me Jeffrey, is that you are now nowhere in the top 150 for “seo”

    Has something changed since 2009 when this article was written?

  12. @Marcus:

    We have had no desire to rank for that since then. We did, however, link to our homepage from the plugin with that keyword, and (despite it being nofollow) Google did not respond well to an influx of 350,000 inbound links to one page with the same anchor – since that time, our homepage has never ranked for that keyword.

    Now, could we fix that and rank another page? Yes. But honestly, it is not that important (other keywords were not affected), and we are essentially exiting the client model anyway, so there is no need to hit page one for such a competitive keyword if we have no need of offering a service.

    Now, SEO Consulting Services, on the other hand, has been #1 for years… and makes more sense to us. So I never really looked back, and put more emphasis on clients rather than our site.

  13. Love this post and everything else I’ve read the last few hours on your site.

    I have a few questions:

    What plugin do you use for your html sitemaps and do you link between the various html sitemaps?

    Is it possible to produce multiple XML sitemaps and would that be desirable?

    Thanks in advance for the great work!
    Michael Florin

  14. Extremely decent article. I became aware of your site and wished to mention that I have really enjoyed checking out your blog.

Comments are closed.