What are the most critical SEO modifications to implement during the preliminary phases of optimizing a new website? A) sitemaps and site architecture, B) links, C) naming conventions, or D) all of the above?
A great deal depends on how many pages you have to work with, how old the site is, whether the site architecture has any elements worth salvaging, and so on; weigh those factors to determine which approach is required (a band-aid fix or a complete site overhaul) to accomplish the desired ranking objective.
Creating the Keyword/Landing Page Pecking Order
Earmarking specific pages for specific keywords requires a blueprint of how each keyword “fits in” to the planning and execution of the site.
For example, more competitive keywords (keywords with 1 million or more competing pages) should be baked into the architecture, naming conventions and content as much as possible, using an aggregate ranking signal from a collection of pages instead of over-optimizing any one page. This is done by selecting a logical champion page on which to concentrate your SEO efforts.
Usually, a simple Google search operator is enough to determine the degree of relevance your existing pages have for a specific keyword.
Based on this relevance and the keyword/page saturation and volume, you can determine whether your existing pages will suffice or whether you need fresh content to imbue your website with those keywords and cross the ranking tipping point.
For example, type the following search operator into the Google search bar:
site:yourdomain.com keyword
Note that there is no space between site: and the website, but there is a space between the website and the keyword.
You can then see which pages Google deems the most relevant, listed in sequence “by pecking order” in its index.
Hence, if you see “Results 1-10 of about 55” for the keyword following the site: command, then you know that the website has 54 other pages from which to build internal links (with that keyword in the anchor text) back to the new preferred champion page.
You can also use this technique to uncover how many pages your competitors are dedicating to their optimization efforts to acquire more competitive rankings. Researching the number of deep links each of those pages has can also reveal where their ranking signal comes from and why each page works as part of a cohesive optimization plan.
For example, if a competitor has 30 pages of content and 20 of those pages each have dozens of links from other sites, then by collectively pointing those pages (which are augmented by off-page SEO factors) toward one specific internal page within their own website, the elected page becomes the new preferred landing page, a.k.a. the champion page, for the apex keyword.
If your current optimization efforts are falling flat, then perhaps your website simply doesn’t have enough content (relevant pages) on the topic to be considered a candidate for that ranking. This is the most common mistake: attempting to optimize for a keyword that rarely appears within the content of the website.
Yes, it can be done, but it is much more difficult to transpose ranking signals from authoritative sources outside a website to override a search engine’s natural tendency to extract term frequency from the website itself when deeming a page relevant.
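To make the term-frequency idea concrete, here is a minimal Python sketch (not part of any SEO toolset; all names are illustrative) that ranks a folder of local HTML pages by how often a keyword appears in their visible text — a rough, offline approximation of the site: pecking-order check used to pick a champion page:

```python
import re
from html.parser import HTMLParser
from pathlib import Path

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def keyword_frequency(html, keyword):
    """Count whole-word occurrences of the keyword in a page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return len(re.findall(r"\b%s\b" % re.escape(keyword.lower()), text))

def rank_pages(folder, keyword):
    """Return (filename, count) pairs, highest term frequency first."""
    scores = [(p.name, keyword_frequency(p.read_text(encoding="utf-8"), keyword))
              for p in Path(folder).glob("*.html")]
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

The page at the top of the list is the natural champion-page candidate; pages further down are the supporting pages to link from.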
Implement Consistent Internal Links
Implementing internal links means using any existing occurrence of a keyword, or selectively rewriting the copy to incorporate specific keywords on critical pages, and then linking those occurrences to the preferred landing page.
The point is to consistently link to a preferred page with preferred anchor text (the text in the link) to sculpt the appropriate on-page authority within the domain itself.
As a result, any off-page SEO factor added to this type of on-page optimization (links from other sites) dramatically improves search engine positioning for the page/keyword combination.
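The audit behind this step can be sketched in a few lines of Python. Assuming a folder of static HTML files and a hypothetical champion file name (both are placeholders, not anything prescribed above), this flags pages that mention the keyword but do not yet link to the champion page:

```python
import re
from pathlib import Path

def suggest_internal_links(folder, keyword, champion):
    """List pages that mention the keyword but do not yet link to the
    champion page - candidates for keyword-anchored internal links."""
    pattern = re.compile(r"\b%s\b" % re.escape(keyword), re.IGNORECASE)
    candidates = []
    for page in Path(folder).glob("*.html"):
        if page.name == champion:
            continue  # the champion page does not link to itself
        html = page.read_text(encoding="utf-8")
        mentions_keyword = pattern.search(html) is not None
        already_linked = ('href="%s"' % champion) in html
        if mentions_keyword and not already_linked:
            candidates.append(page.name)
    return sorted(candidates)
```

The naive href check only matches a plain relative link, which is enough for a quick pass over a small static site; a real audit would resolve absolute and relative URLs before comparing.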
Implement Optimal Site Architecture
Implementing optimal site architecture means removing any off-topic page names or folder names, and quelling a rampant content management system that uses gibberish to represent the pages in your website.
Clearly stated, if you have a page named P=ID287364.aspx instead of my-main-keyword.aspx (using hyphens to separate words in a logical naming convention), then you are losing a valuable ranking factor known as allintitle positioning.
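A hyphenated naming convention like the one above is easy to produce automatically. As a sketch, this small Python helper (the .aspx extension is only the article's example, not a requirement) turns a page title into a keyword-friendly slug:

```python
import re

def slugify(title, extension=".aspx"):
    """Convert a page title into a hyphen-separated, keyword-friendly
    file name, e.g. 'My Main Keyword' -> 'my-main-keyword.aspx'."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse spaces/punctuation into hyphens
    return slug.strip("-") + extension
```

Running the keyword phrase itself through such a helper keeps page names, folder names and URLs aligned with the keywords you are targeting.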
There are three primary metrics that search engines use to assess relevance (in addition to dozens of others): allintext, allintitle and allinanchor. Each doubles as a Google search operator you can use to probe the corresponding signal.
Allintext – determines how much authority a given keyword has in the body text of a website’s pages collectively (probe it with allintext: keyword).
Allintitle – determines how much authority a given keyword has based on occurrences of that keyword in the titles of the collective body of documents (probe it with allintitle: keyword).
Allinanchor – determines how much authority a website has, out of all the websites in the search engine’s index, based on the links pointing to it with that keyword as anchor text (probe it with allinanchor: keyword).
Use Sitemaps to Increase Crawl Frequency
You should implement sitemaps for newly created areas of your website, for subdomains, or for the site as a whole (breaking out each subfolder if a large website has multiple branches of topics).
The reason is that sitemaps promote discovery by search engines, and until your pages are discovered, they cannot contribute any collective ranking signal to your website.
Helping search engine spiders find, devour and pass along your content to the index is the prime directive, and discovery is the first thing standing between stagnant rankings and an improved position after on-page changes have been implemented.
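A sitemap itself is just a small XML file following the sitemaps.org protocol. As an illustrative sketch, this Python snippet builds a minimal one from a list of URL/last-modified pairs using only the standard library (the example URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list
    of (loc, lastmod) tuples and return it as a string."""
    ET.register_namespace("", SITEMAP_NS)  # emit the protocol namespace as default
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % SITEMAP_NS).text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

The resulting file is typically saved as sitemap.xml in the site root and submitted to the search engines’ webmaster tools, with one such file per subfolder or subdomain if you are breaking the site into branches.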
For more information about sitemaps and how to utilize them, the post SEO Techniques to Improve Rankings elaborates on their importance and execution.
SEO should be approached scientifically, with the understanding that search engines only react to the signals your pages produce, and a large part of that signal is on-page optimization (which is completely under your control).
Manage expectations, and understand that each stage needs to work in tandem with the other critical layers to produce a top ranking (depending on the keywords you target).
So, targeting keywords within your reach is the logical way to increase traffic systematically while building a rapport with search engines, which will allow your website (when it is time) to topple the more competitive phrases you set your sights on.
It is merely a matter of 1) clear intent within a website through naming conventions, 2) keyword-rich folders, 3) descriptive titles and meta-data, 4) consistent internal links, 5) the use of sitemaps to increase crawl frequency and 6) blending in deep links from other sites to your landing pages that will transform a website from lackluster to blockbuster in search engines.
Each website has its own tipping point; however, by collectively managing the layers alluded to above, it is only a matter of time before the baseline threshold improves and a website is able to produce rankings for more competitive keywords with a mere fraction of the effort.