Ken Lyons from metarocketseo.com recently raised a valid point in a comment on our post SEO and Internal Linking, about how we arrived at the number of words required to effectively rank a web page with SEO.
The comment was: "You say: 'Obviously, the more content you have the better (250-300 unique words per page are ideal)…' I'm curious as to how you arrived at this word count per page as ideal? I've ranked pages with 60 words and pages with 600. To me, it seems to be more about what your competition is doing in terms of content volume and less about ideal word count. I'd love to hear your thoughts."
Ken, those are great points. Among the many SEO ranking factors, one of the more prominent questions is: how many words does a page need to acquire a competitive ranking through organic SEO?
I wanted to address the reply from a few distinct standpoints. The first impression your page makes in the index (when search engines discover your page) is important. In addition, every time you edit that page, you either raise or lower its relevance score, based on the reputation that page has.
Making a strong first impression can set the tone for how that document is treated. For example, if there are enough instances of a key term, then you have more leverage to wean the page off that term later and supplement it with alternative tactics (off-page reputation via links, internal links, etc.).
It used to be possible to rank pages that had no relationship to their target keywords (such as a blank page or a page with only a few sentences); however, Google has since corrected this portion of its algorithm.
I typically suggest 500-750 words on a page for clients (for their preferred landing pages), or at least 250-300 unique words to set the tone through the keywords and the co-occurrence they produce. The idea is to use related synonyms and supporting keywords (based on a theme) to target the page's keyword (in the title, the h1, the first 25 words on the page, etc.). Why, you might ask? Two reasons: on-page and off-page reputation.
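If you want to check whether a page actually clears that 250-300 unique-word threshold, a quick count is easy to script. The sketch below is a minimal illustration in Python; the sample text and the simple word-splitting regex are my own assumptions, not a tool from this post.

```python
import re

def unique_word_stats(text: str) -> tuple[int, int]:
    """Return (total words, unique words) for a page's visible text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return len(words), len(set(words))

# Hypothetical landing-page copy used purely for illustration.
page_copy = (
    "Organic SEO takes time. Organic rankings build as search engines "
    "index each revision of your landing page content."
)
total, unique = unique_word_stats(page_copy)
print(f"{total} words, {unique} unique")
```

In practice you would run this over the rendered body text of the page, not the raw HTML source.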
You can always go back and edit the content to ensure it serves its purpose as a landing page; however, in order to capture the maximum number of phrases that can funnel visitors to that page (from a variety of keywords), you need to make changes over time.
For consideration, you can think of it in terms of:
1) Contextual reference
2) Topical relevance
3) Archive value
When determining the tipping point for a page's word count, aside from competition, your own page needs to have some type of affinity and relevance to the search terms you wish to be found for. The amount and type of content (including word count) are your first opportunities to sculpt this effect with search engines.
When a ranking is produced in a Google search, it is not just the current factors (existing links, existing content) that contribute to the way that page ranks. Various iterations of the page have all left impressions on the index, and those impressions are then used to categorize the range of keywords considered relevant for the search terms that embody the context of the document.
Magnify this by the fact that ever since your page (or pages) went online (or was indexed in search engines), every revision has been saved, cached, and stored over time.
This repository of edits, content, and links is all calculated into the relevance score and the amount of authority your page has for a given keyword combination. However, there is one twist: search patterns can be augmented by content that no longer exists on the page today.
The content you see on a page now is not necessarily what is responsible for its present ranking. So, you see, this is where perception can work against you if you are simply skimming a web page while conducting a competitive analysis.
Sure, you know what the page is now, but how many cycles of modifications, edits, and unique shingles have passed through the page that are really responsible for its aggregate ranking?
So, the value of creating context and ranking for lesser terms, or low-hanging fruit, is two-fold: one, to create the reputation and get that data in the cloud (across multiple data centers, each showing unique snapshots of the content), and two, to eliminate the need to have it on the page later.
Consider that with this one SEO technique alone, you can essentially replace PPC advertising if you understand the depth and implications it contains. Once you rank for a specific keyword and achieve the velocity to give a page a high relevance score, that page will have a relationship with that keyword that makes it easier to create synergy with later.
So, you can add layers of optimization to a page, then, after it has been indexed for a set period of time, go back and make modifications to optimize it for visitors, pushing the search engine optimization factors into more concentrated components (like the meta description, tags, or title), which is enough to sustain the on-page relevance.
For example, our website was optimized for the keyword "SEO Visibility" for nearly a year. The keyword did not yield fruitful results, so we decided to remove the references from our home page. Guess what? The page still ranks for that keyword (depending on which data center feeds the results), despite the fact that the word "visibility" has been completely removed from the page. However, since the word "SEO" is still on the page with enough frequency, the algorithm uses that as the new anchor to produce the ranking.
The implication is that you can optimize a page for a term, then change the page over time (once you hit the top 10) to make it the ideal landing page for conversion, without the long 1,000-word thesis/encyclopedia summary attached. Supporting pages can carry the weight of relevance as well, which means you can prop the page up with as many internal pages, or a combination of internal pages and external links, as it needs to remain relevant and buoyant.
On Page Edits / The Application
So, say you start with a 1,000-word page. You let the page get indexed (with one set of titles, meta data, and prominent keywords), then, after it has been indexed for a period of time, you go back and make revisions.
Each time you make a revision (adding a few keywords, changing the order of prominence, altering meta or h1 tags, etc.), let the page get indexed and then take a snapshot as a backup (in case you need to conduct a keyword density analysis later to keep the ratios for specific keywords consistent).
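That density analysis can be sketched in a few lines. Everything below is a hypothetical illustration (the `keyword_density` helper and the snapshot text are my own, not a tool referenced in this post); it simply compares the keyword ratio of a saved snapshot against the current revision.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words on a page that match the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical snapshot (pre-edit) vs. the current revision of a page.
snapshot = "seo tips for seo landing pages and seo audits"
current = "seo tips for landing pages and content audits"

print(round(keyword_density(snapshot, "seo"), 3))  # ratio before the edit
print(round(keyword_density(current, "seo"), 3))   # ratio after the edit
```

Comparing the two numbers after each revision tells you whether an edit diluted the keyword more than you intended.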
Document size can be scaled while still maintaining the same keyword density and concentration of topical relevance. If you remove extraneous words, make sure the gist remains intact; if you don't need a word on your landing page, for example, then don't use it. Sculpting your content and message takes time, but each time you make a revision, your page gets the benefit of what it was and what it is now.
Even one word found anywhere on a page is enough to allow that page to still rank for multiple variations of keywords (even if they are not on the page now). The same applies to links (once present, they leave a mark), for both internal and external links (since they are now algorithmically fused with your document based on the ratio of link flow transferred as a result of the transaction), but that is another post in its entirety.
Under this premise, you can specifically observe the cycles of indexing (how frequently a page gets picked up and re-indexed). If you want, you can grab a shingle (5-10 words in exact match) from that page, add it to Google Blogsearch, and then, when the page gets indexed, you are notified via email.
To reiterate, just take a unique sentence (from something you just changed), add it to the search "in quotes", and then let Google alert you when it crawls the page.
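Pulling that exact-match shingle out of your fresh copy is easy to automate. This is a minimal sketch under my own assumptions (the `extract_shingle` helper and the sample paragraph are hypothetical); it just grabs the first few consecutive words and wraps them in quotes for the alert query.

```python
import re

def extract_shingle(text: str, size: int = 8) -> str:
    """Return the first `size` consecutive words, for use as an
    exact-match ("in quotes") query when monitoring indexing."""
    words = re.findall(r"\S+", text)
    return " ".join(words[:size])

# Hypothetical paragraph you just added to the page.
fresh_paragraph = (
    "Each revision leaves its own impression on the index long after "
    "the original wording has been edited away."
)
query = '"{}"'.format(extract_shingle(fresh_paragraph))
print(query)
```

Paste the resulting quoted string into your alert service, and the first notification that fires tells you the new revision has been crawled.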
Also realize that if you targeted attainable phrases with a specific page, and that information is no longer on the page, you can target additional phrases, and the combination of the old and the new will allow your fresh content to rank higher (based on the ghost/data-cloud memory of the page).
An Exercise for Mapping out Co-occurrence for Keywords
Start with 10 instances of a keyword on the page, then reduce it to 5, then down to 2 occurrences, while shifting more emphasis to the meta title and description for the key phrases.
Start with one page, as most do, then add more pages of content (while controlling the flow of links). For example, if you have 20 pages on your site, all written to be the most useful to a reader on a topic, your pages can acquire academic relevance (much like the Wikipedia effect).
As a result, if you concentrate the outbound links on those editorial pages, they pass their ranking factor to the target page (via internal links). This means that even if the target page does not have the same amount of content, or has considerably less than the pages linking to it, as long as it has an anchor point (a keyword, key phrase, or some combination present), it can rank in search engines for those keywords.
In other words, if those 20 pages all have a great deal of relevance, you cap the links leaving each page to just a few from the body text, and you point those links at your landing page (which may be 1-200 words and a contact page), you can still have that page take on the attributes of the pages linking to it.
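To make the link-concentration idea concrete, here is a toy PageRank-style iteration. This is emphatically not Google's actual algorithm; the `link_flow` function, the damping factor, and the 20-article site map are all my own illustrative assumptions. It simply shows that when every editorial page points its links at one landing page, that page accumulates far more score than any individual article.

```python
def link_flow(links: dict[str, list[str]], iterations: int = 20,
              damping: float = 0.85) -> dict[str, float]:
    """Toy PageRank-style iteration over an internal link graph.

    `links` maps each page to the pages it links to. Dangling pages
    (no outbound links) simply leak score, which is fine for a sketch.
    """
    pages = set(links) | {t for targets in links.values() for t in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        nxt = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * score[page] / len(targets)
                for t in targets:
                    nxt[t] += share
        score = nxt
    return score

# Hypothetical site: 20 editorial articles, each linking only to the landing page.
site = {f"article-{i}": ["landing"] for i in range(20)}
scores = link_flow(site)
print(scores["landing"] > scores["article-0"])
```

The landing page can be thin on content in this model; it inherits its weight entirely from the pages funneling links into it, which mirrors the point above.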
Much like Adobe ranking for "click here" (just from millions of sites using that as their link text), on a smaller scale your site and the links pointing at it have the ability to shift the reputation and rankings of each page.
This is another clear example of internal linking, but with emphasis on the history of each page: how it evolved from its first introduction, and all of the ranking factors that have impacted it to make it what it is today (which is another page with the ability to pass on something very valuable to other pages on your site).
In case you are wondering how many words need to be on a page to capture a competitive ranking: the threshold is based on relevance and reputation. Each keyword has its own plateau; it is just a matter of finding it, and exceeding it, in contrast to your competition.