Word Count and SEO – How Many Words are Enough?

Ken Lyons from metarocketseo.com recently raised a valid point in a comment on our post SEO and Internal Linking, asking how we arrived at the number of words required to effectively rank a web page using SEO.
The comment was: “You say: ‘Obviously, the more content you have the better (250-300 unique words per page are ideal)…’ I’m curious as to how you arrived at this word count per page as ideal? I’ve ranked pages with 60 words and pages with 600.

To me, it seems to be more about what your competition is doing in terms of content volume and less about ideal word count. I’d love to hear your thoughts.”

Ken, great points. There are multiple SEO ranking factors, and one of the more prominent questions is: how many words does a page need to acquire a competitive ranking through organic SEO?

I wanted to address the reply from a few distinct standpoints. The first impression your page makes in the index (when search engines discover your page) is important. Also, every time you edit that page, you either raise or lower its relevance score, based on the reputation that page has.

Making a strong first impression can set the tone for how that document is treated. For example, if there are enough instances of a key term, then you have more leverage to wean the page off that term later and supplement it with alternative tactics (off-page reputation via links, internal links, etc.).

It used to be possible to rank pages with no relationship to the keywords (like a blank page or a page with a few sentences); however, Google has since corrected this portion of its algorithm.

I typically suggest 500-750 words per page for clients (for their preferred landing pages), or at least 250-300 unique words, to set the tone through the keywords and the co-occurrence they produce. The idea is to use related synonyms and supporting keywords (based on a theme) while targeting the primary keyword with that page (in the title, the h1, the first 25 words on the page, etc.). Why, you might ask? Two reasons: on-page and off-page reputation.
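To make the checklist above concrete, here is a minimal sketch of an on-page audit: word count, unique word count, and whether a keyword appears in the title, the h1, and the first 25 words. This is an illustration I am adding, not a tool from the original post; the regex-based HTML stripping is a deliberate simplification, and a real audit would use a proper parser.

```python
import re

def audit_page(html: str, keyword: str) -> dict:
    """Rough on-page audit: word count, unique words, and whether the
    keyword appears in the title, the h1, and the first 25 body words.
    Sketch only -- real pages need a real HTML parser."""
    kw = keyword.lower()

    def first_match(pattern: str) -> str:
        m = re.search(pattern, html, re.I | re.S)
        return m.group(1).lower() if m else ""

    title = first_match(r"<title[^>]*>(.*?)</title>")
    h1 = first_match(r"<h1[^>]*>(.*?)</h1>")

    # Strip tags to approximate the visible text of the page.
    body = re.sub(r"<[^>]+>", " ", html)
    words = re.findall(r"[a-z0-9']+", body.lower())

    return {
        "word_count": len(words),
        "unique_words": len(set(words)),
        "kw_in_title": kw in title,
        "kw_in_h1": kw in h1,
        "kw_in_first_25": kw in " ".join(words[:25]),
    }
```

Running this against a draft landing page quickly shows whether you are anywhere near the 250-300 unique-word floor and whether the keyword holds the prominent positions mentioned above.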

You can always go back and edit the content to ensure it serves its purpose as a landing page. However, in order to capture the maximum number of phrases that can funnel visitors back to that page (from a variety of keywords), you need to make changes over time.

For consideration, you can think of it in terms of:

1) Contextual reference
2) Topical relevance
3) Archive Value

When deciding what word count is appropriate and where the tipping point for a page lies, aside from competition, your own page needs some type of affinity and relevance to the search terms you wish to be found for. The amount and type of content (including word count) are your first opportunities to sculpt this effect with search engines.

When a ranking is produced in a Google search, it is not just the current factors (existing links, existing content) that contribute to the way that page ranks. Various iterations of the page have all left impressions on the index, and those impressions are used to categorize the range of keywords considered relevant for the search terms that embody the context of the document.

Magnify this by the fact that ever since your page (or pages) came online (or was indexed in search engines), every revision has been saved, cached, and stored in a data cloud over time.

This repository of edits, content, and links is calculated into a relevance score and the amount of authority your page has for a given keyword combination. However, there is one twist: search patterns can be augmented by content that no longer exists on the page at present.

The content you see on a page now is not necessarily what is responsible for its present ranking. So, you see, this is where perception can work against you if you are simply skimming a web page topically while conducting a competitive analysis.

Sure, you know what the page is now, but how many cycles of modifications, edits, and unique shingles have passed through the page that are really responsible for its aggregate ranking?

So, the value of creating context and ranking for lesser terms or low-hanging fruit is two-fold: one, to create the reputation, to get that data into the cloud (across multiple data centers, all showing unique snapshots of the content); and two, to eliminate the need for having it on the page later.

Consider that with this one SEO technique alone, you can essentially replace PPC advertising, if you understand the depth and implications it contains. Once you rank for a specific keyword and achieve the velocity to give a page a high relevance score, that page will have a relationship with that keyword that makes it easier to build synergy with later.

So, you can add layers of optimization to a page; then, after it has been indexed for a set period of time, go back and modify the page to optimize it for visitors, pushing the search engine optimization factors into more concentrated components (like the meta description, a tag, or the title), which is enough to sustain the on-page relevance.

For example, our website was optimized for the keyword SEO Visibility for nearly a year. The keyword did not yield fruitful results, so we decided to remove the references from our home page. Guess what? The page still ranks for that keyword (depending on which data center feeds the results), despite the fact that the word visibility has been completely removed from the page. However, since the word SEO is still on the page with enough frequency, the algorithm uses that as the new anchor to produce the ranking.

The implication is that you can optimize a page for a term, then change the page over time (once you hit the top 10) to make it the ideal landing page for conversion, without the long 1,000-word thesis/encyclopedia summary attached. Supporting pages can carry the weight of relevance as well, which means you can prop the page up with as many internal pages, or whatever combination of internal pages and external links, it needs to remain relevant and buoyant.

On Page Edits / The Application

So, say you start with a 1,000-word page. You let the page get indexed (with one series of titles, meta data, and prominent keywords); then, after it has been indexed for a period of time, you go back and make revisions.

Each time you make a revision (adding a few keywords, changing the order of prominence, altering the meta or h1 tags, etc.), let the page get indexed and then take a snapshot as a backup (just in case you need to conduct a keyword density analysis later to keep the ratios for specific keywords relative).
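The density analysis mentioned above can be approximated in a few lines: count how often each phrase occurs per 100 words of body text, and compare the ratios across your saved snapshots. This is a sketch I am adding for illustration; the "right" density is not a fixed number, and the phrase list is whatever you happen to be tracking.

```python
import re

def keyword_density(text: str, phrases: list[str]) -> dict[str, float]:
    """Occurrences of each phrase per 100 words of body text.
    Sketch only: density targets vary by keyword and competition."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    joined = " ".join(words)
    out = {}
    for phrase in phrases:
        # Normalize the phrase the same way as the body text.
        p = " ".join(re.findall(r"[a-z0-9']+", phrase.lower()))
        # Whole-word, non-overlapping matches only.
        hits = len(re.findall(rf"\b{re.escape(p)}\b", joined)) if p else 0
        out[phrase] = round(100 * hits / total, 2) if total else 0.0
    return out
```

Running this against the snapshot before and after each revision shows whether an edit shifted the ratios for the keywords you care about.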

Document size can be scaled while still maintaining the same keyword density and concentration of topical relevance. If you remove extraneous words, make sure the gist remains intact; if you don't need a word on your landing page, then don't use it. Sculpting your content and message takes time, but each time you make a revision, your page gets the benefit of both what it was and what it is now.

Even one word found anywhere on a page is enough to allow that page to rank for multiple variations of keywords (even if they are not on the page now). The same applies to links (once present, they leave a mark), both internal and external (since they are now algorithmically fused with your document based on the ratio of link flow transferred as a result of the transaction); but that is another post in its entirety.

Under this premise, you can specifically observe the cycles of indexing (how frequently a page gets picked up and re-indexed). If you want, you can grab a shingle (5-10 words in exact match) from that page, add it to Google Blog Search as an alert, and then, when the page gets indexed, you are notified via email.

To reiterate, just take a unique sentence (from something you just changed), add it to the search "in quotes," and then let Google alert you when it crawls the page.
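The shingle-grabbing step can be sketched as follows: pull a run of contiguous words from the freshly edited text and wrap it in quotes for an exact-match alert query. This is my illustration, not code from the post; the `start` and `length` parameters are arbitrary choices, and the alert service you feed the query into (Blog Search alerts as described here no longer exist in that form) is up to you.

```python
import re

def make_shingle_query(text: str, start: int = 0, length: int = 8) -> str:
    """Pull an exact-match shingle (typically 5-10 contiguous words)
    from page text and wrap it in quotes for a search-based
    indexing alert. `start` and `length` are illustrative defaults."""
    words = re.findall(r"\S+", text)
    shingle = " ".join(words[start:start + length])
    return f'"{shingle}"'
```

When the quoted phrase starts returning your page, you know the revision has been crawled and indexed.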

Also realize that if you target attainable phrases with a specific page and that information is no longer on the page, you can target additional phrases, and the combination of the old and the new will allow your fresh content to rank higher (based on the ghost/data-cloud memory of the page).

An Exercise for Mapping out Co-occurrence for Keywords

Start with 10 instances of a keyword on the page, then reduce it to 5, then reduce it to 2 occurrences while shifting more emphasis to the meta title and description for the key phrases.

Or

Or start with one page, as most do, then add more pages of content (while controlling the flow of links). For example, if you have 20 pages in your site, all written to be the most useful to a reader on a topic, your pages can acquire academic relevance (much like the Wikipedia effect).

As a result, if you concentrate the outbound links leaving those editorial pages, they pass on their ranking factor to the target page (via internal links). This means that even if the target page has considerably less content than the pages linking to it, as long as it has an anchor point (a keyword, key phrase, or some combination present), it can rank in search engines for those keywords.

In other words, if those 20 pages all have a great deal of relevance and you cap the links leaving each page to just a few from the body text, then by pointing those links at your landing page (which may be 100-200 words and a contact page), you can still have that page take on the attributes of the pages linking to it.
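The link-flow idea in this section can be illustrated with a toy PageRank-style calculation: supporting pages that concentrate their outbound links on one target hand it their accumulated weight, even if the target is thin on content. This is a simplified teaching sketch that I am adding; the page names, damping factor, and iteration count are arbitrary assumptions, and real search engines use far more signals than raw link flow.

```python
def toy_pagerank(links: dict[str, list[str]], damping: float = 0.85,
                 iters: int = 50) -> dict[str, float]:
    """Toy PageRank over an internal-link graph. Illustrative only:
    shows how concentrated internal links pool weight on one page."""
    pages = set(links) | {t for outs in links.values() for t in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for t in outs:
                    new[t] += share
        # Pages with no outbound links redistribute their weight evenly.
        dangling = sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new[p] += damping * dangling / n
        rank = new
    return rank

# Five supporting article pages all link to one thin landing page.
graph = {f"article{i}": ["landing"] for i in range(5)}
ranks = toy_pagerank(graph)
```

In this toy graph the landing page ends up with more weight than any single supporting article, which is the "prop the page up" effect described above.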

Much like Adobe ranking for "click here" (just from having that as link text on millions of sites), on a smaller scale your site and the links pointing at it have the ability to shift the reputation and rankings of each page.

This is another clear example of internal linking, but with emphasis on the history of each page: how it evolved from its first introduction, and all of the ranking factors that have impacted it to make it what it is today (which is another page with the ability to pass on something very valuable to other pages in your site).

In case you are wondering how many words need to be on a page to capture a competitive ranking: the threshold is based on relevance and reputation. Each keyword has its own plateau; it is just a matter of finding it, and exceeding it, relative to your competition.


8 Comments

  1. ken lyons
    Posted February 2, 2009 at 5:28 pm | Permalink

    Jeff,

    Good post on word count, and thanks for the mention.

    This post actually sheds some light on indexing issues I’ve been having for one page in particular, where despite my best efforts to get the page ranked for a specific term, Google still displays another page for that term. My guess is because Google still associates the term with the other (legacy) page because the client said they did a lot of link building for that specific term with anchor text pointing to the other page. In short, it looks like I’m trying to undo what they’ve done with respect to the keyword-page relationship Google has established.

    Thanks,
    Ken Lyons

  2. Jeffrey Smith
    Posted February 8, 2009 at 11:12 am | Permalink

    Hey Ken:

    My suggestion is:

    1) use the existing page for the purpose of authority by creating a new page and cap the links on the original page to less than 10 outbound (including no-following navigation)

    2) then with the ideal balance of meta, on page and off page relevance (20 pages of internal links) based on the search command site:domain.com keyword , find 20 relevant pages to link to the new page from to give it the needed link weight.

    3) then build some links to the new page and observe as it takes on the attributes of the collective link flow from the internal and external ranking factors.

    This way, there are no legacy off-page factors standing between you and the top 10. The only thing you have to wait for is the process of osmosis to occur. Not to mention you can get a double listing in the meantime, which is a great way to increase conversion by 200%.

  3. Morten
    Posted January 28, 2010 at 6:14 pm | Permalink

    Hey
    Thanks for this information.
    I´m from Denmark …. so my English is not that good …. hope you understand.

    I have a question:
    Is it better to have a lot of page about the same issue …. I can see you have a lot page about “SEO” and you are using it in the title-tag on every page … is this not a bit risky …. or is it the way to do it?

  4. M. D. Vaden of Oregon
    Posted December 7, 2010 at 11:43 pm | Permalink

    This is the first opinion I’ve read on my current quest to study more about “unique words” on a page.

    Experimenting may be one of the best ways to find out for sure.

    MDV

  5. Jon Sterling
    Posted May 10, 2011 at 5:36 am | Permalink

    Great information, we are constantly tweaking our pages for better ranking, sometime though we shoot ourselves right in the foot in getting to aggressive.

  6. David
    Posted June 8, 2011 at 6:51 pm | Permalink

    The numbers doesn’t count.

  7. Jobs in Cyprus
    Posted August 22, 2011 at 2:02 am | Permalink

    It depends whether you are talking for the home page or secondary page. Definitely search engines don’t like to see an empty page full of links and on the other hand a large scrolling page that no one will bother reading. At least half of the page (in normal fonts) is enough. As long as there is no much repetition of the keywords (it will be considered spam). If you follow the above it will do the job!

  8. ingatlanok
    Posted February 11, 2013 at 5:35 am | Permalink

    This piece of writing will help the internet people for creating new weblog or even a weblog from start to end.

