Have you ever wondered why certain keywords rise faster than others? Or what to do when keywords plateau in the search engine result pages?
After performing SEO on thousands of websites over the years, I have been able to discern distinct patterns that affect why a keyword ranks and where.
For example, if a website is more than four years old, it has gained a natural degree of trust (assuming its pages were set to follow and index and were not blocked by robots.txt). Sites that have gained trust enjoy certain privileges that younger websites do not: for example, they respond to link building with less effort, due to that trust.
Although these observations are based on personal trial and error, circumstances such as the following all play a role:
- keyword discovery time (how search-friendly the site providing the link is)
- the age, authority, and link type of the site providing the backlink (whether the link points to the domain name or to an anchor / keyword)
- the amount of topical content on the target site
- whether the page is static or dynamic
- how prominent the keywords are (a) within the site and (b) within the page in question
Topical relevance (having clusters of information on a topic) is a plus; however, sometimes there is no way around waiting for certain factors to age and rise above their quarantine-like algorithmic repression in the SERPs.
I have seen phrases with hundreds of thousands of competing pages acquire a ranking with moderate effort, while other keywords or phrases with only ten thousand competing pages took months (even though they are less competitive).
The point is, if a keyword plateaus (gets stuck after rising so far), you can always look at some of the primary metrics used for producing rankings to assess which action to take to remedy the situation.
For example, suppose you are able to get a keyword to appear from off the grid (not in the top 1,000 pages) and climb 900+ positions, only to fizzle out and get stuck somewhere in the 30s (on the fourth page of search results). Then it is time to review the timeline and conditions you used, so you can apply the proper finesse to let it scale the last few pages and reach the top 10.
Performing steps like the following can help:
- determining how many internal links are flowing to the page from within your own site
- determining how many external inbound links point to the page
- determining what percentage of the keywords leading to the page overlap with the target phrase
- deciding whether you need to refine on-page factors: tweak titles and meta data, increase the keyword's density, adjust its prominence, or add bold or H1 tags
- going back to older links and flowing fresh link weight to them, so they pass more link weight to your pages
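To make a couple of the checks above concrete, here is a minimal Python sketch (standard library only; the sample HTML and query data are purely hypothetical) that counts internal versus external links on a page and estimates how much the queries currently leading to a page overlap with a target phrase:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Counts <a href> links that stay on the same domain vs. leave it."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs count as internal links.
        if host == "" or host.endswith(self.domain):
            self.internal += 1
        else:
            self.external += 1

def keyword_overlap(queries, target_phrase):
    """Share of words in the target phrase that also appear in the
    queries sending traffic to the page (0.0 to 1.0)."""
    target = set(target_phrase.lower().split())
    seen = {w for q in queries for w in q.lower().split()}
    return len(target & seen) / len(target)

# Hypothetical page snippet and query data, for illustration only.
html = ('<a href="/about">About</a>'
        '<a href="https://example.com/blog">Blog</a>'
        '<a href="https://other.net">Other</a>')
counter = InternalLinkCounter("example.com")
counter.feed(html)
print(counter.internal, counter.external)  # 2 1
print(keyword_overlap(["seo keyword tips", "rank seo"],
                      "seo keyword research"))  # 0.666...
```

In practice you would feed it the fetched page source and the query data from your analytics; a low overlap score suggests the page is attracting traffic for phrases other than the one you are targeting.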
So, we know that the following all have an impact on the algorithm: the ratio of inbound to outbound links for both the page and the site, the age of the site providing the link, the age and authority of the aggregate linking sites and how much link juice they pass, the on-page factors, the number of internal links, and the quality of the content.
However, sometimes if you just let things take their course (the real ingredients of SEO: time and synchronicity), you will find that even the most competitive keywords eventually give way and rise to the top, if you are aware of the nurturing process.
The takeaway here is to (1) start with a stable platform that is easily crawled by search engine spiders, (2) get the on-page factors as solid as possible from the outset, building each page to rank for a specific range of 2-3 keywords and their variations, and (3) build links moderately over time (12-35 inbound links per page) to acquire a competitive keyword.