With the top 10 as the benchmark, SEO is all about performance; there are no gray areas when measuring it. Either it works, or it does not.
The key is to understand how and why problems arise and how to assess the circumstances that keep a site or page from rising in the SERPs (search engine result pages).
Determining factors include:
Spider Traps – Ensuring there are no hiccups in the code for pages loaded dynamically from a database is critical, because such errors are one of the primary ways sites hemorrhage ranking potential through poor canonicalization. For example, if a PHP programming error causes your data to generate multiple title tags, you could end up with several versions of a single page within your site, all fighting each other for link juice and authority.
If that occurs, the whole site (or that page) suffers as a result of potential penalization for multiple occurrences of the same content in the search engine's index.
Clutter and more of the same does not add points to your relevance score; on the contrary, it invokes algorithmic suppression (applied to maintain the highest quality in the search engine index), and your pages end up suppressing each other.
For this reason, ensuring that the title and description tags on every page are unique is essential; otherwise you are shooting holes in your site's ranking potential through a simple oversight such as a spider trap or duplicate content.
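One way to catch this kind of oversight before the search engines do is a quick crawl-side check for repeated title tags. A minimal sketch in Python, using only the standard library (the URLs and HTML passed in are hypothetical placeholders; in practice you would feed it the pages from your own crawl):

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collects the text of every <title> tag found on a page."""
    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles.append(data.strip())

def find_duplicate_titles(pages):
    """pages: dict mapping URL -> HTML source.
    Returns a dict mapping each duplicated title to the URLs that share it."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        for title in parser.titles:
            seen[title].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}
```

Any title that maps to more than one URL (or any page whose parser collects more than one title) is a candidate for the duplicate-content problem described above.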
Server Issues – Loading time (keep it lean); whether the IP address is shared (hosting multiple sites) or dedicated (an exclusive IP or server for your site); and whether it has been blacklisted in the past because someone abused it. Sharing hosting space (on the same IP) with an offending site is also a mark against your site. All of these slight nuances matter and have an impact.
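Loading time and server responses are easy to spot-check yourself. A minimal Python sketch using the standard library; the two-second budget is an illustrative assumption, not a published threshold, and the `opener` parameter is only there so the function can be exercised without a live site:

```python
import time
import urllib.request

def check_response(url, max_seconds=2.0, opener=urllib.request.urlopen):
    """Fetch a URL, timing the request and capturing the HTTP status
    and Server header. Returns a small report dict."""
    start = time.monotonic()
    with opener(url) as resp:
        status = resp.status
        server = resp.headers.get("Server", "unknown")
    elapsed = time.monotonic() - start
    return {
        "elapsed": elapsed,          # seconds taken to fetch the page
        "status": status,            # e.g. 200, 301, 404
        "server": server,            # Server header, if the host reveals one
        "within_budget": elapsed <= max_seconds,
    }
```

Run it against your own pages periodically; a status other than 200 on a live page, or a response that blows the time budget, is the kind of nuance that quietly costs rankings.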
Coding Choices – Do your pages use SEO-friendly methods or not? For example, coding that is conducive to optimization standards: using .htaccess files (for the main site or each subfolder) to control how each page is loaded or treated by spiders, using robots.txt (to control which pages are included or excluded), ensuring your server headers return the appropriate response, and optimizing your internal links properly can all have an impact on search engine rankings.
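To make the .htaccess and robots.txt points concrete, here are two short sketches. The hostname and paths are placeholders, not recommendations for any particular site; a 301 redirect that collapses www and non-www variants onto one hostname is a common way to resolve the canonicalization issue discussed under spider traps (this assumes an Apache server with mod_rewrite enabled):

```
# .htaccess — force a single canonical hostname (example.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

```
# robots.txt — keep duplicate-prone dynamic URLs out of the index
# (the paths below are placeholders for whatever sections generate duplicates)
User-agent: *
Disallow: /search/
Disallow: /print/
```

With the redirect in place, every request for a www URL returns a 301 to the non-www version, so link juice consolidates on one set of pages, and the robots.txt exclusions keep spiders from indexing duplicate views of the same content.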
With a trained eye, it is not difficult to assess why a site is or is not ranking well in search engines, but getting to the place where you can diagnose each symptom and prescribe the right solution requires a firm grasp of SEO fundamentals.
Traffic alone does not conquer all; you need conversion and user engagement to seal the deal. Your pages may have all the right keywords, but if they send the wrong messages or imagery to convey the total package a visitor expects, they will still fall short. So remember to look beyond keywords for something much more concrete: performance.