What SEO metrics carry the most weight? Search Engine Optimization (SEO) is all about balance within a virtual ecosystem of variables. If one recedes, the infrastructure supporting a competitive keyword can crumble, creating fluctuations in the SERPs (search engine result pages).
Here are some of the metrics we pay attention to, and why.
- The total number of pages in a website
- The aggregate number of pages targeting specific keywords
- The ratio of internal links a page has from other (themed and related) pages within the site
- How many external inbound links a page has from other (authoritative or themed) websites
- Site age: how old the site is and how long it has been online
- How many visitors the site receives from referrals
- The type of referrals (authentic or automated)
- The relevancy and relationship of the content (coherence)
- How frequently the site is updated (new content)
- How frequently pages are revised and / or updated
Total number of pages: When it comes to relevance and rankings, size matters. The more content you have related to a theme, the more opportunity you have to cross-link those pages to support a common goal. The number of pages you need to stave off your competitors depends on the keywords you are targeting and the average volume of pages required to offset their degree of traction, authority and relevance.
Aggregate number of pages targeting a keyword: Using the Google search operator site:website.com keyword, you can quickly assess the number of pages a competitor has for any keyword or key-phrase combination using shingle analysis. This gives you a ballpark volume of the pages you will need to topple others below the top-10 threshold. We typically look at the metrics from the top 3 websites holding down the top 5 results, or the first 3 commercial (non-educational) sites, to gain perspective on this threshold.
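The operator query and the ballpark estimate above can be sketched in a few lines. This is a minimal illustration, not a real competitor analysis: the domains and page counts are hypothetical placeholders, and actually retrieving the counts would still mean running the queries in Google yourself.

```python
# Sketch: build a "site:" operator query and average competitor page counts.
# The domains and counts below are hypothetical placeholders, not real data.

def site_query(domain: str, keyword: str) -> str:
    """Return the Google search string that counts a competitor's pages."""
    return f"site:{domain} {keyword}"

def pages_needed(competitor_counts: list[int]) -> int:
    """Average the indexed-page counts of the top competitors as a ballpark."""
    return round(sum(competitor_counts) / len(competitor_counts))

print(site_query("competitor.com", "seo metrics"))  # site:competitor.com seo metrics
print(pages_needed([120, 85, 200]))  # 135
```

Averaging the top three competitors, as the paragraph suggests, gives a rough page target rather than a precise number; treat it as a floor, not a guarantee.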
The ratio of internal links: For every page with a keyword occurrence, there is a page within your website that could gain a link from it. If you understand the value of co-occurrence, you can use internal linking to build links to your preferred landing pages, which means you can rank with less dependency on external inbound links from other sites.
The number of external links to each page: We refer to this as the deep link ratio, and it is always a good idea to have at least 10 inbound links to individual pages. This ensures that each preferred landing page has enough off-page ranking signal to trigger the search engine algorithms that use this metric as a basis for a page's integrity. The more quality, themed or authoritative links a page has, the more clout and SERP relevance it can garner. In other words, it can rank higher with less dependency on on-page factors.
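The deep link ratio can be expressed as the share of inbound links that point at internal pages rather than the homepage. A minimal sketch, with hypothetical link data:

```python
# Sketch of the "deep link ratio": the share of inbound links that point
# at internal pages rather than the homepage. Link targets are hypothetical.

def deep_link_ratio(inbound_links: list[str], homepage: str = "/") -> float:
    """Fraction of inbound links targeting pages other than the homepage."""
    deep = [url for url in inbound_links if url != homepage]
    return len(deep) / len(inbound_links)

links = ["/", "/guide", "/guide", "/pricing", "/"]
print(deep_link_ratio(links))  # 0.6
```

A higher ratio suggests the link profile supports individual landing pages rather than concentrating all authority on the homepage.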
Site Age: We often refer to this as the grandfather effect: the older a website is, the more likely it is to have acquired trust from search engines. If you survey the majority of sites holding down top ranking positions, the average age of the site is an indicator of authority. It is not always the case, but typically the older a site is, the better for SEO (and the more resilient it is).
Number of visits: Popular sites have a higher crawl frequency, which means more opportunity to rank new pages or reinforce preferred pages with additional content based on a keyword cluster or market segment.
Content Relevance: A coherent outline supported by logical site architecture is one way to score extra points with search engines. The more themed the content is, the easier it is for your site to encroach on a niche, segment or market: the global term weights represented by each page shore up the theme of the site, subfolder or subdomain.
Post Frequency: Although this is not a constant, we are strong believers in consistency of post frequency to expedite authority for a domain. There are numerous metrics in play in developing an authority site (which is the basis of SEO); however, time and lots of themed pages are the spine of this strategy. Adding a new page a day will keep you in the game, but 3-5 pages per day or more will allow you to use those pages in tandem to target multiple keywords simultaneously.
Revisions to legacy content: Revising legacy pages is one of the most overlooked aspects of fine-tuning rankings for search engine optimization. The first time a page is indexed it makes an impression; however, you can add layers to it over time, and the ghost content (which may no longer exist on the page) can still surface, because search engines keep cached versions of the page's data in the repository used to tabulate each search.
In other words, optimizing legacy content in tandem with unified new content and internal and external links equals pure dynamic ranking potential, if wielded with intent.
A new generation of SEO metrics, that’s how. Gauging your success on your positions in the search engine results pages is so last century.
New SEO paradigms, such as the “long tail” and personalized search, call for new key performance indicators (KPIs).
@Vista bay,
I always thought spamming blog comments with Adsense (keyword spam) sites, was so last century?
Who knew…
In your expert opinion, which metrics are “new generation”? Since SERPs alone were not the focus, I am curious to hear your perspective.
I am confused about some of these metrics. If Google also considers traffic as a ranking factor, then the sites already at the top would always stay on top, since top sites get more hits than unranked sites.
Again, if Google ranked a site according to update frequency, then in every niche the blogs should come first.