Has anyone else noticed that Google is crawling deeper into the web to discover relevance and link popularity for pages that have “the right stuff” to be included in its index? As an SEO, it is not uncommon to have multiple alerts set up and various monitoring tools active to assess the scope and range of your website's impact.
Approximately a week ago, I kept receiving alerts flagging pages as newly discovered content, even though those pages had been created months ago, on both our own site and multiple sites linking to ours. At first I thought it might be a glitch, but in reality it was just the spiders ingesting a larger piece of the web, one page at a time.
Do Crawl Rate and Crawl Frequency Matter?
To answer this bluntly: absolutely. The more topical pages you have referencing and reinforcing your theme (the topic of your website), the more relevance those pages can garner for related keywords.
This alone can translate into exponential traffic when you consider the difference between a page teetering on being deindexed (removed or demoted) and a very popular page that continually funnels visitors: the difference is reputation. That reputation is greatly impacted by how your content is deep linked, and by whether that content is discovered by search engines and deemed worthy.
A website with a faster crawl rate can introduce content more frequently and thus reinvigorate languishing pages, nursing them back to health in less time. Websites with authority often get spidered and indexed within a few hours. More pages in the index (if optimized) means more opportunities to attract more visitors.
Granted, SEO is not always complicated; it is the combination of simple underlying factors contributing to the overall process that should carry equal importance for any webmaster or aspiring business.
Post frequency (the rate at which you add or change content) prompts search engines to revisit your pages. Re-optimization (making tweaks and adjustments to refine on-page factors) is a never-ending process of refining performance.
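If you want to see how often the spiders actually revisit you, one rough way is to count crawler requests in your server access log. Here is a minimal sketch in Python, assuming an Apache-style combined log format; the sample lines, IPs, and paths below are hypothetical, not from any real site:

```python
import re
from collections import Counter

# Match Googlebot's user-agent string anywhere in the log line.
GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)
# Pull the date portion (e.g. 01/Mar/2008) out of the timestamp bracket.
DATE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def crawl_counts(lines):
    """Return a Counter mapping date -> number of Googlebot requests."""
    counts = Counter()
    for line in lines:
        if GOOGLEBOT.search(line):
            match = DATE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

# Hypothetical sample log lines for illustration:
sample = [
    '66.249.66.1 - - [01/Mar/2008:10:00:00 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Mar/2008:11:00:00 +0000] "GET /page2 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/Mar/2008:12:00:00 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(crawl_counts(sample))
```

Run against a real log (one line per request), a per-day tally like this gives you a crude but honest picture of your crawl frequency over time.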
How this deep-link recrawl will affect existing rankings in Google is unknown. What is known is that deep linking, even though those pages may take longer to get discovered, is one of the most valuable strategies you can implement to bolster your website's credentials in the eyes of search engine spiders.
Googlebot's mission is simple: to leave no page or link behind, at least for an initial inspection. You never know where or how your pages will be discovered, and you can never truly control how search engines evaluate your content.
The conclusion: don't leave anything out there you wouldn't want to share. Keep a tidy server (redirect pages that are dead weight) and continually refine your theme to overcome any off-page reputation issues that may hinder your exposure.
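For the "tidy server" housekeeping, dead-weight pages can be pointed at their closest living equivalents with a permanent (301) redirect, so both visitors and spiders land on live content. A sketch for an Apache .htaccess file; the paths shown are made-up examples, not real URLs:

```apache
# Hypothetical examples: permanently redirect retired pages
# to their closest current equivalents.
Redirect 301 /old-article.html /new-article.html
Redirect 301 /archive/2006-promo.html /current-offers.html

# Or with mod_rewrite, collapse an entire retired directory into one page:
RewriteEngine On
RewriteRule ^old-section/ /new-section/ [R=301,L]
```

A 301 tells search engines the move is permanent, which encourages them to transfer any reputation the old URL earned to the new one, rather than leaving it to wither on a dead page.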