
Often you hear about the need to prevent duplicate content within your own website as it applies to SEO, but why? This is our take on why creating distinct nodes on each page is imperative if your content is to produce not only context but rankings, rather than trip search engine filters.


Block segment analysis determines how much ranking weight a given portion of a web page is assigned. For example, text in the header and footer of a document is treated differently than, say, text in the body section of a page.

The header is given more weight than the footer, and typically the body text is given more value than both (since that is where the page has the ability to distinguish itself from borrowed elements such as navigation).
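To make the idea concrete, here is a minimal Python sketch of block-based weighting: body copy is scored higher than header, navigation or footer text. The tag names and weights are illustrative assumptions, not values any search engine publishes.

```python
# Illustrative only: weight terms by the block they appear in, with body copy
# counting more than header, nav or footer text. Weights are assumptions.
from bs4 import BeautifulSoup

BLOCK_WEIGHTS = {"main": 1.0, "header": 0.5, "nav": 0.3, "footer": 0.2}

def weighted_terms(html: str) -> dict:
    """Return a term -> score map, scaled by the block each term sits in."""
    soup = BeautifulSoup(html, "html.parser")
    scores = {}
    for tag, weight in BLOCK_WEIGHTS.items():
        for block in soup.find_all(tag):
            for term in block.get_text(" ", strip=True).lower().split():
                scores[term] = scores.get(term, 0.0) + weight
    return scores

html = """<html><body>
<header>Acme Widgets</header>
<main>Blue ceramic widgets, hand glazed in small batches.</main>
<footer>Acme Widgets - all rights reserved</footer>
</body></html>"""
print(weighted_terms(html))  # body terms outrank header/footer boilerplate
```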

The reason for this is simple: page composition rewards proximity and distinction, so the more unique one document is from another, the better. Search engines can easily discern duplication (through shingle analysis, singular value decomposition, etc.) in areas such as navigation, sidebars/blogrolls carrying the same links, and footers that repeat the same text. This tends to diffuse a page's ranking potential.
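As a rough illustration of how shingle analysis flags duplication, the sketch below compares pages as sets of word n-grams (shingles) and scores their overlap with Jaccard similarity. This is a simplification of what engines actually do; the window size and sample text are arbitrary.

```python
# Simplified shingle analysis: pages that share most of their shingles read as
# duplicated blocks; distinct body copy produces low overlap.
def shingles(text: str, w: int = 4) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a or b) else 1.0

footer_a = "about us contact privacy policy sitemap copyright acme widgets"
footer_b = "about us contact privacy policy sitemap copyright acme widgets"
body_c = "blue ceramic widgets hand glazed in small batches and shipped worldwide"

print(jaccard(shingles(footer_a), shingles(footer_b)))  # 1.0 -> duplicate block
print(jaccard(shingles(footer_a), shingles(body_c)))    # ~0.0 -> unique content
```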

In order to leverage a page for SEO, it must (1) get indexed and (2) stay indexed, so that it can pass its full ranking factor and act as a hub, bridging ranking factor through internal links built to and from that page. If search engines determine that your pages are redundant or ubiquitous, the dampener kicks in algorithmically and the ranking factor can be suppressed via de-indexing or filters (like the supplemental index), since document after document leaves the same repetitive hash mark.

This aspect of the ranking algorithm is based on the c-index, whereby term weight and similarity are used to assign relevance, with singular value decomposition providing a base vector across web pages. If enough of the same singular values are identified across the global collection of documents, then you have co-occurrence (which may, in this instance, work against your ranking objectives).
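The sketch below illustrates the general idea of term weighting plus singular value decomposition: pages are projected into a low-rank space, and near-duplicate pages collapse onto nearly the same vector. It uses scikit-learn for brevity and is a conceptual illustration, not the actual ranking math any engine uses.

```python
# Conceptual illustration: TF-IDF term weighting + truncated SVD, then compare
# pages by cosine similarity. Near-duplicate pages end up almost identical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

pages = [
    "blue ceramic widgets hand glazed in small batches",
    "blue ceramic widgets hand glazed in small batches",   # near-duplicate page
    "seo checklist for optimizing titles headings and internal links",
]

tfidf = TfidfVectorizer().fit_transform(pages)                # term weighting
reduced = TruncatedSVD(n_components=2).fit_transform(tfidf)   # SVD projection
print(cosine_similarity(reduced))   # rows 0 and 1 score ~1.0 (duplicates)
```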

Duplication via a template or through tag pages that lack unique content can go supplemental in a secondary search index if those pages are not nurtured with enough internal link flow. The gist here is that the larger your website grows, the higher the probability of diffusing or diluting your global nodes / top-ranking keywords (a.k.a. keyword cannibalization), much like an over-optimization penalty.
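One hedged way to spot the cannibalization described above is to check how many page titles target the same keyword; the sketch below does exactly that with invented URLs and keywords.

```python
# Hypothetical data: flag keywords that more than one page title is targeting.
from collections import defaultdict

page_titles = {
    "/widgets/": "Blue Ceramic Widgets | Acme",
    "/widgets/blue/": "Blue Ceramic Widgets - Hand Glazed | Acme",
    "/blog/widget-care/": "How to Care for Ceramic Widgets | Acme",
}
tracked_keywords = ["blue ceramic widgets", "ceramic widgets"]

targets = defaultdict(list)
for url, title in page_titles.items():
    for keyword in tracked_keywords:
        if keyword in title.lower():
            targets[keyword].append(url)

for keyword, urls in targets.items():
    if len(urls) > 1:
        print(f"'{keyword}' is targeted by {len(urls)} pages: {urls}")
```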

Just like a base that shifts to accommodate scalability, the foundation (like a triangle) must continue to broaden if the vertical threshold continues to rise. Similarly, the more pages you add about topic A, the more you shift focus away from topic B. The common thread in most websites is the navigation (which, if left to a simplistic ontology, is not sufficient to feed an entire website).

You can stem a website's topic into multiple topics; however, each has a threshold to cross, and moving on many fronts requires more inbound links and internal stability through secondary navigation to sculpt the on-page factors that identify each segment (so it can pass value).

This is why virtual theming (linking from relevant keywords in one document to another document) is so important. It allows the ranking factor to transfer link weight to the champion page through term frequency, which essentially raises the bar for how search engines interpret each respective keyword when considering it for retrieval.
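A minimal sketch of how virtual theming could be audited in practice: find pages that mention a keyword but do not yet link to the champion page for it, so an internal link with that anchor text can be added. The site data, URLs and keyword here are invented.

```python
# Invented site data: suggest internal links from keyword mentions to the
# champion page, the core move behind virtual theming.
site = {
    "/blog/widget-care/": {
        "text": "Caring for blue ceramic widgets starts with gentle cleaning.",
        "links": ["/about/"],
    },
    "/blog/glazing-guide/": {
        "text": "Our glazing guide covers kilns and cones, not widgets.",
        "links": ["/widgets/blue/"],
    },
}

def link_suggestions(keyword: str, champion_url: str) -> list:
    """Pages that mention the keyword but don't yet link to the champion page."""
    return [
        url
        for url, page in site.items()
        if keyword in page["text"].lower() and champion_url not in page["links"]
    ]

print(link_suggestions("blue ceramic widgets", "/widgets/blue/"))
# -> ['/blog/widget-care/']
```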

Deep linking (acquiring inbound links to a specific page rather than just the homepage) is equally beneficial for reducing the tendency of pages to go supplemental (or lose ranking factor as a unique asset). When you spread a site too thin by expanding content through automation, such as a shopping cart that pulls similar values from a database, you must ensure that you can customize the data (titles, H1s, and content) to make each page distinct enough to add additional leverage to the primary, secondary and tertiary keywords that define your website.
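To ground that last point, here is a hedged sketch of making database-driven pages distinct: the title, H1 and opening copy are built from each record's own attributes instead of one shared template string. The field names are assumptions, not any particular cart's schema.

```python
# Hypothetical product records: derive a distinct title, H1 and intro per page
# instead of stamping every page with identical template text.
products = [
    {"name": "Blue Ceramic Widget", "material": "stoneware", "finish": "matte"},
    {"name": "Red Ceramic Widget", "material": "porcelain", "finish": "gloss"},
]

def page_fields(product: dict) -> dict:
    return {
        "title": f"{product['name']} | {product['finish'].title()} {product['material'].title()}",
        "h1": product["name"],
        "intro": (
            f"The {product['name'].lower()} is thrown in {product['material']} "
            f"and finished with a {product['finish']} glaze."
        ),
    }

for product in products:
    print(page_fields(product))
```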


About Jeffrey Smith

In 2006, Jeffrey Smith founded SEO Design Solutions (an SEO provider that now develops SEO software for WordPress).

Jeffrey has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and marketing strategies to increase rankings, revenue and reach.
