How to Avoid SEO Over Optimization

Reading a few SEO blogs, hearing from a friend of a friend of a friend that changing meta tags is all it takes, or even knowing how to write great content doesn’t make you great at SEO.

On the contrary, using too many overlapping search engine optimization elements on a page can send the wrong signal to ever-watchful algorithms and trigger an over-optimization penalty.

How can you tell if you have been hit with an over-optimization penalty?

  • De-indexation (a loss of pages in the search engine’s index).
  • Rankings plummet (either for specific keywords or groups of keywords).
  • Your web traffic dries up to a mere trickle of its former volume.

SEO requires a delicate balance between what is considered relevant, natural, and popular from the perspective of off-page SEO (link velocity and peer review) and impeccable continuity in on-page considerations such as how you employ internal links, prevent duplicate content, and structure navigation, page templates, and site architecture.

Get any of these metrics wrong by overdoing them and you’re likely to incur a penalty. Before deciding which metrics to focus on and in what order, you should understand the parameters of optimization.

How Far is Too Far?

I have seen everything from people abusing the noscript tag to stuff links, creating one-pixel tables, and hiding content by keeping the text the same color as the background, to stuffing alt text and loading CSS values with keyword-rich anchor text to get a boost in search engines.

The point is, things taken past the point of moderation are a red flag waiting to happen. We understand that competitive markets exist and not everybody plays fair (such as database driven dynamic insertion based on keyword clusters, RSS arbitrage, cloaking and other gray areas), but the last thing you should do is follow suit in an attempt to inflate relevance.

There are enough challenges to contend with online; the last thing you want to do is contribute to your own demise. Here are some useful tips to avoid over-optimization penalties and earn your rankings among the other, more authoritative websites in your market or niche.

Relevant and Succinct Titles:

Keep titles focused on what users can expect to find on that page. Rankings are assigned page by page, and a title stuffed with keywords from other segments of the site can get that page flagged for stuffing.

For a perfect example of relevant titles, URLs (uniform resource locators, i.e., the web page addresses), h1 tags (which state what the page is about), and internal links, just refer to any page on Wikipedia.
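A focused title can also be screened programmatically. The sketch below is only an illustrative heuristic; the 65-character limit and the one-repeat cap are assumptions for the example, not rules published by any search engine.

```python
# Illustrative heuristic only: flag titles that are too long, repeat a
# keyword, or omit the page's keyword entirely. Thresholds are assumptions.
def check_title(title: str, keyword: str,
                max_length: int = 65, max_repeats: int = 1) -> list[str]:
    """Return a list of warnings for a page title."""
    warnings = []
    if len(title) > max_length:
        warnings.append(f"title exceeds {max_length} characters")
    occurrences = title.lower().count(keyword.lower())
    if occurrences > max_repeats:
        warnings.append(f"keyword '{keyword}' appears {occurrences} times")
    if occurrences == 0:
        warnings.append(f"keyword '{keyword}' missing from title")
    return warnings

# A focused title passes; a stuffed one draws a warning.
print(check_title("Black Leather Shoes, Loafers and Slip-Ons", "black leather shoes"))  # []
print(check_title("Shoes Shoes Cheap Shoes Buy Shoes Online Best Shoes", "shoes"))
```

A check like this obviously cannot measure relevance, but it catches the mechanical symptoms of a stuffed title before a crawler does.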

Linking The Same Anchor Text to Multiple Sources:

If the same keyword anchor text (the text in the link) occurs multiple times on a page, try to make sure you do not link every occurrence to the same place; linking once per keyword is enough. The only exception to this rule might be the main site navigation: you might link to the homepage with the word “home” in the navigation, then link to the homepage again from the body text with a keyword-rich anchor.

Search engines are smart enough to distinguish navigation blocks within your website template from the body area, and contextual links from links in the sidebar or footer of your website; but there is no need to intentionally or unintentionally create links to different locations with the same anchor text. The rule is one link destination per anchor per page (not five links on the page with the same anchor pointing to five different pages).
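The “one destination per anchor per page” rule is easy to audit once a page’s links are extracted. This is a hypothetical helper, not part of any SEO tool named in the post; it assumes you have already scraped (anchor text, target URL) pairs from a page.

```python
# Hypothetical audit helper: find anchor texts on a page that point to more
# than one distinct URL, which violates "one link destination per anchor".
from collections import defaultdict

def conflicting_anchors(links: list[tuple[str, str]]) -> dict[str, set[str]]:
    """links: (anchor_text, target_url) pairs scraped from one page.
    Returns the anchors that point at multiple distinct targets."""
    targets = defaultdict(set)
    for anchor, url in links:
        targets[anchor.strip().lower()].add(url)
    return {a: urls for a, urls in targets.items() if len(urls) > 1}

page_links = [
    ("home", "/"),
    ("leather shoes", "/leather-shoes"),
    ("leather shoes", "/sale"),  # same anchor, different target: flagged
]
print(conflicting_anchors(page_links))
```

Running this across a site template quickly surfaces the accidental cases the paragraph above warns about.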

Minimize Outbound Links where Applicable:

The more links you have leaving each page, the more thinly that page’s link flow is divided among them. Unless your pages are replete with excess link flow, try to consolidate with a top-down approach or use relevant lateral linking (from relevant page to relevant page) to reinforce the topicality of your theme (page focus).

In layman’s terms, if you have a bulky navigation with 150 links leaving each page, none of those pages wins, since each is hemorrhaging potential ranking factor by dividing its link flow, and the recurring shingles (groups of words) also skew the page’s focus.

Less is more when it comes to navigation and structuring links in and links out on a page-by-page basis. Although traffic reaches your website through multiple keywords, use the minimal amount of navigation (enough to get visitors back to a category page or hub page), but not so much that you eliminate the possibility of that page providing ranking factor for itself or other pages.
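Under the simplifying assumption that a page divides its passable link flow evenly among its outbound links (a rough PageRank-style model, not the formula any engine actually uses), the dilution from a heavy navigation is easy to quantify:

```python
# Rough sketch of link-flow dilution. The 0.85 damping factor and the even
# split among links are textbook PageRank assumptions, not engine internals.
def flow_per_link(page_flow: float, outbound_links: int,
                  damping: float = 0.85) -> float:
    """Share of link flow each outbound link receives from this page."""
    if outbound_links == 0:
        return 0.0
    return page_flow * damping / outbound_links

heavy_nav = flow_per_link(1.0, 150)  # 150-link navigation on every page
lean_nav = flow_per_link(1.0, 15)    # trimmed to 15 essential links
print(f"{heavy_nav:.5f} vs {lean_nav:.5f}")  # each link carries 10x more flow
```

The arithmetic is the whole point: trimming 150 links to 15 gives every remaining link ten times the flow, which is why the post argues for minimal navigation.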

Link Metrics, Varying Anchor Text:

Link clusters leave a trail, and aside from IP diversity, the most prominent mistake many SEO wannabes make is overdoing the inbound links to a page with the same anchor text. Given that link velocity and link discovery time all play a part in uncovering automation or link inflation tactics, the last thing you want to do is overdo it and build too many links too fast with the same keywords.

Once you’re in the penalty box, you have to figure out which type of filter you tripped before you can get back out of it. What you lose in the long run is not worth the short-term gain you may get from acquiring links from bad neighborhoods, stuffing keywords, or using scores of me-too pages with nearly indistinguishable content to inflate search engine ranking factors.

The best SEO strategy is to understand the appropriate balance of site structure, relevant navigation and server-side includes, the proper use of alt attributes and images, and best practices for internal links, and not to abuse off-page ranking factor by excessively or disproportionately acquiring a small series of inbound links based on a redundant keyword or similar keyword cluster.
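One way to keep anchor-text variety honest is to measure how concentrated a page’s inbound anchors are. This is only an illustrative check; the 50% threshold is an arbitrary assumption for the example, not a documented penalty line.

```python
# Illustrative check: how concentrated is a page's inbound anchor text?
# A very high share for one anchor is the "too many links, same keywords"
# pattern described above. The 0.5 threshold is an assumption.
from collections import Counter

def anchor_concentration(anchors: list[str]) -> float:
    """Fraction of inbound links using the single most common anchor text."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

inbound = (["black leather shoes"] * 90
           + ["example.com", "this shoe guide", "shoes", "click here"] * 2
           + ["leather footwear", "nice site"])
ratio = anchor_concentration(inbound)
if ratio > 0.5:
    print(f"warning: {ratio:.0%} of inbound links share one anchor")
```

Natural link profiles mix branded, bare-URL, and descriptive anchors, so a distribution dominated by one keyword phrase is exactly the trail the paragraph above says link clusters leave.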

18 Comments

  1. seo company india
    Posted February 25, 2010 at 12:51 am | Permalink

    Sometimes, due to over-optimization, Google flags a website as spam and its PageRank decreases, so one should stay balanced while doing SEO. Good post.

  2. Justin
    Posted February 25, 2010 at 8:30 am | Permalink

    Thanks for the post Jeffrey. How is it possible to tell if you’ve been hit with an over-optimization penalty? I have a site around 14 months old with some pretty good rankings, but other pages that were getting towards ranking for relatively competitive terms seemed to go from around #15 to #300 in the SERPS. This happened for a group of competitive terms I’m trying to rank for.

    Might it be an over-optimization penalty, or could it be that they appeared higher in the SERPS for a few days for another reason? I’m not too sure whether to carry on building links to those pages or leave them for a while.

    Any thoughts would be great.

    Once again, thanks.

    Justin

  3. Jeffrey Smith
    Posted February 25, 2010 at 8:38 am | Permalink

    Hi Justin:

    With so many variables to assess it would require some analysis.

    1. are the same pages in the SERPs still showing for the same ranking? or is it another page (with possibly less relevance now selected)?

    2. Have you lost some pages due to too many pages being similar?

    3. What are your naming conventions and titles like? Unique, somewhat unique?

    4. Did you make any major changes to links internally?

    5. Did you push the link velocity too far too fast? or use all the same anchors from a questionable block of I.Ps or get linked from a bad neighborhood, etc?

    You have to dig into metrics such as these to determine what type of penalty it is before you can unwind it…

    I suspect you may not have enough unique content on the individual pages to “stand out” enough, but I would have to know more to determine what occurred.

  4. Justin
    Posted February 25, 2010 at 10:37 am | Permalink

    Thanks for the reply Jeffrey. I completely understand it must be difficult to determine what exactly happens in situations like this.

    The sudden rise in rankings seemed to coincide with a site redesign which involved altering site navigation. I had 3 main categories (eg, shoes, jumpers, hats). All sub-categories under these 3 categories would appear in the sidebar on every page. To try and make the site better organized I changed it so that sub-categories would only appear under their parent category (more of a silo appearance).

    After this the rankings shot up for a few days, then went back down, which may have just been a coincidence as I remember it happening maybe a day or two after the redesign (barely time for Google to re-index and notice the changes I’d guess).

    My naming conventions always follow the same pattern, eg:
    Keyword – Black Leather Shoes
    URL – /black-leather-shoes
    Page Title – Black Leather Shoes, Loafers and Slip-Ons

    I always make sure the page titles are unique from page to page, and each page has at least 400 words of unique content. In regards to your first point, I have noticed this on a number of occasions, when a page with less links to it and not optimized for a certain keyword, replaces the page I was trying to rank for in the SERPS (but far lower rankings)!

    I think I’m going to spend time on this site and get the design, navigation and internal linking properly sorted out as I think the site has a lot of potential. I might be in touch soon for a quote to see if it’s something your company might be able to help with.

    Once again, thanks for your help and time Jeffrey, it’s very much appreciated.

    Justin

  5. SlimJim
    Posted February 25, 2010 at 12:31 pm | Permalink

    Jeffrey,

    “Minimize Outbound Links where Applicable”

    I have a ? about having too many links on a page.

    I have on my blog/CMS site a navigation system that lists all the categories, & a few other links (67 total links on almost every page, standard), all set up exactly the same in the header. I really need those links/navigation so that the end user is only 2 clicks from any webpage.

    You really have me thinking about this, & I think I should change the links/navigation before I move along with my blog.

    I’ve done some research on my #1 competition, & what I found is that when you visit their site they have links/navigation set up almost like I already have. Then I got to looking at their webpage source code & found they are hiding all those navigation links inside JavaScript.

    Alexa says my competition ranks in the top 1000 sites on the web, so hiding the links inside JavaScript isn’t hurting them at all; they are #1 in the SERPs for the single best keyword in this niche.

    Their site’s home page in the Google cache (text-only version) isn’t showing any of those 60+ links, so since they are not showing in the cached text-only page, I take it Google can’t see all those links that would otherwise leak link juice throughout the entire website.

    My ? is, what do you think about hiding links inside javascript?

    I don’t want to sound like a hack, trying to hide all those links, because really they are needed & every link is all internal, so I’m not spamming myself here… ;)

    I’ve also checked the #2 & #3 sites in the SERPs, they both have their navigation setup like I do, & all those links are also showing in the Google Cache Text-Only version, just like mine.

    My thoughts are that the #1 website in the SERPs is on to something, & maybe his homepage is ranking so much higher because of the lack of so many links/navigation (as far as Google can tell) leaving his homepage…

    Thanks, for this blog post, it really has got me thinking! :)

  6. Jeffrey Smith
    Posted February 25, 2010 at 12:52 pm | Permalink

    To Justin and Slim Jim:

    I love practical questions like these; it makes writing these posts worthwhile when they provoke thoughts such as these.

    Slim Jim…

    I know Google crawls but does not always execute JavaScript, so aside from the fact that those keywords appear as content (which could skew on-page focus), they are not leaking link juice like a heavy navigation schema would.

    How is the #1 site compensating for the lack of connectivity between pages: (a) deep links from off-page ranking factor, (b) contextual links, or (c) are they ranking on authority and global term weights?

    I would look into the age of the site and the # of pages indexed, and see if you can unravel their keyword thresholds on a per-keyword basis using site:domain.com keyword (to see what they are working with to cross the tipping point)…

    Also, you may wish to reconsider flattening your site architecture a bit and consolidating 2 metrics (such as sub-folder and page/product) into one metric closer to the root.

    Try this with one category first, which will accomplish 2 things: (1) it will reduce the need for the heavy navigation and the site-wide link loss from the 67-item nav leaking in the sidebar on each page, and (2) it will allow you to condense an unnecessary folder by making the naming convention (more-comprehensive-using-relevant-anchors).

    This way, you need less inbound links from other sources to create buoyancy in the SERPs and the pages are easier to feed from contextual links…

  7. SlimJim
    Posted February 25, 2010 at 2:28 pm | Permalink

    Jeffrey,

    What they did is, the entire navigation is hidden inside javascript (No-Show inside Google Cache Text), except for a single link on every webpage including the homepage that leads to a sitemap/webpage that list all the links to each of the category pages. All the category links are hard coded on the sitemap page (no-javascript like all the other pages).

    So they still have all the navigation for Google to crawl. In reality they have 60+ links + the single link on the homepage leading to the sitemap.

    Might sound complicated, but really it looks very simple, they just have the javascript (hidden links) on almost every page (except the sitemap page).

    The #1 SERP site was created in 2000… with 72,000 pages in Google’s SERPs.

    I realize this is a David & Goliath situation, LOL. Since I have less than 200 pages indexed, ha :)

    ************************************
    My category permalink is setup like this:
    http://MyDomain.com/category/category_name

    My single post page is setup like this:
    http://MyDomain.com/page_title

    I also might add that the #1 site has old URL structure for each category page, something like:
    http://www.MyDomain.com/theme.php?cat=001

    Thanks, I always enjoy your feedback!

  8. Micheal Lee
    Posted February 25, 2010 at 8:41 pm | Permalink

    Thanks for the supportive comments on this article/blog. I’d like to know, though, from which authority, book, or even experience this article came. I’m really curious about the specifics of over-optimizing SEO. I wish that search engines would post official specifics. =)

  9. web design and development
    Posted February 26, 2010 at 12:56 am | Permalink

    In some cases over-optimization is also counted as spamming, and maybe that is one of the unexpected reasons why some sites with high rankings disappear.

  10. link building
    Posted February 26, 2010 at 1:29 am | Permalink

    Thanks for the nice guide. I think if we build lots of links within a short period of time, it also looks like over-optimization. So we should try to keep link building balanced.

  11. SlimJim
    Posted February 26, 2010 at 5:09 pm | Permalink

    Well, I took the plunge…

    I changed 20+ links that show on almost every blog page & moved them inside a JavaScript file.

    So I’ll wait & see how Google responds, then move on from there.

    I tested a few pages in a 3rd party spider sim. tool, & none of the new (javascript) url’s are showing.

    I’ll do another blog post to get Google to crawl…

    I still need to add a full sitemap.

  12. SGD Networks
    Posted February 27, 2010 at 3:49 am | Permalink

    Hi

    There is another case: I found some sites got penalized by Google for putting lots of links on social networking sites. So be careful when working with social networking sites.

    Best Regards
    Venkat

  13. Jade Slair
    Posted February 28, 2010 at 7:23 pm | Permalink

    I try to avoid any obvious funny business and do vary my link titles a little, but they are still similar.

  14. Custom Creative Design
    Posted April 5, 2010 at 4:31 am | Permalink

    Over-optimization isn’t optimization at all; it’s search engine spam. No legitimate SEO technique will ever be considered search engine spam, because real SEO enhances a site overall. If your pages get booted after a big algo change, revisit the techniques you used and reconsider the SEO efforts you exerted.

  15. Jeffrey Smith
    Posted April 8, 2010 at 10:04 am | Permalink

    The point of going too far in an attempt to optimize is the context. For example:

    1. changing all css values to keyword-rich anchors
    2. using all H tags 1-6
    3. Adding a link from every keyword occurrence to a target url vs. just once per keyword occurrence
    4. Adding footer links from each page (with similar shingles)
    5. Using no-follow tags excessively

    etc.

    Proper optimization is effective, but if you go overboard where a common user is perplexed trying to navigate around all of the “enhancements”, then you have pushed too far… Search engines understand context, synonyms, internal links, co-citation and hundreds of other agreed upon / normalized patterns. If you are outside of those parameters, then you take a chance of sending a foreign signal, which could result in demotion if the areas are gray.

  16. Randy
    Posted January 11, 2011 at 2:42 am | Permalink

    I experienced the same issues as Justin.

    I have an 11-year old website and started link building for competitive keywords 6 months ago. During the 4th month, my rankings were nowhere to be found on Google!

    Is it also due to over optimization for targeted keywords?

    The rankings have never come back up to now (2 months later), while the other rankings were replaced by other pages that we did not optimize, but in lower positions.

  17. Arnaud
    Posted October 9, 2011 at 4:11 pm | Permalink

    Is this considered OVER-OPTIMIZATION, i.e., spam?

    A “celebrity” navigation drop-down on all the pages of each of 600 independent celebrity web sites. Each site has thousands of news pages, i.e., drop-down “links” from thousands of pages to each of the 600 web sites.

    Would Google consider these drop-down “links” to be “too many” links, even though the drop-downs are meant to ease navigation from one celebrity site to the others, i.e., not spam at all?

    Thanks for any feedback, guidance.

  18. Jeffrey Smith
    Posted October 15, 2011 at 6:22 am | Permalink

    Try using JavaScript to obfuscate those drop-downs so they are not crawled and do not appear as on-page content or internal links. Excessive internal links bleed vital ranking factor, so it depends on whether you expect a user to find your webpage through navigational (on-site) means or through search engines.

    If you want those pages to rank, you will need to rethink your strategy for preserving that vital link juice.

    Hope this helped.

7 Trackbacks

  1. [...] I have recently compiled a few posts on the topic of over-optimization and de-indexation and ways to stave this type of penalty off or defer it entirely. This is a matter of duplicate [...]

  2. [...] website could be suffering from over-optimization or from linking out to a bad neighborhood (untrusted [...]

  3. [...] By the same logic, sites with mammoth amounts of content can be scaled back through creative selection of which landing pages serve as the apex and which pages are designated for the SERPs (search engine result pages) and which pages are more apt for human consumption. SEO is a balancing act, but you have to plan accordingly and understand the pitfalls of flawed performance (if you let the pressure off too soon or exceed the necessary force and risk over optimization). [...]

  4. [...] minimal variation). Failure to add variety to inbound or internal anchor text (links) can lead to over optimization as a result of link velocity and how people link from the [...]

  5. [...] is a metric you simply can’t afford to overlook, and about that “off page SEO thing”, yeah, over optimization is real, but just keep in mind, now, it’s not just about links, it’s about [...]

  6. [...] As a result, we previously implemented a site cap feature for Deeplink juggernaut module which allowed you to prescribe a link cap for internal links to prevent potential over optimization penalties. [...]

  7. By How to Get Dropped Rankings Back! | Clixto7 on March 28, 2013 at 11:23 pm

    [...] 9)      Diversify Internal Link Anchor Text: While you think that linking the same keyword from every page in your website to one target page might help, if done in excess, you can toggle an internal over-optimization penalty. [...]
