When making viable shifts to your website or site architecture, here are a few things to take into consideration that can affect existing SEO. Like Murphy's Law, anything that can go wrong will, particularly when migrating a website, making changes to a template or giving your site a new subfolder or segment.
One of the primary ways such changes can harm your website's rankings is through broken links. Essentially, broken links can leech vitality from your most cherished landing pages (by undermining their internal support system) and erode the trust your website has established with search engines.
Things like plugins conflicting or getting reset, 301 redirects or .htaccess files getting corrupted or overwritten accidentally, or a typo in a robots.txt file could leave your website with a plethora of errors, or deindexed or penalized as a result.
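To see how little it takes, here is a minimal sketch (Python standard library only, with a hypothetical example.com URL) showing how one character in robots.txt changes what crawlers are allowed to fetch:

```python
# Demonstrates how a one-character robots.txt typo can block an entire site.
from urllib.robotparser import RobotFileParser

def can_crawl(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# Intended rule: block only the /admin/ folder.
safe = "User-agent: *\nDisallow: /admin/\n"
# One slip of the keyboard: the path is lost, blocking the whole site.
typo = "User-agent: *\nDisallow: /\n"

print(can_crawl(safe, "http://example.com/products/widget"))  # True
print(can_crawl(typo, "http://example.com/products/widget"))  # False
```

The difference between those two files is a few characters, but the second one tells every compliant crawler to stay out of the entire site.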
Things like borrowing legacy code (for new segments of a site) without making links absolute, a broken parameter in a naming convention or some seemingly harmless change can set off a chain reaction. If you change the infrastructure of a page, the template, the CSS, etc., you take a fair chance that search engines will set you back until they can ascertain whether the weighting mechanisms they use approve or disapprove of that change.
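The relative-versus-absolute link problem can be sketched in a few lines of Python (standard library only; the example.com paths are hypothetical):

```python
# Why relative links copied from a legacy template break when the
# template is reused in a new subfolder.
from urllib.parse import urljoin

relative_href = "images/logo.png"  # as copied from the legacy template

# Served from the old location, the link resolves where it always did:
print(urljoin("http://example.com/index.html", relative_href))
# http://example.com/images/logo.png

# Served from a new subfolder, the same markup points somewhere else:
print(urljoin("http://example.com/blog/index.html", relative_href))
# http://example.com/blog/images/logo.png  (likely a 404)

# A root-relative (absolute-path) href resolves the same from any page:
absolute_href = "/images/logo.png"
print(urljoin("http://example.com/blog/index.html", absolute_href))
# http://example.com/images/logo.png
```

The markup never changed; only its location did, which is exactly the kind of seemingly harmless shift that starts a chain reaction of broken links.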
Sure, there may be a 50% chance that nothing will happen and rankings stay buoyant, but what happens when the other half of that probability comes up and something breaks, then what?
First SEO Tool – Google Webmaster Tools:
If you don't have Google Webmaster Tools, now is the time to get it. Once you log in, the dashboard has a section that tells you whether the crawlers experienced any errors.
In addition to seeing if any redirects are not resolving, you can download a link listing in Excel for all of the site errors (broken links) present on your website. From there, you can either implement redirects to the real pages or fix the HTML code to reflect the proper naming convention / page and eliminate the error.
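As a hedged sketch of that workflow, the snippet below (Python standard library only) turns a mapping of broken URLs to their real pages into Apache mod_alias 301 rules for an .htaccess file. The URLs and the mapping are hypothetical stand-ins for your own error export:

```python
# Build "Redirect 301" lines from a {broken_url: real_path} mapping,
# such as one assembled from a downloaded crawl-error listing.
from urllib.parse import urlparse

def redirect_rules(mapping):
    """Build Apache mod_alias Redirect lines from {broken_url: real_path}."""
    rules = []
    for broken, target in mapping.items():
        path = urlparse(broken).path  # Redirect matches on the URL path
        rules.append(f"Redirect 301 {path} {target}")
    return rules

broken_to_real = {
    "http://example.com/old-widgets.html": "/widgets/",
    "http://example.com/abuot-us.html": "/about-us/",  # typo in old links
}

for rule in redirect_rules(broken_to_real):
    print(rule)
# Redirect 301 /old-widgets.html /widgets/
# Redirect 301 /abuot-us.html /about-us/
```

Review the generated lines before pasting them into .htaccess; a bad pattern there is one of the very failure modes this article warns about.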
You can also view content analysis to determine which pages are being treated as duplicates, and consider using a canonical tag to correct link flow, or changing the metadata or duplicate shingles on those pages.
Second SEO Tool: Xenu’s Link Sleuth
Xenu's Link Sleuth is a nifty stand-alone tool that quickly assesses the link structure of your website. Just download a copy, unzip it, add the URL and stand back as it spiders your website and looks for broken links. This is particularly useful if you have implemented a content management system and have a higher percentage of link locations which could go awry.
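At its core, a link checker of this kind does something like the following sketch (Python standard library only; the sample markup and example.com URL are hypothetical): collect every anchor on a page and resolve it against the page URL, so each link can then be fetched and its status checked.

```python
# A bare-bones version of the first stage of a link checker:
# extract and resolve every <a href> on a page.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

sample = '<a href="/shop/">Shop</a> <a href="contact.html">Contact</a>'
collector = LinkCollector("http://example.com/about/")
collector.feed(sample)
print(collector.links)
# ['http://example.com/shop/', 'http://example.com/about/contact.html']
```

A full checker would then request each collected URL (e.g. with urllib.request) and flag any 404 or 500 responses as broken links, which is the report Xenu hands you out of the box.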
On a side note, it is common for a content management system (CMS) or shopping cart configuration to allow you to use a prefix or suffix in tandem with dynamically extracted data (such as product numbers, tags, etc.) that is similar across many pages.
If each page has enough distinction in the body area (at least 70% difference from the normal cookie-cutter template), then leave it indexed and active as a page that can be harnessed for SEO. Otherwise, consider using a noindex, follow tag in the metadata to pass link weight to more crucial areas of the site without cannibalizing your other, more pertinent pages.
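As a quick reference, the tags described above look like this in a page's head section (the URL is a hypothetical placeholder; use your own preferred page):

```html
<!-- Thin or templated page: keep it out of the index but let link
     weight flow through its links. -->
<meta name="robots" content="noindex, follow">

<!-- Near-duplicate of a preferred page: point engines at the original. -->
<link rel="canonical" href="http://example.com/preferred-page/">
```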
Say, for example, the only real distinction is a manufacturer's description of less than 400 words (which will suffer from duplicate content penalties unless it is unique), or a tag page with a snippet; both are great for ballooning your website's topical relevance, but without enough link flow they can work in reverse.
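One rough way to put a number on "enough distinction" is to compare pages by word shingles, the same idea behind the duplicate shingles mentioned earlier. The sketch below (Python standard library only, with hypothetical sample copy) scores overlap with Jaccard similarity, where the 70%-distinct rule of thumb would translate to a similarity below 0.30:

```python
# Compare two pages' body copy by 3-word shingles and Jaccard similarity.

def shingles(text, size=3):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b, size=3):
    sa, sb = shingles(a, size), shingles(b, size)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

template = "buy widgets online at the best price free shipping on all orders"
page_a = "buy widgets online at the best price free shipping on all orders"
page_b = "our heavy duty steel widget survives a decade of daily outdoor use"

print(similarity(template, page_a))  # 1.0 -> cookie-cutter duplicate
print(similarity(template, page_b))  # 0.0 -> distinct enough to index
```

Search engines' actual duplicate detection is far more involved, but a crude score like this is enough to triage which templated pages deserve a noindex, follow tag.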
We will be introducing a suite of plugins for WordPress to rival the All in One SEO Pack. There will be over 40 SEO modules for everything from structuring internal links, finding broken links and adjusting all aspects of metadata, to a dashboard for structuring canonical tags, distinct ways to implement tag functions, and more.
Stay tuned for the upcoming release, and in the meantime, we hope you enjoyed the SEO tips and tactics from the SEO Design Solutions Blog.