Are Quality Raters from Google the End of SEO as We Know It?

The idea that a manual review of a website or its SEO by a subjective human quality rater (working for a search engine) can decide which websites slide and which get penalized is a daunting threat for those forced to compete for search engine rankings.
While search engines use sophisticated algorithms to find the most relevant websites, they are still far from perfect and often rely on data harvested or augmented from humans (search quality juries) to refine their extensive ranking variables.
In 2009 Google employed over 10,000 outsourced contractors as "quality raters." Using that as a baseline, one can only imagine what the actual number is now, and what tasks, questions, or potential variables are involved in determining which websites "make the grade" and which do not.
As we saw with the recent Panda update (the nickname for Google's latest algorithm change), authoritative websites with excellent editorial quality sometimes get lumped in with less authoritative, lower-quality websites after a tweak to the algorithm, and Google or other search engines then suppress or penalize those sites.
Then, to unwind those penalties, search engines send in the quality raters to assess whether or not a website exhibits the characteristics of what that search engine considers editorially viable and/or significant to its users.
While the battery of questions used to assess websites under manual review is exclusive and proprietary, we know a few of them from an interview Matt Cutts gave to Wired magazine, such as:
- Would you feel comfortable giving this website your credit card?
- Would you be comfortable giving medicine prescribed by this site to your kids?
- Do you consider this site to be authoritative?
- How would you assess the design aesthetic of this website?
To think that the answers to these questions can either rank or tank your business (in addition to data pulled from Google Chrome's blocked-domains list) adds yet another layer to the SEO process, at least where Google is concerned.
If the website passes the editorial review, a potential penalty can be lifted; if it does not, the site can be flagged for suppression.
On the flip side, if quality raters are sent to sweep through and objectively grade websites being considered for penalties, what if a rater is simply in a bad mood (from any number of stress triggers)? Or what happens if a "search juror" has a vested interest in, or personal prejudice against, the site or type of website being graded?
Should someone who is either (a) oblivious to SEO or (b) doing their best to rank in a competitive market have all of their progress erased and be sent "to the back of the line" by a manual review penalty? While this may be just a few questions and answers to a quality rater, on the other side of that site are vested hours, employees with families to feed, bills to pay to keep the business going, and often tons of debt hanging in the balance.
In short, how many businesses do quality raters inadvertently "take out," as online or brick-and-mortar companies are forced to fold because their one low-cost promotional method (their website) is now nowhere to be found?
Years ago, DMOZ had the most noble intentions of sniffing out quality, but since it used volunteers to perform the rating, who is to say there was not a hidden agenda lurking in their midst?
My question is two-fold: (1) will search engine juries (a.k.a. quality raters) fall prey to the same potential corruption or power trips as DMOZ editors, and (2) what can you do to protect your website from being checked off the list?
The natural Google-friendly answer: (1) never build links, focus on quality content, and have your "#@%#$" handed to you by never ranking while others blatantly bend the rules, and (2) "if you build quality content, people will link to you"…which sounds like a paraphrase of the first.
As noble as those intentions are, markets are fiercely competitive; web stores and the like have to compete with dynamically generated behemoths like Amazon, and the fact that people rarely link to a shopping page puts e-commerce at a distinct disadvantage.
While creating great content is a pivotal part of achieving rankings (and getting past algorithmic or manual review), SEO is far more complex, and online promotion is becoming more difficult by the day.
While I fully understand both Google and webmasters, there needs to be some common ground that ensures quality thrives and worthy sites are rewarded.
In this light, I hope that quality raters can bring a "more human side" to ranking algorithms and look past a few promotional or "required" competitive tactics (such as building links) that sites use to push past the noise and reach page one.
Follow the links below for more information on the topic:
- http://digitaldaily.allthingsd.com/20090603/google-and-the-evolution-of-search-scott-huffman/
- http://digitaldaily.allthingsd.com/20090605/google-and-the-evolution-of-search-iii-whats-next-in-search-much-much-better-search/
- http://www.seochat.com/c/a/Google-Optimization-Help/Googles-Quality-Rater-Guidelines-Leaked/
It seems like there is a real battle going on: Google versus every other business. It's a bit like how criminals often stay one step ahead of the police; in the same way, spammy sites stay one step ahead of quality businesses.
I would rather say that this battle is between Google and SEO specialists. These poor folks have to change their tactics all the time only because Google keeps changing. Some new modifications totally contradict the previous ones and make your earlier SEO policy useless or even fatal.
You are right, Jeff. Content is king, but SEO specialists will find other ways around it. Maybe we have to change our priorities and submit our sites to other search engines, especially local ones. Yes, it's a hard, very hard road, but we must start down it.