Google Doesn't Care if it Ranks Your Site Properly

It's true. Google doesn't care if it ranks your site properly.

Well, if you are Amazon, eBay, USA.gov, MIT.edu, or some other site like that, they do care. But if you aren't, they don't.

OK, maybe Google cares a little, but their focus and priority come from an entirely different direction. This is one of the hardest things for many publishers to grasp.

How often have you heard a publisher complain that Google whacked their rankings even though they run a quality site and have never spammed anyone? Have you ever been the one doing the complaining? I have spent countless hours listening to these complaints and trying to help publishers understand what the real issues are for their sites.

Google is a company that produces a service: a search engine called "Google." They work hard to make that product better.

Google has a rigorous process for measuring the quality of their service, and for measuring improvements in that quality when they make algorithm changes or updates. In December of 2011 Google released a video of a Search Quality Team Meeting. Watch it for more information on their processes.

How the Process Starts

Google collects data on search quality problems on an ongoing basis. They have many sources for this, including spam reports from publishers, articles published about search quality problems, and their own search quality raters.

While I'm not sure how they manage the details, at some point they end up with a collection of problems that they decide they want to address.

Once they pick a potential project, they assemble a known set of problem results, or test cases, along with some ideas on how to fix them. Once a proposed fix is implemented, the test cases are used to see whether the new algorithm indeed addresses the problems. This is only the start of the process, as Google also does rigorous testing to make sure that other results are not made worse at the same time.

Google also does live data testing with a small sample of the overall search population. This testing gives Google real world metrics that show whether search quality was improved as a whole.
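To make the shape of this process concrete, here is a minimal sketch of what such an evaluation loop might look like. This is entirely my own illustration; the function names, metrics, and structure are assumptions, not Google's actual tooling.

```python
# A hypothetical sketch of an algorithm-change evaluation loop.
# Nothing here reflects Google's real systems; it only illustrates
# "fix the test cases without regressing everything else."

def evaluate_change(old_rank, new_rank, test_cases, holdout_queries, rate_quality):
    """Compare a proposed ranking function against the current one."""
    # 1. Did the known problem queries (the test cases) improve?
    fixed = sum(
        rate_quality(new_rank(q)) > rate_quality(old_rank(q))
        for q in test_cases
    )

    # 2. Did overall quality hold up on queries that were NOT targeted?
    old_score = sum(rate_quality(old_rank(q)) for q in holdout_queries)
    new_score = sum(rate_quality(new_rank(q)) for q in holdout_queries)

    return {
        "test_cases_fixed": fixed / len(test_cases),
        "holdout_delta": (new_score - old_score) / len(holdout_queries),
    }

# The change ships only if most test cases are fixed AND the holdout
# delta is non-negative (or within an acceptable error budget).
```

Notice that an individual site never appears in this loop except as an anonymous contributor to the aggregate scores.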

From conversations I have had with various Googlers, I have reason to believe that they also watch to keep errors within an acceptable level. For an extreme example, if a new algorithm improved overall search quality a lot but Amazon fell out of the search results, they would still adjust the algorithm before release. If you think carefully about this general process, you can make the following observations:

- Some or all of the test cases are explicitly addressed.
- Overall search quality went up.
- Any measured errors are within acceptable norms.

Nowhere in that conversation is your site included (unless you are in one of the test cases). This means that the impact on your site is simply a byproduct of the algorithm change. To Google, how your site fared does not matter, because their overall search quality went up.

Your site may be an unintended casualty of the entire process. Your site was (hopefully) not one of the test cases. Your site was not individually examined. All Google knows is that they made their service better.

Algorithms and Signatures

If your site has been hit by an algorithm, or is hit by one in the future, it means that it shares some characteristics with the types of sites Google was targeting. Google targeted a specific set of test-case sites, and overall search quality went up. Therefore, all the impacted sites have something in common with the test-case sites.

I refer to this as having a signature that Google associates with poor quality sites. It doesn't mean you have a poor quality site, just that you have something in common with them.
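To make the idea concrete, here is a toy sketch of a "signature" as a set of site characteristics. The feature names are hypothetical examples I made up for illustration, not Google's actual ranking signals.

```python
# Toy illustration of a "signature": a set of characteristics that an
# algorithm associates with poor quality sites. All feature names are
# invented for this example; they are not real ranking signals.

PENGUIN_LIKE_SIGNATURE = {
    "links_from_article_directories",
    "high_exact_match_anchor_ratio",
    "links_from_known_link_networks",
}

def shares_signature(site_features, signature, threshold=1):
    """A site gets swept up if it overlaps the signature enough,
    regardless of how good its content actually is."""
    return len(site_features & signature) >= threshold

# A genuinely good site can still match:
good_site = {"original_in_depth_articles", "links_from_article_directories"}
print(shares_signature(good_site, PENGUIN_LIKE_SIGNATURE))  # True
```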

Let me illustrate with an example. I know a site that was hit by Penguin 1.0. The site had all original articles, and those articles covered, in depth, topics that few other sites (if any) addressed. The site had never done any spammy link building, or anything of that kind.

However, as the sole exception to the preceding statements, the website had submitted articles to three different article directories. Zero money changed hands, the articles submitted to the directories were actually well-written, and no small animals were hurt in the process.

An innocent webmaster thought that they might get some exposure through article directories, and got whacked. Why? Because submitting to article directories fit a signature of poor quality sites that Penguin 1.0 identified and targeted.

The site has addressed those problems and has made a 100 percent recovery. The only link cleanup that was done was specific to the article directories. The publisher now knows to avoid that behavior, and the site is doing great.

Check Your Signatures

If you do get negatively impacted by an algo update, try to figure out what your site has in common with other sites that were targeted. You can find data from sites such as Searchmetrics on winners and losers after algo updates. Even though you may have a great site and the identified losers may be crappy sites, you have something in common with them.

This may be a very difficult exercise, and it's a bummer to have to do this work. But finding the bad signature and fixing it is the key to your recovery.
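If it helps to approach the exercise systematically, here is a rough sketch of tallying which of your site's characteristics also show up among the reported losers. The feature names are made up; the real inputs would come from your own audit and from winners/losers data such as the Searchmetrics reports mentioned above.

```python
# Rough sketch of the "find the shared signature" exercise. All
# feature names are invented; real features would come from your own
# site audit and from published winners/losers analyses.

from collections import Counter

def common_signatures(my_features, loser_feature_sets):
    """Count how often each of my site's characteristics also appears
    among sites that lost visibility in the update."""
    counts = Counter()
    for loser in loser_feature_sets:
        for feature in my_features & loser:
            counts[feature] += 1
    return counts.most_common()

losers = [
    {"article_directory_links", "thin_pages", "paid_links"},
    {"article_directory_links", "doorway_pages"},
]
mine = {"original_content", "article_directory_links"}
print(common_signatures(mine, losers))
# [('article_directory_links', 2)] -> the first thing to investigate
```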
