The state of the Internet has changed. Google has gained so much power through their dominance as a search engine that when they make a change, entire businesses can crumble. A striking and ongoing example is Google's Panda series of updates.
Panda Intentions
Google has the best of intentions when they design their updates. Their motto is 'don't be evil', after all. Every decision behind an update is designed to cut the traffic driven to sites that try to game the system. Panda is meant to cut back on blog networks that publish no original content and exist solely to drive ad revenue. Anything you might find listed under a "black hat SEO" category is penalised. Sites that exist only to copy content from legitimate sites are penalised as well, though sometimes less effectively: it's hard for a search engine to tell at a glance where the content originated, and occasionally it will punish the wrong site.
The problem is that the entire SEO industry is built around Google's algorithm. No offence to Bing, Yahoo and the other search engines, but Google is king. Businesses rely on SEO to place their websites highly in Google's search results, which in turn drives traffic to their sites, and traffic turns into customers. Every time Google shakes things up, tactics that worked before may stop working, or may actively penalise the site employing them. Remember meta tags and keyword spam?
Thin Content
Panda, as Google's most recent string of updates is collectively known, focuses mostly on punishing sites with very little content. The goal is to filter out sites with no content, very little content, or content that is directly copied or spun. Spun content is simply copied content with enough of it changed to make it appear unique. Google hates spun content because, in their view, it exists purely to work around the algorithm. There are a few legitimate uses for spun or copied content, but for the most part Google is right.
The sites Google targeted with Panda are the pseudo blogs: blogs that scrape and spin content in order to build entire networks of linked sites. These networks could exert a powerful influence on other sites by offering backlinks, at least until Panda arrived. Google discourages artificial enhancement of search rankings, so they are trying to remove the influence of these networks.
The problem lies in identification. Put two sites side by side, each with one hundred pages of very similar content. Which one is the fake? Without a human eye on the pages in question, it's hard to tell which is the pseudo blog with a hundred pages of spun content and which is the commerce site with a hundred descriptions of similar products. One overly broad judgement call, and an entire business can lose its pagerank and a significant amount of traffic.
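To see why this is such a hard call to automate, consider a minimal sketch of a duplicate-content check: break each page into word shingles and measure their overlap. This is purely illustrative Python, not Google's actual algorithm, and the sample product descriptions are made up.

```python
# Illustrative sketch only: word shingles plus Jaccard overlap as a crude
# duplicate-content score. Not Google's algorithm; sample text is invented.

def shingles(text, size=3):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two legitimate product descriptions that differ only in colour and size
# overlap heavily, just as a scraped page would.
page_a = "Classic cotton shirt with button collar available in blue medium"
page_b = "Classic cotton shirt with button collar available in red large"
print(f"similarity: {jaccard(page_a, page_b):.2f}")  # high overlap, yet both pages are legitimate
```

A purely mechanical score like this cannot tell a catalogue of similar products from a network of spun pages, which is exactly the judgement problem described above.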
Likewise, syndication and the practice of "reblogging" have both become things Google penalises, whether or not that was the intention. Perfectly legitimate networks, particularly those run by news stations, tend to syndicate their content: the same piece can be published on multiple sites because those sites target different audiences but are owned by the same group.
To Google, that simply looks like a single piece of content being copied across a network of blogs. That's because it is; the difference is that the use is legitimate. Drawing that line is something Google is constantly working to improve. Again, Google doesn't want to penalise legitimate sites, but because of the vast amount of data they process every day, they have to paint in broad strokes.
Working With the Panda
Google's Panda update is here to stay. We're long past the point of no return. That means business owners, blog writers and anyone else running a legitimate website need to work with the Panda. Fighting it simply means more time spent with lower pagerank and lower traffic. Webmasters and business owners should audit their sites as soon as possible to bring them in line with Google's new rules.
The biggest thing Google is penalising is too-similar content. In some cases this is easy to fix: simply make the content more unique. Unfortunately, that's not always possible, as in the earlier example of product descriptions. The solution for now seems to be adding more unique content to the page around the content that has to stay static, lowering the percentage of the page that looks copied.
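As a rough way to gauge that percentage, you can compare a page against the static block it has to carry and see how much of the wording is actually unique to the page. The snippet below is a simple sketch with made-up product copy; it ignores HTML, stemming and everything else a real audit tool would handle.

```python
# Rough sketch: what share of a page's words are unique to it, versus the
# static block it must keep? Sample text is invented for illustration.

def unique_ratio(page_text, static_block):
    page_words = page_text.lower().split()
    static_words = set(static_block.lower().split())
    if not page_words:
        return 0.0
    unique = [w for w in page_words if w not in static_words]
    return len(unique) / len(page_words)

static_spec = "100% cotton machine washable imported fits true to size"
thin_page = "Blue shirt. " + static_spec
richer_page = ("Our blue oxford shirt pairs a soft brushed weave with a relaxed "
               "collar that works for the office or the weekend. " + static_spec)

print(f"thin page unique share:   {unique_ratio(thin_page, static_spec):.0%}")
print(f"richer page unique share: {unique_ratio(richer_page, static_spec):.0%}")
```

The thin page is dominated by the static specification, while the richer page carries enough original copy that the shared block becomes a small fraction of the whole.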
Copied content off-site is another huge problem, and one that's much harder to fix. When a site run by a Nigerian prince pops up and copies your content for use in a pseudo blog, how do you fix it? You can change your own content, but that's not a permanent solution; new scammers can simply copy the new content as well, and once you've rewritten your pages, the scraper's copy no longer even looks copied. Google has a page dedicated to reporting scraper pages, but if your content is old and has been copied many times, reporting it all can be very time consuming. Copyscape offers a service that searches for duplicated content, though it can be quite inaccurate if it searches too broadly.
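If you already have the text of a suspected scraper page, a quick local comparison can tell you whether it's worth the effort of filing a report. This is a hedged sketch using Python's standard-library difflib; the texts and the threshold are illustrative only, not a substitute for a service like Copyscape.

```python
# Compare your article against a suspected copy using the standard library.
# Texts and threshold are made up for illustration.
import difflib

def overlap_ratio(original, suspect):
    return difflib.SequenceMatcher(None, original, suspect).ratio()

my_article = "Panda rewards sites that publish fresh, genuinely unique content."
scraped_copy = "Panda rewards websites that publish fresh, genuinely unique content."

ratio = overlap_ratio(my_article, scraped_copy)
if ratio > 0.8:  # arbitrary threshold for this example
    print(f"{ratio:.0%} overlap: worth reporting via Google's scraper report form")
else:
    print(f"{ratio:.0%} overlap: probably not a straight copy")
```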
Unfortunately, in the end it all comes down to constantly generating new, unique content. Your old content is going to be copied eventually, and it will hold you back if you have nothing new. Old pages that still generate traffic can benefit from a little work to make them unique, but fresh unique content is what is most likely to keep your pagerank up. Combine that with a dedicated effort to remove low-content pages and spruce up pages with little unique content, and you can make peace with Panda in no time. Google wants to see you succeed in online marketing; they just don't want others to ride your coattails.
Your content is important in any local SEO strategy, especially when it comes to helping customers make that final purchasing decision. Speak to Online Ownership for a no-obligation chat.