Google penalties have become feared by many online marketers over the years. Google is working overtime to improve the quality of the results given to any specific search query. In doing so, they have eliminated a lot of poorly built websites and websites that are of little worth to consumers.
Now, the Google spam team is trying to penalize those who use “spammy” tactics like keyword-stuffing their website, using copy/pasted content or spun content that is illegible, or using automated tools to blast thousands of low quality links at their site.
At the same time, many legitimate businesses who did their best to abide by Google’s rules have also fallen victim to the penalties in one way or another.
Avoiding a Google Penalty
It seems pretty straightforward… Play by the rules and nobody gets hurt, right? Well, it’s not that simple. If you don’t know all of the rules, how can you abide by them? This is especially difficult when Google is constantly updating their rules.
It is important to note that not all Google penalties are created equal. Some are obvious and others are not so easily detected. There are two main types:
Manual penalties – These happen when a quality control specialist reviews your website and determines it to be of very poor quality, in direct violation of Google’s TOS, or “pure spam”. These penalties are among the worst and typically result in the site being de-indexed, or removed from Google’s index completely.
Recovering from a manual penalty – The first thing to do is to pinpoint why your site was penalized in the first place. To find out whether you have a manual penalty and what that penalty may be, simply log in to your Webmaster Tools account and click on Search Traffic > Manual Actions. If you have a manual penalty, it will be listed here. If not, great! Move on to the next section.
If you have a manual penalty, this page will tell you what it is. Learn about the penalty so you can go fix the problem. Then, you will need to contact Google from this same place in your Webmaster Tools dashboard and request reconsideration. Be sure that you are following all of their guidelines before proceeding for the best possible results.
Algorithmic penalties – These are applied automatically when the Google bot crawls your website and detects certain aspects that have been deemed to be in violation of their Terms of Service. These penalties can affect individual pages or the entire site.
These penalties will negatively affect your site’s rankings; the degree depends on the severity of the violation as judged by the algorithm itself. It is not uncommon to receive a partial penalty, or even multiple partial penalties. In fact, many sites on the Internet today are affected by one or more partial Google penalties without even knowing it.
While Google is always releasing updates to their algorithm, there are some major updates that have made a fairly big impact on the search results.
This guide will shed some light on what each update looks for and how to avoid receiving a penalty.
The Major Updates
The following updates have become the most well known and have even been given cute names, even though their effects on many websites’ search engine rankings were anything but cute.
Panda – First implemented in 2011, this algorithmic update was aimed at sites with thin content, content farms, and sites with abnormally high ad-to-content ratios, among other on-site quality issues. It took at least a couple of months to completely roll out and affected up to 12% of all websites indexed in Google.
The Panda penalty still exists and has had multiple updates, released three or more times annually – the most recent of which came in July 2015.
How to avoid a Panda Penalty – Keep your on-site content unique, legible, and on-topic. Avoid publishing a page with too little content. Google has not shared an official number of words that is considered to be “thin”, but a rule of thumb that has served us well is a minimum of 300 words.
Also, do not try to stuff keywords into the content. If you write like you are talking to a real person, you should be fine. The only thing keyword stuffing your on-site content will do is increase the chances of triggering a penalty. This includes meta data such as your title tag and meta description.
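For illustration, here is a sketch of naturally written meta data next to the keyword-stuffed kind to avoid; the business name and city are placeholders:

```html
<!-- A sketch of naturally written meta data; the business name and city are placeholders. -->
<title>Acme Roofing | Roof Repair and Replacement in Seattle</title>
<meta name="description"
      content="Family-owned Seattle roofing company offering repairs, replacements, and free inspections.">

<!-- What to avoid – a keyword-stuffed title that reads like a list of search terms: -->
<!-- <title>roofing seattle roofer seattle best seattle roofing company roofs seattle</title> -->
```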
Finally, keep your advertisements to a minimum, and for the most part try to keep them “below the fold”, where the user has to scroll down to see them.
Penguin – In April 2012, an algorithmic update, also known as the web spam update, was released focusing on penalizing sites that were using “spammy” back links to trick the algorithm into ranking them highly. This update has also received ongoing refreshes, the most recent of which took place in December 2014.
How to avoid a Penguin Penalty – Keep your back links looking natural – use your brand name or a bare URL when linking back to your site. If you share content on social media, use the bare URL to point back to it. Never buy links, and don’t create links that all use your keyword as the anchor text. According to Google’s TOS, if you create a link yourself, it should include a “nofollow” tag, which tells the Google bot not to follow it.
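For example, a self-created link marked with the nofollow attribute looks like this (the URL is a placeholder):

```html
<!-- A self-created link with rel="nofollow", telling the Google bot not to follow it
     or pass ranking credit through it. The URL is a placeholder. -->
<a href="http://www.example.com/" rel="nofollow">My Example Site</a>
```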
The safest way to go is to distribute your content on social media platforms; when someone finds it useful, they will be inclined to link back to it. Never share your content on article directories or other forms of what Google has deemed to be “content farms”. Never trade links. Google can see this and will de-value those links. If you have them in excess, you risk triggering a penalty.
Finally, avoid having any hidden text or hidden links on your website. Also, if a site links out to yours, make sure that those links are not hidden from plain sight.
While you cannot always control who links to your site, it is important to monitor your back links. These days there are spam bots designed to hack websites and place hidden text or links on them. Some will even blast your site with poor quality links.
While Google is aware that this exists, there is no guaranteed way to avoid it. The best we can do is ensure our website security is optimal and routinely monitor the back links pointing to our site.
Hummingbird – In September 2013, Google announced the release of this update. While this was the biggest update since 2001 (affecting approximately 90% of search results, according to Matt Cutts, head of the web spam team at Google), it did not have the drastic effects that Panda or Penguin had. It was more of a revamped version of the core algorithm. This is when semantic search was introduced, whereby Google attempts to understand the meaning of the content on a page or site.
Meaning has always been, and still is, difficult for computers to grasp. Therefore, all of the content on a page, along with the pages connected to it, is considered to help Google categorize and understand what the page means. Google looks at everything on the page, including image alt text, microdata, and structured data.
Avoiding a Hummingbird Penalty – Luckily this is not a penalty per se. It is a major update that moves toward a semantic web in which websites can be fully understood by search engine robots. To help those bots learn more about your site, you can implement structured microdata; the most common vocabulary is schema.org markup.
The idea behind this is to add snippets of code to your site which cannot be seen by a visitor, but can be read by search engine robots. This will tell the bot what each piece of content is and what it is related to so that it can better be understood, categorized, and indexed.
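For illustration, here is a minimal sketch of schema.org microdata describing a local business; the name, address, and phone number are placeholders:

```html
<!-- A minimal sketch of schema.org microdata for a hypothetical local business.
     The name, address, and phone number are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Roofing</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Seattle</span>,
    <span itemprop="addressRegion">WA</span>
  </div>
  Call us: <span itemprop="telephone">(206) 555-0123</span>
</div>
```

A visitor simply sees the name, address, and phone number; the itemscope and itemprop attributes are what the search engine robots read.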
Now, if you are not good with coding, don’t fret. You can increase Google’s understanding of your content by simply adding more content that uses closely related terms, adding relevant alt text to the images on your site, and linking out to related resources.
Pigeon – An algorithmic update released in July 2014 that aimed to increase the relevancy, accuracy, and usefulness of local search results. It strengthened geo-location as a ranking factor, giving higher relevance to businesses located closer to the center of the city they are targeting. For example, it is now much harder for a Seattle roofer to rank for San Diego roofing.
This sparked the beginning of a string of updates, which will continue to roll out, targeting “spammy” local marketing tactics. While Google was very busy punishing sites that manipulated their core algorithm, local websites were still getting away with these very same “spam” tactics.
Avoiding a Pigeon Penalty – Follow the guidelines set forth in the sections above. All the rules are basically the same. Triggering a penalty for a local business website may be more difficult to do, but as Google continues to release more Pigeon updates, you can be sure that the guidelines will get stricter. So, play it safe now and abide by their rules to avoid a penalty down the road.
Mobilegeddon – Released in April 2015, this update was aimed at mobile-friendly search results, since over 50% of searches now happen on mobile devices. While desktop search results were not affected, many sites that did not meet Google’s mobile-friendly guidelines suffered a visibility drop in searches performed on mobile devices. And with half of all searches being on mobile, a drop in mobile visibility still has a knock-on effect on a site’s overall traffic.
Could this be the first of many mobile friendly algorithmic updates? Only time will tell.
Avoiding a Mobilegeddon Penalty – Make sure your site is mobile friendly or has a mobile friendly version live on the web.
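At a minimum, that usually means a responsive design. A sketch of the viewport meta tag that responsive, mobile-friendly pages include in the page’s head:

```html
<!-- The viewport meta tag used by responsive pages, typically paired with CSS media
     queries so the layout adapts to the width of the screen. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

You can also run any page through Google’s Mobile-Friendly Test to check whether it passes.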
Recovering From a Google Penalty
While it is always easier to avoid being penalized in the first place, inevitably, some websites will get slapped with one penalty or another and suffer a rankings loss as a result. To begin recovery, it is advisable that you first determine what sort of penalty you have. Is it an on-site issue or an off-site issue?
Once you determine the potential problem, it is time to clean out anything Google may see as spam. Following the basic guidelines for avoiding penalties listed above will be good enough in most cases. It is important that you look at every single page on your website.
All too often, there will be some pages that you didn’t even know existed. These can include image pages, ads, forms, etc. Every page needs to have unique, relevant content on it, or it should be set to “noindex”.
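Setting a page to “noindex” is done with a robots meta tag in that page’s head; a sketch:

```html
<!-- A robots meta tag telling search engines not to index this page – useful for thin
     pages such as image attachments, ads, or form-only pages. -->
<meta name="robots" content="noindex">
```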
As for off-page penalties such as poor quality back links – it is advisable that you locate the links in question and attempt to contact the owner of the linking website to politely ask them to remove the link. If you have no luck having links removed in this manner, you can then turn to Google’s “disavow” tool, which allows you to list links that appear to go against Google’s terms of service so that Google will ignore them (for the most part) when crawling your site.
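The disavow tool accepts a plain text file listing individual URLs or whole domains to ignore. A sketch of what that file might look like (the URLs and domains below are placeholders):

```
# Sketch of a disavow file; the URLs and domains are placeholders.
# Lines beginning with # are comments and are ignored.

# Disavow a single page that links to the site:
http://spam-site.example.com/page-with-bad-link.html

# Disavow every link from an entire domain:
domain:link-farm.example.net
```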
The link removal/disavow process can be very time consuming. Therefore, it is a better idea to take all precautions to avoid a potential penalty rather than go through the process of recovering from one.
My On-Page and Off-Page Elements Look Good – Now What?
Now, you wait… if it was a partial penalty plaguing your website, the next time the Google bot crawls your site, the penalty should be lifted. However, if it was a major penalty such as Panda or Penguin (which you can assume if you had a huge rankings drop), you will need to wait for the next refresh.
Basically, every few months or so, Google will release a Panda or Penguin refresh to update their index of sites. There is no knowing when the next refresh will come, and Google does not typically announce them. You could be waiting anywhere from a month to a year… Penguin went a full year between refreshes, from October 2013 to October 2014, but many sites’ rankings were restored when the refresh finally came.
In the meantime, if you are waiting for a refresh, continue adding fresh content to your site as you normally would. Share it on social media and interact with people as you normally would. When the time comes and a refresh occurs, you don’t want to be left facing other penalties.