“It’s a Google Penalty.”
Followed closely by “we’re out of coffee”, these are four words that every digital marketer hates hearing.
Receiving any Google penalty can be damaging to a website, from algorithmic updates that will decrease your rankings a little, to more serious manual actions that will banish you from the search results entirely. They have the potential to be very harmful to a business, especially one that relies on being found through Google and selling over the web.
We’ve looked at this in the past and have made an updated list to, as the kids put it, get with the times (thanks go to Tad Chef for his original post back in 2011).
The first point to clear up is that there is more than one kind of penalty: algorithmic and manual, and in Google’s eyes, only a manual action is considered an actual penalty. Both are harmful to your website, but will do varying levels of damage. For most users, any drop in visibility and organic traffic as a result will be called a penalty, even if they’ve actually been caught in one of Google’s algorithmic filters. We’ll start with the greater of the two evils: a manual penalty.
Matt Cutts has told us that Google’s definition of a penalty is when manual action is taken against a website – meaning an actual human being has analysed your site, and decided to take action against it. This type of penalty can be extremely harmful for a website, and is only applied when a website is clearly breaking Google’s rules. There are two types of manual penalty actions, ‘site-wide’ and ‘partial’.
The length of time that a manual penalty can be applied for is based on how badly the guidelines were broken, and can only be removed once Google feels the website has taken sufficient action to rectify the issues.
While the penalty can be removed, this in no way guarantees that you will recover your prior visibility. You will be able to rank once again, but needless to say, the penalty will have had a substantial effect on your visibility in the meantime.
The lesser of the two evils, an algorithmic ‘penalty’ is different to a manual one, in so far as it will not show up in Google Search Console, and it will often not throw you from the rankings entirely. Algorithmic ‘penalties’ (technically filters designed to catch manipulative tactics or low-quality content) are the result of updates to Google’s search algorithm; these updates look to penalise sites for doing something wrong, while rewarding those doing things well.
For example, Penguin, originally released in April 2012, was aimed at cracking down on the over-optimisation of links (overly commercial and unnatural anchor text, comment spam, or large amounts of paid links, for example), so sites that looked unnatural were affected, thus rewarding those lower down in the rankings.
How to spot if you’ve been penalised
Spotting a manual penalty is easy, for two reasons: your traffic will drop, often significantly, and you will find a message in your Google Search Console account.
Algorithmic penalties aren’t so easy to spot, especially working out which update or filter has affected you. The easiest way to tell whether you’ve been hit is to check your analytics data: you will be able to see where your traffic dropped and correlate this with when algorithms like Panda and Penguin were released. There are certain tools that can help you spot whether an algorithmic update has affected your site, like FE International’s.
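As a rough illustration of that correlation step, here’s a minimal Python sketch that flags week-over-week traffic drops landing near known rollout dates. The dates, threshold, and function name here are illustrative assumptions only; check a maintained update timeline for the real dates.

```python
from datetime import date, timedelta

# Illustrative rollout dates only, not a complete or authoritative list.
UPDATE_DATES = {
    "Penguin 1.0": date(2012, 4, 24),
    "Panda 4.0": date(2014, 5, 19),
}

def drops_near_updates(daily_sessions, threshold=0.3, window_days=7):
    """Flag week-over-week session drops that coincide with an update.

    daily_sessions maps date -> session count, e.g. exported from your
    analytics package.
    """
    flagged = []
    for day, sessions in sorted(daily_sessions.items()):
        prior = daily_sessions.get(day - timedelta(days=7))
        if not prior:
            continue  # no baseline a week earlier to compare against
        drop = (prior - sessions) / prior
        if drop < threshold:
            continue  # not a big enough fall to worry about
        for name, rollout in UPDATE_DATES.items():
            if abs((day - rollout).days) <= window_days:
                flagged.append((day, name, round(drop, 2)))
    return flagged
```

A 50% fall in the week of Penguin’s rollout, for example, would be flagged against that update, which at least tells you where to start digging.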
How to recover
Recovering from a penalty is not easy. If you’ve been slapped with a manual penalty, don’t expect it to go away quickly. You will need to understand exactly what you’ve done to earn Google’s ire, and then prove that you have taken appropriate action to rectify the issue and that you are, indeed, very, very sorry.
If you’ve been struck by an algorithmic penalty, you need to understand what algorithm you’ve been penalised by, what you can do to help recover your rankings, and then hope for an update or for your site to no longer be caught in the filter.
Let’s now look at the real reason you’re here, to find out what will get your site penalised.
Google states that buying links is against its rules, although it’s still a popular tactic for those willing to risk the fallout if caught, and many site owners are taken in by sellers promising that their paid links are undetectable (hint: they’re usually not).
Buying links can get you penalised by both the Penguin algorithm and in severe cases you can even receive a manual penalty.
To a certain extent, link exchanges and reciprocal links can be natural. For example, bloggers regularly link out to other bloggers who have linked to them. The problem comes when this happens on a substantial level; exchange schemes and networks can be detected algorithmically, so you’ll likely end up getting penalised sooner or later.
On a page regarding reciprocal links, Google even asks you to report any sites that are participating in schemes to manipulate PageRank, so this may or may not lead to Google giving reported sites manual actions…
We’ve seen first-hand that brands with a large number of links using exact-match anchor text get penalised. If, for example, you’re an SEO company, and your backlink profile is made up of links that use the anchor text ‘SEO Agency’, then you are, in the words of South Park, ‘gonna have a bad time’. Your backlinks should use a natural mix of terms, including a significant number of branded phrases, as well as generic terms, as this is how many people will introduce your site.
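To see how skewed a backlink profile is, a quick distribution check along these lines can help. This is a sketch only: the function name is mine, and the ‘commercial’ terms are whatever money keywords apply to your own site.

```python
from collections import Counter

def anchor_text_profile(anchors, commercial_terms):
    """Rough distribution check on a backlink profile's anchor text.

    anchors: list of anchor-text strings from your backlink data.
    commercial_terms: phrases you consider 'money' keywords.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    # Count how many links use a commercial phrase in their anchor text.
    commercial = sum(n for text, n in counts.items()
                     if any(term in text for term in commercial_terms))
    return {
        "total": total,
        "commercial_ratio": round(commercial / total, 2),
        "top_anchors": counts.most_common(3),
    }
```

If the commercial ratio dwarfs your branded and generic anchors, that profile is unlikely to look natural.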
An English site that gets a lot of its links from Russian or Chinese websites is likely going to be flagged up. Google is smart enough to compare the language of your website to those that are linking to you, especially if the (translated) anchor text makes little contextual sense.
Though not a common occurrence, rented links are links that websites pay for over a set period of time to boost their rankings. The first, and most serious, problem is that these are essentially paid links, and we know that Google really doesn’t like those; if you’re caught, it may slap you with a big manual penalty.
The second issue is that when you come to the end of this period, those links disappear, and you either lose that authority boost or keep paying to keep your site up in the rankings, which is a bit of a double-edged sword.
Back in September 2014, Google reportedly took action against a number of large blog networks (also known as PBNs), by handing out a number of manual actions to the sites for ‘thin content’ spam.
Private Blog Networks are usually sets of blogs or websites controlled by one party, and the goal of these sites is to build up links within the network to help certain content rank better in the search results. Some webmasters and SEOs use these networks to artificially manipulate their rankings, either by linking to target pages or sites, or even through comment spam, although there are PBNs that work, and are used more ethically, as a primary way to rank.
The topic of much debate, guest posting was crucified back in January of 2014, when Matt Cutts declared it all but dead. As with many SEO tactics, it had become overused, at which point Google stepped in.
Why is guest posting seen as bad? It became an easy way to get links on an external website, and in many cases these opportunities were paid for or on websites on unrelated topics. So if you’re paying websites to allow you to write rubbish guest posts with links back to your site, you might want to reconsider.
It should also be noted that guest blogging done correctly is a great way to reach your target audience and to get your name out there. Despite the myth, good guest blogging isn’t dead.
The more links the better isn’t always correct, and even good links gained too fast can result in a penalty. Google not only checks the quality of your incoming links, but the rate at which you’ve acquired them.
If you’re gaining a lot of links over a sustained period of time (I assume that Google must somehow take viral success into account), this is going to look somewhat unnatural, and Google is likely to assume that you are gaining these links in ways that break its rules. If Google feels you’re getting more links than you deserve (even if they’re perfectly legitimate), you, my friend, are risking a penalty.
While we want to link our pages together across the site, it’s easy to accidentally make this look unnatural. For example, if you have a link to an external site in your footer that appears on every single page of your website, Google could see this as manipulative. So if you’re going to include these links, such as a link to your developer, make sure it’s nofollowed.
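If you want to audit this at scale, a small script can flag followed external links sitting inside your footer markup. This is a minimal sketch using Python’s stdlib HTML parser; it assumes your footer uses a `<footer>` element, and `own_domain` is a placeholder for your own hostname.

```python
from html.parser import HTMLParser

class FooterLinkChecker(HTMLParser):
    """Collects followed external links found inside a <footer> element."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.in_footer = False
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "footer":
            self.in_footer = True
        elif tag == "a" and self.in_footer:
            href = attrs.get("href", "")
            rel = attrs.get("rel", "")
            # External link with no rel="nofollow" inside the footer.
            external = href.startswith("http") and self.own_domain not in href
            if external and "nofollow" not in rel:
                self.flagged.append(href)

    def handle_endtag(self, tag):
        if tag == "footer":
            self.in_footer = False
```

Feed it a page’s HTML and anything left in `flagged` is a footer link worth nofollowing or removing.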
Google hates hidden things, especially links (and text), and it’s a violation of their guidelines. Commonly, web spammers will hide links in a number of ways, including white-on-white hyperlinks so they’re invisible, using CSS to make the links tiny, or just linking from a single character in a sentence, and these links will likely point to porn, Viagra, or casino websites. Google hates these types of links because they’re deceptive, and they ruin the user experience.
One easy mistake we’ve seen a number of webmasters make is installing non-verified plugins on their site without checking the source code. Some of these tools, services, and plugins have sneaky business models where they sneak a hidden link into their offering, such as a CSS menu, sidebar widget, or visitor counter. We’ve also seen websites receive penalties through links within widgets that had over-optimised commercial anchor text.
Sometimes these links are not only hidden; they are also off topic and downright spammy. So make sure to check the source code of anything you add to your site.
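Checking the source code for the common hiding tricks can be partly automated. The patterns below are crude illustrative assumptions, not an exhaustive list, and a real audit would also render the page, since the offending CSS can live in external files.

```python
import re

# Inline-style patterns that often accompany hidden links or text.
SUSPICIOUS = [
    r'display\s*:\s*none',
    r'visibility\s*:\s*hidden',
    r'font-size\s*:\s*0',
    r'text-indent\s*:\s*-\d{3,}px',  # text pushed far off screen
]

def hidden_style_hits(html):
    """Return the suspicious style patterns found in a page's markup."""
    return [p for p in SUSPICIOUS if re.search(p, html, re.IGNORECASE)]
```

A hit doesn’t prove spam (plenty of legitimate scripts toggle `display:none`), but it tells you exactly where to look.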
While Matt Cutts has said that you needn’t stress about duplicate content (unless it’s spammy or stuffed with keywords), it can definitely have an effect on your site, even if it’s not classed as a penalty. The problem is that Google needs to know which URL to show in the SERPs, and if it struggles to choose between versions of your content, it might choose someone else’s entirely.
Whether it’s an actual penalty or not, as Dr Pete puts it “If a page isn’t ranking (or even indexed) because of duplicate content, then you’ve got a problem, no matter what you call it”.
The Panda update rolled out in 2011 to target content lacking in substance. When Panda 1.0 was released, it did a fairly good job of identifying ‘thin’ content, and as subsequent updates were released, Google got better at understanding not only the quality of the copy, but more importantly, the value that the content gives the user. Panda targets websites with large amounts of content lacking depth, and treats them similarly to overt spam techniques. More recent updates, such as the Quality Update, have taken Google’s quest to reward content that serves the user well even further. Panda updates and refreshes happen regularly, and have become more sophisticated.
Examples of thin content are pages with automatically generated content, thin affiliate pages, doorway pages, as well as pages with scraped content or low quality blog posts. Unfortunately it can sometimes be easy to overlook these pages on your website, and Google has stated that it can and will give out manual penalties for this as well, so it’s a very serious issue to look out for on your site.
Google likes to see content that is not only relevant and of a good length, but is going to be of use to your users. So get your whey protein out, do your keyword research, and bulk out that content.
Scraped content (i.e. content taken from someone else’s website) displayed on your website is enough to get you a good old penalty. Don’t do it. Enough said.
Content that’s written in broken English can be seen as scraped and then ‘spun’ (where words are replaced with synonyms to try and fool Google – pro tip: Google ain’t no fool). So make sure a human being can understand what you write, because a human quality rater hired by Google (yes, that’s a thing) might not see it too favourably.
Imagine walking into a pub and ordering a pint, only for the barman to give you a glass full of Viagra: not cool, right? You were expecting a pint, because the shopfront said ‘Pub’, so why did you get a glass full of Viagra? The same goes for websites: rigging your site so that Google sees one thing and your users another is not cool, and Google hates it. Funny story: Google once banned itself for cloaking.
Another form of cloaking is hiding text within your web page to help it rank. This is bad. It’s commonly done by hiding white text on a white background, manipulating the CSS so the text is hidden off screen, or even just placing an image over the text. Google’s pretty good at detecting this these days, and it will not hesitate to slap you for it.
Cloaking is not something that is generally done by accident, and this is why it carries such a heavy penalty. You’re deliberately trying to game and manipulate the search engine results. If you’re caught, don’t expect much mercy.
Article spinning is the process of generating ‘rewritten’ versions of content through the use of synonyms (which hardly ever make sense anyway), and then placing these articles on low-quality directories and article websites. Pre-Panda, this would have worked, but don’t even think about it now. It gains you nothing (except penalisation), and it gives us SEOs a bad name!
Don’t even. Just don’t.
This has only been hinted at by various sources, but some have said that if you’re caught actively giving out blackhat SEO advice, Google may choose to give your site a penalty, as you’re technically telling people to break Google’s guidelines.
Google doesn’t like doorway pages. It has a ‘long-standing view’ that doorway pages created solely to manipulate search engines can harm the user’s experience, and because of this it has said that sites which do this may see a reduction in rankings.
Doorway pages are pages (or even whole websites) that are created solely to rank high in the search results, but instead of providing the user with information that they wanted to find, they send them through to another part of the website – like a doorway!
An old tactic, and one that we still see far too often, is the over-optimisation of a website’s footer. What websites do is use keyword-rich anchor text for internal links, trying to rank for those keywords by using them as many times as possible. But these days, if you’ve got a large number of keyword-rich anchor links in your footer, you’re heading for a nice bit of penalisation.
Over-optimisation in any aspect doesn’t work anymore, and it will get you in trouble, so cut it out.
Back in the early days of the internet, and for a long time afterwards, keyword stuffing was how your website got ranked: more keywords = better rankings. As time went on, though, Google realised that this couldn’t work, as it was obviously easy to manipulate, so along came Panda to destroy those sites like canes of bamboo.
Search engines tell you to use the words that you want your pages to be found for, so naturally some new site owners assume that simply writing the same phrase repeatedly in their content is going to rank them. Wrong. While you have to go to quite an extreme to see an actual penalty, chances are that if your content repeats the same phrase 30 times, it’s not going to be great content. There’s no set rule on how many times is too many, but try to use natural language to avoid this. Google is much better at understanding contextual relevance these days, grouping your keywords into topical buckets rather than matching the exact phrase.
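There is no official density metric, but a rough count of how often a phrase appears per 100 words can at least catch the egregious cases. This sketch (the function name and the per-100-words framing are my own conventions) is a sanity check, not a ranking formula.

```python
import re

def phrase_density(text, phrase):
    """Occurrences of a phrase per 100 words of copy."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window over the text and count exact phrase matches.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return round(100 * hits / max(len(words), 1), 1)
```

If a money phrase accounts for a third of your copy, no algorithm update is needed to tell you the content reads badly.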
While too many 404s are unlikely to get you a manual penalty, the fact remains that if your site is full of them, Google is going to see this as a mark of low-quality, and this in turn may hurt your rankings through a quality filter.
Keep an eye on your 404 errors in the Google Search Console or by crawling your site with Screaming Frog, and redirect or fix those where appropriate.
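Alongside GSC and Screaming Frog, a quick script can spot-check a list of URLs for 404s. This is a sketch using only the standard library; the `fetch` parameter is an assumption of mine so the status check can be swapped out (or stubbed) rather than making live requests.

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def broken_urls(urls, fetch=None):
    """Return the URLs from `urls` that respond with a 404 status."""

    def default_fetch(url):
        # Real request; HTTPError carries the status for 4xx/5xx responses.
        try:
            return urlopen(url).status
        except HTTPError as e:
            return e.code

    fetch = fetch or default_fetch
    return [u for u in urls if fetch(u) == 404]
```

Run it against your sitemap URLs periodically, and redirect or fix whatever it returns.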
No-one uses them, surely… Meta keywords were once a good way of helping search engines understand your website through a handful of keywords, but of course these began to be manipulated, so Google did away with them.
While Google doesn’t penalise you for these, Bing has said it sees it as a mark of low-quality, so just get rid of them unless your site uses them for internal search mechanics. It will take you 5 minutes and covers all bases, just in case.
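Finding the tag to remove takes one regex. A sketch, with the usual caveat that regex HTML parsing is fragile: this version assumes the `name` attribute comes before `content`, which is how the tag is almost always written.

```python
import re

def find_meta_keywords(html):
    """Return the content of a <meta name="keywords"> tag, or None."""
    m = re.search(
        r'<meta\s+name=["\']keywords["\']\s+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return m.group(1) if m else None
```

If this returns anything on your pages, that’s the line to delete from your templates.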
This is an unfortunate one, because it is not your fault. If your site is hacked, hackers will usually insert malicious content onto your website, which can be used to collect email addresses and the like. They may also insert links to malicious and spam websites on your popular pages, as well as create completely new pages on your site, which are optimised to rank for certain terms. For instance, we have seen hacked sites where thousands of pages around designer handbags were created, ranked, and subsequently caused the website to incur a penalty.
Good web hosting companies should notice this and pull down your site before any serious damage is done, but you can’t always guarantee this. Keep an eye on your site by regularly doing a crawl.
The most recent of Google’s updates, Mobilegeddon was aimed at improving the search results for those on mobile, by ranking user-friendly mobile sites higher (only in mobile search results, though).
While you will not get an actual penalty for not having a mobile friendly site, you may well see a drop in traffic from mobile users, and depending on how users access your website, this could be a large proportion. Making your website mobile friendly is not an overnight job, but there are ways you can begin to improve.
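One very first box to tick is the responsive viewport tag. This crude regex check is a sketch only (a hypothetical helper, not Google’s actual test, which renders the page):

```python
import re

def has_viewport_meta(html):
    """Smoke test: does the page declare a device-width viewport?"""
    return bool(re.search(
        r'<meta\s+name=["\']viewport["\'][^>]*width=device-width',
        html, re.IGNORECASE))
```

It won’t tell you the site is mobile friendly, but a missing viewport tag almost guarantees it isn’t.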
Buying a new domain is not easy, and unfortunately, some domains come with a bad history that new webmasters may not know how to spot. There have been countless cases of webmasters buying old domains, only to find that they have been given a manual penalty for shady tactics, and they subsequently have to go through the hard task of having it removed.
Google’s Matt Cutts gave some good advice on this topic, although sometimes it may be better to cut your losses and start again…
Google released a page layout algorithm in 2012, known as the top-heavy update, that was aimed at penalising sites with too many adverts above the fold, or where the content on the page was hard to find. This won’t get you a manual penalty, but it can cause your pages to rank less highly.
So there we have it, a fairly exhaustive list of ways that your website could get penalised!