When calculating its rankings, Google relies on algorithms designed to get smarter over time. But the technology isn’t always enough. Particularly when it comes to user feedback, human reviewers still need to get involved and adjust the algorithms accordingly. There’s also the webspam team, responsible for dishing out manual penalties. But what exactly are these penalties? And, more importantly, what do websites do wrong to deserve them? Read on to find out!
Algorithmic downgrades vs manual penalties
Google downgrades websites when one of its algorithms detects something it considers negative. Panda and Penguin are examples of such algorithms, focusing on content quality and backlink quality respectively.
“If you get caught by this sort of penalty, it takes a lot of work to recover.”
To make matters worse, webmasters are not directly informed of the downgrade. They will notice a drop in their website’s rankings but then often have to run a series of tests to work out exactly what they have been penalized for.
Manual penalties, on the other hand, are often the result of a spam report, reviewed by Google staff and acted upon where necessary. The good thing about manual penalties is that useful background information is usually provided in Search Console, making it easier to find out what you’ve done wrong and how to recover quickly. This is done by correcting the relevant issues and submitting a reconsideration request which, if successful, lifts the penalty.
However, don’t assume that once a penalty (manual or algorithmic) has been lifted, your website will rank just as well as it did before! In most cases, the previous good rankings were a direct result of whatever dubious tactic was being used. Once that approach has been abandoned, a key driver of those rankings disappears with it. The only way forward is to work on improving the quality of your content. Remember, just because a website has been penalized once doesn’t mean that Google will never trust it again! On the contrary, the search engine wants webmasters to stick to white-hat techniques and produce good content, which it can then reward with good rankings. We’ve put together a few common stumbling blocks which you should aim to avoid.
Unnatural backlinks
Unnatural backlinks – the classic mistake! Everyone knows that backlinks are essential for good SERP rankings, but they need to at least seem natural. Otherwise, it won’t be long before Penguin catches up with your site, or a member of the webspam team reviews it and applies a manual penalty. When this happens, you’ll need to use the disavow tool in Search Console to deal with the problem links. It doesn’t hurt to divide the links into different categories and, if necessary, submit several reconsideration requests until every problematic link has been dealt with.
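For reference, a disavow file is just a plain text file uploaded through the disavow tool. A minimal sketch might look like this (all domains and URLs are purely hypothetical):

```
# Paid links from these domains could not be removed
domain:spammy-directory.example
domain:cheap-link-network.example

# Individual URLs where removal requests went unanswered
https://blog.example/sponsored-post-about-us.html
```

Lines beginning with # are comments, the domain: prefix disavows every link from that domain, and full URLs disavow individual pages.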
Doorway Pages
So-called “doorway pages” are another big issue. These are usually created to target searches for the same service in different locations, or for similar services in the same location. Google sees no value in pages that are almost identical apart from the URL, so it either ignores them all or the website behind them receives a manual penalty. In this case, it’s best to consolidate several regions onto one page, since five similar pages present less of a problem than fifty! Alternatively, add information to each regional page that is genuinely unique to that branch, such as the local team, address and opening hours. This shows both Google and the user that your sub-pages offer genuine value.
Cloaking
Cloaking refers to the practice of showing search engines different content from what users see. This can be done by redirecting users straight to another page upon arrival or by displaying a paywall. While the former should be avoided, the latter can be used in conjunction with a first-click-free rule, meaning that users coming from search engines only see the paywall after a certain number of clicks.
There are exceptions, of course. If an age check or something similar needs to be displayed for legal reasons, this is ok for Google too. If content is personalized, the Googlebot should be treated as a first-time visitor and shown the same content as any user visiting the site for the first time. Cloaking isn’t as black and white as other techniques on this list; it depends on the situation.
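To make the “first-time visitor” principle concrete, here is a minimal sketch, assuming a small Flask site where personalization depends on a returning-visitor cookie rather than the user agent (the route, cookie name and page content are purely hypothetical):

```python
from flask import Flask, request

app = Flask(__name__)

ARTICLE = "<p>The full article, served to every first-time visitor.</p>"
EXTRAS = "<p>Welcome back! Here are recommendations from your last visit.</p>"

@app.route("/article")
def article():
    # Personalization is keyed off a cookie, never off the user agent,
    # so Googlebot sees exactly what a first-time human visitor sees.
    if request.cookies.get("returning_visitor") == "1":
        return EXTRAS + ARTICLE
    return ARTICLE

if __name__ == "__main__":
    app.run()
```

Either way, the baseline content is identical for every visitor; only the optional extras change.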
Spam in structured data
The first impression made by an organic search result determines whether a user clicks or not. This is where structured data comes into play, enriching search results with a richer description or a star rating. Of course, this is open to spamming, so you need to make sure that star ratings, for example, only appear where they are backed up by the content on the relevant sub-page. Keyword stuffing in titles and descriptions should also be avoided.
Otherwise, Google begins to mistrust all the content derived from your structured data and may simply ignore your rich snippets altogether. Ultimately, by trying to turn an average snippet into a great one, you could end up losing the snippet entirely and coming out worse overall. So ask yourself: is the spam really worth it?
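For illustration, a legitimate star rating would be backed by schema.org markup along these lines, where the product name and figures are purely hypothetical and must match the reviews actually shown on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Garden Chair",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "38"
  }
}
</script>
```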
Thin content and spam
Let’s finish with another classic: thin content and spam. This covers everything from keyword stuffing and white-on-white text (content hidden by matching its colour to the background) to pointless SEO filler and any other tactic that belongs in 2005. These tricks might earn a decent ranking for a while, but they rarely deliver anything consistent or long-term. Sooner or later, Google will notice and downgrade your website, either via Panda or with a manual penalty. Content is king! There really is no other way!
Conclusion
Instead of wasting your time trying to trick the search engine, it’s time to revamp your entire approach. After all, what’s the point in tricking Google into sending thousands of visitors to your site if they then fail to convert and simply disappear again? It might take more effort, but it’s far better to invest your time in creating a brilliant website that is informative for your users and technically sound for the search engine. Then your rankings will take care of themselves.