Penguin 4.0: Everything you need to know

Algorithm Update Roll Out

Unless you were living under a rock, you couldn’t have missed the single most important piece of SEO news last week: the Penguin update, known as ‘Penguin 4.0’.

After close to 700 days of waiting, Google quietly rolled out tests and deployment of this much-anticipated update. Webmasters from all corners of the world saw highly unstable hikes and dips in their search rankings, a phenomenon known as the ‘Google dance’.

The dancing search rankings differed from the usual spikes and dips: results would fluctuate several times a day for a period of 7–14 days, depending on who you ask.

Google finally broke its silence on the matter with an official announcement that the update was live and running in real time. This came just a few days after Google’s John Mueller had said that, as far as he knew, there were no announcements about an update, much to the chagrin of webmasters everywhere.

What exactly is the Penguin Algorithm?

The Penguin algorithm is a complex set of algorithms targeted at off-page signals. While this is in itself a very broad spectrum, the majority of off-page signals in fact relate to link building.

In a bid to stymie the widely used practice of procuring paid links, Google designed this algorithm to determine whether the links pointing to a website are unnatural or irrelevant to its niche and topical interests.

While it isn’t perfect, the Penguin algorithm has been single-handedly responsible for a major portion of the ranking penalties handed out to webmasters.

Ever since the Penguin algorithm’s earlier versions launched, webmasters who procured links through link exchange schemes, blatantly paid articles in the form of guest blogs, or links from PBN sellers have been hit with either a massive loss of SERP rankings or an outright manual penalty.

It is important to note that penalties from previous versions of the Penguin algorithm were handed out at a domain-wide level, while for Penguin 4.0, Google has stated that the update targets the offending pages themselves, with no negative consequences for the other pages within the domain.

What does this mean for webmasters?

How concerned should you be about this algorithm update? Should you care even if your site didn’t see any negative changes over the past week?

Many SEO forums, such as BlackHat World (‘BHW’ for short), have been ablaze with speculation and discussion. The general consensus is that those who saw negative impacts from this update were sites in competitive and commercial niches and keywords.

The chatter centred around this update is not limited to black hat forums; everyone from SEJ to Search Engine Roundtable has been discussing their findings vigorously.

Now more than ever, it is crucial for webmasters to take a closer look at their ranking tactics and existing backlink profiles.

As Google has stated that the algorithm now runs continuously, unlike previous versions, this could mean both good and bad news for any website.

If a website doesn’t have a constant velocity of incoming links, the webmaster could neglect the upkeep and tracking of its link profile. Because the algorithm now re-evaluates sites continuously, it could repeatedly detect that nothing has been done about certain toxic links deemed undesirable. In the long run, multiple crawls detecting no positive changes could affect the website’s ability to retain its rankings, much less rank higher for its desired keywords.

On the flip side, because the algorithm is in a perpetual state of deployment, webmasters can submit a file of toxic links using the disavow tool and see a much quicker response the next time the website is crawled.

Other implications 

Earlier this year, Google released news of its brand new project called ‘RankBrain’, which is simply defined as an artificial intelligence system centred around machine learning.

Its officially announced purpose was to help Google’s search engine further refine its ability to associate articles and contextual content with search queries, giving users a better experience during search sessions.

Sounds great, doesn’t it?

What this really means for webmasters is that they can no longer simply get cheap, low-quality, or spun content and use it for their link building campaigns.

RankBrain’s ability to suss out poorly structured content riddled with grammatical errors means that links built using that type of content will either be outright devalued or lead to a non-manual, algorithmic penalty.

Hence, the difficulty of ranking organically in search result pages has just been raised.

Webmasters need to be very conscientious when procuring links. A badly produced piece of content may not be the end of the world, but if it is used to build links on a spammy-looking website that has absolutely no relevance to the site it links out to, it could bring about devastating results from the combined abilities of the RankBrain and Penguin 4.0 algorithms.

Fans of tiered link building campaigns, often run using automated link building tools, should also be cautious with low-quality content. The human readability of an article isn’t the only thing to worry about; you now also need to make the article both readable and relevant through latent semantic indexing (‘LSI’) keywords so that machine learning systems can attribute the content to the actual topic of your site.

Actionable Steps:

Keeping your site safe from penalties dished out by the Penguin algorithm isn’t simple, even if you don’t participate in link building schemes. Many good-quality, authoritative websites often find random websites linking to their content for no reason at all.

With that in mind, here are some things you can do to keep that pesky penguin in the Antarctic wasteland where it belongs.

It is highly recommended to use multiple backlink analysis tools, such as Ahrefs and SEMrush, to detect your existing backlinks.

Using the data from these tools, assess whether any links are spammy or totally irrelevant to what your website is about.

Make a spreadsheet of these undesirable links that could be harmful to your site, and send requests to the owners or admins of those sites asking for the links to be removed.

Once you have confirmed which sites have acceded to your deletion requests, update your spreadsheet to reflect the URLs of those who haven’t complied or even replied, and submit that list to Google Search Console (formerly Google Webmaster Tools) via the disavow tool.

It is recommended that you attach a message stating that you have repeatedly attempted to contact those webmasters to remove your links, but to no avail.
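The workflow above can be sketched as a small script. This is a minimal illustration, not an official tool: the column names (‘url’, ‘removed’) and the example sites are hypothetical, but the output follows Google’s actual disavow file format, which accepts one full URL or one ‘domain:’ entry per line, with ‘#’ marking comment lines.

```python
from urllib.parse import urlparse

def build_disavow(rows, disavow_whole_domain=True):
    """Build disavow file text from spreadsheet rows (list of dicts).

    Each row has a 'url' (the offending backlink) and a 'removed'
    flag ('yes' once the site owner has complied with your request).
    """
    lines = [
        "# Links below were flagged as toxic.",
        "# Site owners were contacted repeatedly but did not reply or comply.",
    ]
    seen = set()
    for row in rows:
        if row["removed"].strip().lower() == "yes":
            continue  # owner already removed the link, nothing to disavow
        if disavow_whole_domain:
            # 'domain:' entries cover every link from that host
            entry = "domain:" + urlparse(row["url"]).netloc
        else:
            entry = row["url"]
        if entry not in seen:  # avoid duplicate lines in the file
            seen.add(entry)
            lines.append(entry)
    return "\n".join(lines) + "\n"

# Example usage with hypothetical spreadsheet rows:
rows = [
    {"url": "http://spammy-directory.example/page1", "removed": "no"},
    {"url": "http://spammy-directory.example/page2", "removed": "no"},
    {"url": "http://nice-blog.example/post", "removed": "yes"},
]
print(build_disavow(rows))
```

Disavowing at the domain level (the default here) is the common choice for obviously spammy hosts, since it catches links your tools haven’t discovered yet; pass `disavow_whole_domain=False` when only a specific page should be disavowed.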

Be aware, though: while certain links may look absurd or spammy, removing them could cause your ranking positions to fall significantly.

Only use the disavow tool if you are absolutely sure of what you are doing. There are various guides on the internet to help with link assessment and the usage of the disavow tool.

The disavow tool is a very handy function, but could be dangerous if poorly executed.