How The Newest Penguin Update Affects Your Link Building Strategies


In September, Google finally rolled out the latest Penguin update, dubbed Penguin 4.0.

With it came major changes to the algorithm and how Google ranks websites, along with a continued push to penalize websites that use “blackhat” methods to build backlinks and manipulate the search results.

To understand Penguin 4.0, take a trip down memory lane and look at the previous Penguin updates.  It will help you understand what you need to be doing (and avoiding) going forward, if you want to keep sucking in that coveted search engine traffic like a vacuum cleaner.

April 24, 2012: Penguin 1.0

In the first Penguin update, Google came forward and stated that they were actively targeting “webspam” — or webmasters that were actively attempting to manipulate the search results.

They defined “webspam” as:

  • Keyword stuffing
  • Linking schemes
  • Cloaking, redirects, and doorway pages
  • Purposeful duplicate content

While these techniques are old, around since before Google even existed, Penguin marked the point where Google started implementing changes to its algorithm that would allow it to automatically detect these patterns and penalize the offenders.
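
To get a feel for the kind of pattern detection involved, here is a minimal keyword-density check in Python.  It’s purely a sketch: the keyword_density function, the 5% cutoff, and the sample text are all made up for illustration, and Google’s real webspam signals are far more sophisticated than a word count.

    import re
    from collections import Counter

    def keyword_density(text, keyword, threshold=0.05):
        # Toy illustration of a "keyword stuffing" check; the 5% cutoff
        # is an arbitrary placeholder, not a figure Google has published.
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0, False
        density = Counter(words)[keyword.lower()] / len(words)
        return density, density > threshold

    density, flagged = keyword_density(
        "Cheap shoes. Buy cheap shoes. Our cheap shoes are the cheapest shoes.",
        "shoes",
    )
    print(f"Density: {density:.1%}, flagged: {flagged}")  # Density: 33.3%, flagged: True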

Even though they actively tried to do away with these types of techniques, there were still multiple examples where their algorithms got close, but didn’t quite hit the mark.  Spammers even went on a mission to prove that Google didn’t have their act together, and the results could still be manipulated.

Google stated that roughly 3% of their index would be affected by the changes, but trackers showed that the changes had a much higher reach.

This would just be the beginning of Google’s attack on over-optimization and shady SEO techniques that had been working for years.

May 26, 2012: Penguin 1.1

Google didn’t necessarily come out and give this algorithm update a label, but Matt Cutts did announce on Twitter that they were refreshing their index to reflect new findings, and help push more webspam out of the search results.

After the second update, more people in the SEO industry were starting to debate whether the algorithm update helped, or hurt the search results.

This was due in large part to the number of “whitehat” sites that had been caught up in the algorithm.  During the weeks after the initial release of the Penguin algorithm, reports of legitimate sites experiencing traffic drops skyrocketed.

While nothing new was added to the algorithm during the Penguin 1.1 refresh, it was implemented to reduce the impact on legitimate sites, while increasing the pressure on websites that were using shady practices to rank higher in the search results.

October 5, 2012: Penguin 1.2

On the afternoon of October 5, 2012, Matt Cutts took to Twitter again to issue a statement on the latest “data refresh” — aka: Penguin 1.2.

The first in a series of tweets from Cutts alerted the SEO industry that a new update was on its way and to expect flux in the search results.  He also stated that the newest refresh would affect less than 0.3% of all search queries, making it the most minor of all the Penguin refreshes.

The Penguin 1.2 rollout was primarily aimed at foreign-language search results, where spammers were still using shady ranking practices, but it had a much larger effect once trackers were able to collect enough data to put together accurate trends.

While the initial Penguin updates were targeting manipulative practices that affected searchers, the latest refresh spent more time focusing on the links that were coming into websites.

If the algorithms determined that a website’s link profile consisted primarily of “low quality” backlinks, or that its anchor text distribution pushed the limits of what was deemed normal, the website was severely penalized.

The penalties imposed were on a sitewide basis, so websites that experienced drops in traffic were left wondering which pages may have caused the issues and what needed to be fixed.  A lot of people abandoned their websites after this refresh.
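
If you want to audit your own site, one rough way to spot the problem is to pull a backlink export from whatever tool you use and look at how concentrated your anchor text is.  The sketch below assumes a hypothetical backlinks.csv with an anchor_text column, and the 30% cutoff is an arbitrary illustration; Google has never published its actual thresholds.

    import csv
    from collections import Counter

    EXACT_MATCH_LIMIT = 0.30  # arbitrary illustrative cutoff, not a published figure

    def anchor_distribution(path):
        # Reads a hypothetical backlink export with an "anchor_text" column;
        # rename the field to match whatever tool produced your CSV.
        with open(path, newline="", encoding="utf-8") as f:
            anchors = [row["anchor_text"].strip().lower() for row in csv.DictReader(f)]
        counts = Counter(anchors)
        total = sum(counts.values())
        return {anchor: n / total for anchor, n in counts.most_common()} if total else {}

    for anchor, share in anchor_distribution("backlinks.csv").items():
        flag = "  <-- heavily concentrated" if share > EXACT_MATCH_LIMIT else ""
        print(f"{share:6.1%}  {anchor or '(empty/image link)'}{flag}")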

The latest update also caused problems for whitehat websites with pages that had gone viral, dramatically increasing the number of links to those pages while lowering the overall ratio of links to other pages across the website.  This imbalance was criticized by many in the SEO industry, but Google made no statement about why so many whitehat sites were being affected.

May 22, 2013: Penguin 2.0

Realizing they made major mistakes and caught a lot of legitimate websites in their crosshairs with the original Penguin rollouts, Google tweaked their algorithms to better identify what they believed were webspam practices.

Rather than simply refreshing the index using the older Penguin dataset, Google implemented new technology under the hood of the latest Penguin update.  The changes on the latest update were so big that Google’s internal teams began dubbing it “Penguin 2.0”.

Matt Cutts announced on Twitter that the update would affect less than 3% of English search queries, but, again, it affected substantially more.

Cutts came forward and stated that users should focus on building websites that are helpful to visitors, attract natural links, and continue creating high quality content.  As long as you did that, you would always be in Google’s good graces.

Unfortunately, that wasn’t necessarily the case, because Google caught even more legitimate websites in the update and demoted their rankings.

This update was more comprehensive than the original Penguin updates, and focused more on other practices like advertorials and paid linking strategies.  It’s speculated that this is why so many websites got caught up in the algorithm changes, because it was impossible for Google to actually determine the difference between a paid link and an editorial link.

October 4, 2013: Penguin 2.1

This update was a refresh similar to the earlier ones in that it mostly tightened the limits of what was considered “unnatural”, but it did spend more time focusing on websites that had been hacked, websites targeting some of the spammier search queries, and websites that were using blatant blackhat backlinking strategies.

It was argued that Google spent a lot of time on blackhat forums and message boards during this update, because websites that were using the latest linking strategies were directly impacted when the refresh happened.

While Google stated that it would primarily affect spammy search queries, like payday loans and pornography, SEOs across the board saw massive flux in their search rankings and traffic.

Matt Cutts did publicly state that the newest refresh added changes that would allow them to more easily detect when a website had been hacked, and provide the owners of those websites with the necessary resources to clean up the hacks and prevent having their rankings demoted.

He claimed that less than 1% of the overall search queries would be affected, but, again, this refresh had a much broader reach than he anticipated, or at least let on to the public.

This update also attempted to address the massive negative SEO problem that the original Penguin updates had allowed to happen.  SEOs were beginning to realize the same techniques that got their websites penalized could be used against their competitors, and that Google didn’t have their act together enough to spot the difference.

Because the Penguin 2.1 algorithm automatically demoted websites that were considered to be using blackhat linking techniques, negative SEO ran rampant during this time period.

Google recommended using the disavow tool to flag the links pointing at your site that you consider spammy, while using the data you submit to update their filters and more easily differentiate true webspam from negative SEO.
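
If you do go the disavow route, the file you upload is plain text: one full URL or one domain: entry per line, with lines starting with # treated as comments.  Here is a minimal sketch that generates one from a hand-reviewed list (the domains and URL below are placeholders):

    # A disavow file is plain text: one full URL or one "domain:" entry per
    # line, with "#" marking comments.  The entries below are placeholders.
    spammy_domains = ["spammy-directory.example", "paid-links.example"]
    spammy_urls = ["http://blog.example/spun-comment-page.html"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Links identified as spam during the link audit\n")
        for domain in spammy_domains:
            f.write(f"domain:{domain}\n")
        for url in spammy_urls:
            f.write(f"{url}\n")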

October 17, 2014: Penguin 3.0

Google spent more than a year allowing the previous Penguin update to work its magic, while steadily collecting data and using it for the update they released on October 17, 2014, dubbed Penguin 3.0.

The sixth release of the Penguin algorithm wasn’t given a name by Google themselves, but was deemed to be 3.0 because of the length of time since the previous update.  The amount of time in between refreshes caused Google to take a lot of criticism from the SEO industry, mainly because of how the algorithm actually works.

During the previous updates, if your website was caught in the crosshairs (whether legitimately or not), you were forced to wait until the next update to figure out if the changes that you implemented were sufficient to get you back into Google’s good graces.

That meant you had to wait an entire year to learn if your website was going to be allowed to rank again, while web spammers were building new sites and taking over the top search result spots again.

As of October 20th, the update had been completely rolled out.  Once the rollout was complete, trackers began noticing that not many new changes had been implemented into the core algorithm; this was essentially another refresh of the index, combined with a tightening of the factors behind what Google considered a “natural” link profile and anchor text distribution.

What changed with Penguin 4.0?

During previous Penguin updates, you were forced to wait until the next update or refresh to find out if your site was still in Google’s good graces, or if you were going to be caught up in their strict webspam guidelines.

With Penguin 4.0, Google announced that the updates would become a part of the core algorithm and happen more frequently.  This gives webmasters the chance to clean up their act and find out much sooner if the changes they’ve made are sufficient to reclaim their original rankings.

Penguin 4.0 also works on a more granular basis, using a mixture of signals to determine what is considered spammy and what isn’t.  Rather than affecting the rankings of the entire website, Penguin 4.0 works on a page by page basis and will devalue single pages that are caught spamming, instead of tanking the entire website.

Google states that this update makes it much harder for spammers to determine what is an “acceptable” level of spam, and what they can get away with, but the SEO industry disagrees.

This algorithm update opens the door for spammers to push the limits until they’re caught up in the penalties, then disavow and back off slightly to find the new boundaries.  The instant updating nature of this algorithm change makes that possible.

What do you do moving forward?

You need to focus on all of the factors that Google wants to see if you want to avoid finding yourself caught up in the latest Penguin update.  While future updates may tighten what they consider spam, the current algorithm you’re dealing with has well-known factors baked in.

Shady linking practices, overuse of keywords in links, links from low quality websites, and other factors all determine your overall link profile.

However, the newest algorithm update leans much more heavily on your website’s other signals to determine whether your link profile deserves to be labeled as pure spam.

If you have high quality content that solves visitors’ problems, bounce rates below the industry standard (suggesting your website is helpful), and high quality links sprinkled in with your spam, you can push the boundaries even further than you could during previous updates.

With the newest Penguin algorithm being constantly refreshed, if you do happen to get hit with a penalty, ranking demotion, or a “pure spam” warning in Webmaster Tools, dial back your aggressiveness until you’re released from the penalty, and then try again.

In my opinion, this algorithm update opens Google to even more spam linking strategies than previous updates have fixed.
