How Algorithms Have Affected Buying Backlinks Through The Years

Google would love to do away with link spam and the buying and building of backlinks, but they just can’t do it.  That’s why they’ve updated their algorithms throughout the years, in an effort to identify people who are manipulating their search results.

One thing that has remained common throughout all of the algorithm updates is the fact that backlinks are one of the biggest ranking factors that Google uses to determine who shows up at the top of their search results.

Look through how the algorithms have evolved over the years, and you’ll get a clear picture of what Google has attempted to do to decrease the volume of link spam in their index.

Unfortunately for Google, backlinks are baked into their algorithms, and they will probably never be able to get away from that fact.  How you build (and buy) backlinks does need to evolve to keep up with the times, though, because Google is getting smarter about how they identify spam and the signals they use to rank websites.

2000: Google Toolbar – The Dance Is Born

In December 2000, Google released the Google Toolbar.  This gave webmasters a visible target to aim for, and helped them gauge the strength of the pages in the top 10 search results.

Making Toolbar PageRank public was the official start of the “Google Dance” because it opened the doors for smart SEOs to learn how to manipulate their search rankings.

While Toolbar PageRank was eventually phased out and made private, the publicly available data sparked some of the most controversial debates in the SEO industry and caused massive fluctuations in the search results.
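For context, the formula behind those little green bars was public from the start.  Brin and Page’s original paper defined PageRank as PR(A) = (1 − d) + d × Σ PR(T)/C(T), summed over every page T linking to A, where C(T) is T’s outbound link count and d is a damping factor.  Here is a toy sketch of that calculation; the four-page link graph is invented for illustration:

```python
# A toy power-iteration PageRank using the public formula from Brin and
# Page's original paper: PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over all
# pages T linking to A. Toolbar PageRank displayed a rough 0-10 version
# of scores like these. The four-page link graph below is invented.
links = {          # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
d = 0.85                                # damping factor from the paper
pr = {page: 1.0 for page in links}      # initial guess

for _ in range(50):                     # iterate until the scores settle
    pr = {
        page: (1 - d) + d * sum(pr[src] / len(outs)
                                for src, outs in links.items()
                                if page in outs)
        for page in links
    }

print({page: round(score, 3) for page, score in pr.items()})
```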

2003: Cassandra Update – Goodbye Invisible Links

One of the most well-known methods for generating backlinks that weren’t visible to users was the invisible link.  This allowed massive link stuffing, without the user noticing that the page they were visiting was actually full of spam.

This practice will earn you an automatic penalty these days, but back then it was achieved by simply changing the font color of the links pointing to your website to match the background color they were displayed against.
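To illustrate, here is a rough sketch of how a crawler might flag that trick: a link whose inline font color matches the page background.  The sample markup and the naive color comparison are illustrative assumptions, not Google’s actual detection logic.

```python
# A minimal sketch of flagging the "invisible link" trick described above:
# links whose inline font color matches the page background. The HTML
# sample and color-matching rule are simplified assumptions, not Google's
# actual detection logic.
from bs4 import BeautifulSoup

PAGE_BACKGROUND = "#ffffff"  # assumed page background color

html = """
<body style="background-color: #ffffff">
  <a href="https://example.com/spam" style="color: #ffffff">buy widgets</a>
  <a href="https://example.com/ok">a normal, visible link</a>
</body>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a"):
    style = link.get("style", "")
    # Naive check: an inline color identical to the background hides the link.
    if f"color: {PAGE_BACKGROUND}" in style.replace(";", ""):
        print("hidden link:", link["href"])
```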

This was also the first in a series of major updates that actively targeted how backlinks were being built (and bought).

2003: Florida Update – Making “SEO” Well Known

2003 saw quite a few algorithm updates, but most of them targeted how Google ranked websites rather than the way links were being built.  In November 2003, though, the “Florida” update brought SEO front and center, turning it into a well-known industry.

The “Florida” update targeted links again, as well as deceptive on-page practices like keyword stuffing and meta tag stuffing.  Unfortunately for SEOs, keyword stuffing was most commonly used for link building: a large number of keywords were placed on the page and then linked out to other pages around the web.

After the Florida update, this practice died a quick death, and SEOs were left to pick up the pieces and figure out which tactics they would use next.

2004: Brandy Update – Hello, Bad Neighborhoods

In 2004, Google updated their algorithm again, this time focusing on the size of their index, massively increasing both its size and the number of pages available to serve to searchers.

It also introduced the concept of “link neighborhoods,” ranking them based on the quality of the sites linking out and the quality of the websites receiving links.  If a “neighborhood” had bad ties in it, the links were traced to find other bad neighbors.

Google was able to group together websites that were all part of linking schemes, devaluing their rankings and the rankings of anyone they happened to link out to.
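Conceptually, that grouping is just a walk over the link graph.  The sketch below is an illustration, not Google’s actual algorithm: it taints every site reachable from a known link-scheme seed, so everyone the scheme links out to shares the devaluation.  The graph data is invented.

```python
# A conceptual sketch (not Google's algorithm) of tracing a "bad
# neighborhood": starting from known link-scheme sites, walk the link
# graph and taint everything reachable. The graph below is invented.
from collections import deque

link_graph = {
    "farm-a.example": ["farm-b.example", "client.example"],
    "farm-b.example": ["client.example"],
    "client.example": [],
    "clean.example":  ["client.example"],
}
known_bad = {"farm-a.example"}

tainted, queue = set(known_bad), deque(known_bad)
while queue:
    site = queue.popleft()
    for neighbor in link_graph.get(site, []):
        if neighbor not in tainted:
            tainted.add(neighbor)
            queue.append(neighbor)

print(tainted)  # every site the scheme links out to gets devalued
```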

2005: Nofollow Update – Vouching Required

With the sheer amount of link spam and the number of people buying links from active websites, Google needed a way to tell which links were bought and paid for, and which links weren’t actually recommendations of the sites they pointed to.

The “nofollow” attribute was born.  This allowed webmasters to link out to other websites without actually vouching for them.  It was primarily used by webmasters to stay safe while selling “sponsored posts”.
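Mechanically, the attribute is just rel="nofollow" on an anchor tag.  Here is a minimal sketch of how a crawler, or an SEO auditing a prospective link seller, might separate vouched-for links from nofollow ones; the example URLs are made up:

```python
# A minimal sketch of separating followed from nofollow links. The
# rel="nofollow" attribute is the one Google introduced so that a link
# no longer counts as a vouch. The sample markup is invented.
from bs4 import BeautifulSoup

html = """
<a href="https://partner.example">editorial link</a>
<a href="https://sponsor.example" rel="nofollow">sponsored link</a>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a"):
    rel = link.get("rel") or []          # bs4 returns rel as a list
    status = "nofollow" if "nofollow" in rel else "followed"
    print(status, link["href"])
```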

Blog comments were among the hardest-hit link types once the nofollow attribute rolled out.  The change also affected websites that were caught selling links without using it.

2005: Allegra Update – LSI Tweaked

This is one of the first updates that Google didn’t announce, but SEOs and webmasters saw huge ranking fluctuations.  After extensive research, smart SEOs were able to figure out (or at least speculate) that the update affected the “sandbox” and that LSI-based rankings had been tweaked.

The main change in this update was how Google handled what they considered suspicious links.  They began devaluing more and more links that fell outside the realm of what “normal” webmasters used.

They also began actively penalizing link profiles that didn’t look “normal.”  This helped them identify new “bad neighborhoods” and made it slightly harder to safely buy (and build) backlinks.

2005: Jagger – Low Quality Links Defined

2005 saw a series of algorithm updates, all targeting low quality links.  There were quite a few changes to how SEOs went about their daily business, and a huge number of ranking fluctuations to show for it.

The main types of links targeted were reciprocal links, link farms, and paid links.  It took Google three different stages to get it right, but the rollout sent the SEO industry into upheaval.

This is one of the few updates that really targeted paid linking strategies.  It combined bad-neighborhood data with the nofollow attribute to determine who was buying links, creating patterns the algorithms could use to identify paid links more easily.

It’s also one of the first times Google was found to be using manual reviewers to determine who was buying links.  That didn’t end the practice, though; it just made link buyers smarter about who they purchased from.  The days of blatant paid link farms were dead.

2009: Vince – Big Brands Move Up

From 2005 to 2009, the algorithms spent more time targeting low quality content and websites as a whole than they did backlinks.  That changed in 2009, when the Vince update brought big brands to the forefront of the search results.

In an effort to stop low quality spammers from throwing up microsites and plaguing the search results, Google started placing a higher emphasis on big brands and the signals that their websites draw in.

Since it was more logical to trust big brands than brand new websites with shady content and linking practices, Vince showed that big brand signals were here to stay, and SEOs needed to step up their game.

While this algorithm update did target lower quality spam, it also showed SEOs what Google wanted to see, and made buying links easier than ever.  If a site ranked higher after the update, investing in links from it became an even safer bet.

2010: Social Signals – Implementation Begins

After the big brand update, Google needed a new way to determine which smaller sites could be trusted to rank high in the search results.  They had been unable to do away with link spamming, even after the Jagger update.

To help them figure out what makes a trustworthy website, they began using social signals, and (arguably) bounce rates and other factors to filter websites in their index.

Twitter and Facebook were both used to help figure out which websites search visitors considered useful.  Most SEOs saw the writing on the wall and knew this update was coming.  The smart ones began improving the content on their websites to ensure that visitors stuck around longer and liked, shared, and tweeted links to their sites.

2011: Overstock.com Penalty – SEO “Outing” Begins

This wasn’t necessarily an algorithm update, but more a case of Google paying attention to what was happening in the SEO industry and taking note of websites that were blatantly violating their terms of use.

This is also the era when SEOs started outing the competitors they were up against in the search results.  If you were known to be a blatant link spammer, or a link buyer, your days were numbered.

While the practice has been frowned upon since, and most SEOs will no longer actively out each other, it did take down a few big names in the industry.  Namely, Overstock.com and J.C. Penney were both targets of Google’s manual penalties.

2011: Panda / Farmer – Content Targeted

2011 was the year of the Panda refreshes.  The update, initially dubbed “Farmer,” was put in place to target low quality content, thin content, and content farms.  It also hit ad-heavy sites pretty hard.

While this wasn’t necessarily a link-based algorithm update, it did affect the way people were building backlinks.  At the time, a lot of SEOs were stuffing their links into short, spun content, and Panda actively targeted pages containing that type of content.

This meant that a huge number of pages were purged from the index, which led to a dramatic decrease in rankings across the entire industry.

2012: Penguin – Over-optimization Targeted

In April 2012, Google continued their zoo-themed algorithm updates by releasing Penguin onto the industry.  This was the first update to work webspam factors into active index updates, making it harder for link spammers to get away with their tactics for long periods of time.

Google stated that by working Penguin into their core algorithm, refreshes would happen more often and link spam would be devalued on a regular basis.  It primarily affected websites with over-optimized link profiles.  Keyword stuffing, again, was targeted and devalued.

Over the course of 2012, the Penguin algorithm rolled out three different times.  Each time, controls were tightened and more SEOs saw their rankings decrease.  It had a major impact on people buying links through low quality methods that had no editorial process.
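Google never published Penguin’s exact signals, but the classic symptom of an over-optimized profile was anchor text dominated by a single exact-match keyword.  Here is a rough sketch of the kind of self-audit SEOs started running; the anchors and the 40% threshold are invented assumptions, not a known Google rule:

```python
# A minimal sketch of the anchor-text check Penguin made necessary: if
# one exact-match keyword dominates a link profile, it looks
# over-optimized. The anchors and 40% threshold are assumptions.
from collections import Counter

anchors = ["buy cheap widgets"] * 6 + ["Example Co", "example.com",
           "click here", "https://example.com"]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    share = n / total
    warn = "  <- over-optimized?" if share > 0.40 else ""
    print(f"{anchor!r}: {share:.0%}{warn}")
```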

2012: 39-Pack – Link Scheme Detection

The “39-Pack,” named for its 39 different changes to the algorithm, primarily focused on detecting linking schemes and automatically devaluing the websites using them.  It also focused on devaluing links from hacked websites and improving the Penguin signals.

Because many link buyers were purchasing links placed on hacked websites, a large number of SEOs saw their rankings tank.  Google began actively tracking hacked websites by identifying the malware used to compromise them.

While a lot of SEOs left the industry around this time, the smart ones figured out which websites were costing them rankings.  They evolved their purchasing methods and started vetting the sites they bought links from, making sure those sites weren’t hacked or infected with malware.

2012: Link Warnings – Unnatural Links Detected

In March and April of 2012, Google flip-flopped, sending out a massive number of “unnatural link” warnings and then turning around to say that the warnings might not actually indicate a serious problem.

One thing the warnings did do was let webmasters know they were pushing the boundaries, prompting them to learn new ways of building links that didn’t trigger the automated warnings.

By figuring out which types of links (and the velocity of those links) would trigger an unnatural link warning, smart SEOs were able to dial back their link building efforts and focus on what Google considered to be higher quality links.

It also caused SEOs to dial back the number of links they were buying.  Instead of buying links from any and every website they could, the unnatural link warnings began the era of spending more money on fewer, higher quality links.
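Here is a crude sketch of the kind of velocity check that became standard practice: bucket newly acquired backlinks by month and flag any sudden spike against the baseline.  The dates and the 3× threshold are made-up illustrations, not a known Google rule:

```python
# A rough sketch of the velocity check SEOs ran on their own profiles:
# bucket new backlinks by month and flag sudden spikes. The dates and
# the 3x threshold are made-up illustrations.
from collections import Counter
from datetime import date

# Acquisition dates for newly discovered backlinks (invented sample data).
new_links = [date(2012, 1, 5), date(2012, 1, 20), date(2012, 2, 2),
             date(2012, 3, 1)] + [date(2012, 4, day) for day in range(1, 29)]

per_month = Counter((d.year, d.month) for d in new_links)
baseline = sum(per_month.values()) / len(per_month)   # average links/month

for month, count in sorted(per_month.items()):
    flag = "  <- unnatural spike?" if count > 3 * baseline else ""
    print(month, count, flag)
```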

2014: Penguin 3.0 – Link Spam Refreshed

It took Google more than a year to continue work on the Penguin algorithm.  In October 2014, they refreshed the index, placing a higher emphasis on the Penguin factors.

They claimed that the update affected less than 1% of search queries around the world, but SEOs saw a far more dramatic decrease in rankings.

They also claimed the rollout was a data-only refresh with no changes to the Penguin algorithm itself, but trackers in the industry saw signals that pointed to Penguin being a huge factor in the changes.

The exact timing of the rollout was unclear, but it did prove that Google had tightened their restrictions on the quality and velocity of link building, removing a huge number of pages and sites from the index that were still trying to blatantly game the system with large volumes of low quality links.

2014: Penguin Everflux – Constant Updates

Before December 2014, Penguin refreshes were rolled out manually, happening whenever Google decided to refresh their index.  The “Everflux” update rolled Penguin into the core algorithm, making refreshes happen automatically.

SEOs monitoring rank trackers noticed that ranking fluctuations happened more often, with no rhyme or reason as to what caused them.  This is when it became clear that link-based penalties were going to happen more regularly.

It also gave SEOs a way to sidestep the penalties by dialing back their link building efforts, changing how they built links, and changing the types of sites they bought links from.  Constant refreshes allowed SEOs to monitor the changes almost in real time.

While Google intended for the Penguin Everflux update to limit the amount of webspam, smart SEOs used it to push the limits and figure out where the algorithm drew the line between “low quality” and acceptable link building and buying.

2015/2016: Quality Update – More Signals Implemented

This is another update that Google kept under wraps, but trackers showed it actually happened.  Based on research done by Moz and other SEO trackers, the update placed a higher emphasis on “quality signals”.

The update became known as “Phantom 2” because it rolled out in silence.  As time went on, SEOs realized it was actually part of the core algorithm, helping Google weed out sites that didn’t have the quality signals to back up their link profiles.

This made it much harder for low quality sites to buy their way to the top, forcing them to increase the value and quality of the content they used to rank.  It did help content-focused SEOs when it came to buying backlinks, though, because a mixture of high quality content and “legitimate” backlinks almost guaranteed high rankings.

As the algorithms have progressed, Google has devoted billions of dollars to figuring out how spammers operate and to identifying the signals that trustworthy websites share.  One thing that hasn’t changed is the fact that they still don’t have their act together, and smart SEOs around the world are spending more money buying links than building them.

Buying backlinks is a much more cost-effective use of your time, especially when you have the content on your website to back it up.  As you earn high rankings with paid links, you’ll need the social and quality signals in place if you want to maintain those rankings.

That’s why you can’t hire low-rent freelance writers to churn out dozens of articles every day, and you can’t purchase links from anyone and everyone who claims to have the best network of sites.  You have to vet the people you’re purchasing links from, and spend more money on both your links and your content, if you want to make it in today’s SEO landscape.
