Guest contributor Ben Sibley looks back at four major Google updates from years past, and explains what they can tell you about the company’s plans for the future.


One thing every SEO professional understands is how to adapt to change. Google updates their algorithm constantly, making adjustments more frequently than ever before. This year has seen more documented updates and feature additions than any other.

Keeping up can be a daunting task, but do you really need to give chase? There's no way to predict precisely the intent or timing of Google's next algorithm update, but you can look to the past for clues.

By studying the history of Google's major algorithm updates, you can better anticipate what the future will bring and prepare in advance. What follows is a brief history of Google's highest-impact updates.

November 2003: Florida

In 2003, SEO teams were not accustomed to game-changing updates or the possibility of rankings shifting dramatically virtually overnight. Florida changed all of that.

One of the main objectives of the update was to reduce the visibility of commercial results in the SERPs and make informational webpages more visible. Google intended to make commercial sites rely more on pay-per-click (PPC) advertising instead of organic SEO. It became harder for commercial results to rank for anything besides queries with clear commercial intent (e.g., “buy women's shoes”). Ecommerce sites were hit especially hard.

While they were forced into relying more on AdWords, smart ecommerce sites also started blogs and began creating informational content to increase their visibility. The Florida update made it even more important for commercial sites to have a blog. With the popularization of inbound marketing in recent years, the benefits of blogging have become increasingly obvious to all business owners.

Google also made a few other changes designed to increase quality for searchers. They lowered the visibility of affiliate portals that served as middlemen and increased their use of Latent Semantic Indexing to better understand the meaning of the text on a webpage.
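To make the Latent Semantic Indexing idea concrete, here is a minimal sketch of LSI using scikit-learn: a term-document matrix is reduced with a truncated SVD so that queries and pages can be compared by topic rather than exact keywords. The documents, query, and component count are invented for illustration; Google's actual implementation is not public.

    # A toy LSI sketch, not Google's implementation. Requires scikit-learn.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "buy cheap running shoes online",
        "how to choose running shoes for marathon training",
        "marathon training plans for beginners",
    ]

    # Build a term-document matrix, then take a low-rank SVD to uncover latent "topics".
    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(docs)
    lsi = TruncatedSVD(n_components=2, random_state=0)
    X_lsi = lsi.fit_transform(X)

    # Queries and documents are compared in the reduced topic space, so a page
    # can match on meaning even without exact keyword overlap.
    query = lsi.transform(tfidf.transform(["best sneakers for a marathon"]))
    print(cosine_similarity(query, X_lsi))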

Google also continued their battle against webspam by targeting and devaluing hidden text, keyword stuffing, and reciprocal linking. In other words, they eliminated more tricks and forced webmasters to provide more value to earn their rankings.

January 2004: Austin

Before the dust had settled from Florida, Google surprised webmasters with the Austin update. With Austin, Google implemented ideas outlined in the Hilltop white paper.

The update increased the importance of links from topically relevant webpages and of the relationships between tiered links. For example, if webpage A links to webpage B and webpage B links to webpage C, then A and C are assumed to share some thematic relationship.

This added a more topic-sensitive aspect to the imperfect PageRank system. Low-PageRank “deep pages” began to outrank high-PageRank pages in the SERPs. In short, relevance became more important and general authority less significant.
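A toy sketch of that tiered-link assumption is below: pages reachable from one another through a short chain of links are treated as thematically related. The link graph, topics, and two-hop limit are all invented for illustration; this is not Hilltop's actual algorithm.

    # Toy model: pages connected by a short link chain are assumed to share a topic.
    links = {
        "A": ["B"],   # A links to B
        "B": ["C"],   # B links to C
        "C": [],
    }
    topics = {"A": "running shoes", "B": "running shoes", "C": "running shoes"}

    def related_through_links(src, dst, hops=2):
        """Return True if dst is reachable from src within `hops` links."""
        frontier = {src}
        for _ in range(hops):
            frontier = {nxt for page in frontier for nxt in links.get(page, [])}
            if dst in frontier:
                return True
        return False

    # Under the tiered-link assumption, A and C are treated as related because
    # A -> B -> C, so a link from A carries topical weight for C.
    print(related_through_links("A", "C"))   # True
    print(topics["A"] == topics["C"])        # True in this toy example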

Yet again, Google made easy link spam tactics like paid links less effective.

October 2005: Jagger

Despite these past efforts, loopholes were still being exploited and spam techniques continued to work. With Jagger, Google began using historical data for links and webpages, and implemented TrustRank.

Getting links indexed no longer meant they passed value to your site. Jagger added a filter that delays the effect of new links for a period of time; essentially, a backlink had to reach a certain age before it counted. This made it harder to qualify as a site that earns visibility in the SERPs.
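A minimal sketch of such an aging filter is below: backlinks younger than a minimum age simply don't count yet. The 90-day threshold and the example links are assumptions for illustration; Google has never published the real figures.

    # Hedged sketch of a link-aging filter; the threshold is an assumed value.
    from datetime import date, timedelta

    MIN_LINK_AGE = timedelta(days=90)   # assumed, not a known Google value

    def counted_links(backlinks, today):
        """Return only the backlinks old enough to pass value."""
        return [link for link, first_seen in backlinks
                if today - first_seen >= MIN_LINK_AGE]

    backlinks = [
        ("blog-post-review", date(2005, 1, 10)),       # old link: counted
        ("new-directory-listing", date(2005, 9, 30)),  # fresh link: ignored for now
    ]
    print(counted_links(backlinks, today=date(2005, 10, 15)))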

Following the approach laid out in the TrustRank paper, Google began using human-reviewed sites as “seed sites” to determine how much a website could be trusted. As the paper noted, “Good pages rarely link to bad ones.” A trusted site like Wikipedia, for instance, is very unlikely to link to a spammy, low-quality website.

This forced webmasters to acquire better links and gave those doing honest work a way to rank above sites using spam links.
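The sketch below shows the general idea of TrustRank-style propagation: trust starts at human-reviewed seed sites and flows along outlinks with a damping factor, so pages far from any trusted seed accumulate little trust. The link graph, seed set, damping value, and iteration count are all illustrative assumptions, not Google's real parameters.

    # Simplified TrustRank-style propagation over a toy link graph.
    links = {
        "wikipedia.org": ["university.edu", "news-site.com"],
        "university.edu": ["news-site.com"],
        "news-site.com": [],
        "spam-farm.biz": ["spam-farm.biz"],   # no trusted page links here
    }
    seeds = {"wikipedia.org"}                  # human-reviewed seed sites
    DAMPING, ITERATIONS = 0.85, 10             # assumed values for illustration

    trust = {page: (1.0 if page in seeds else 0.0) for page in links}
    for _ in range(ITERATIONS):
        incoming = {page: 0.0 for page in links}
        for page, outlinks in links.items():
            for target in outlinks:
                incoming[target] += DAMPING * trust[page] / len(outlinks)
        # Seeds keep full trust; everything else only gets what flows in.
        trust = {p: (1.0 if p in seeds else incoming[p]) for p in links}

    print(sorted(trust.items(), key=lambda kv: -kv[1]))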

February 2011: Panda

The original Panda update rolled out over a year ago, but the name is still applied to the subsequent refreshes taking place now. Just weeks ago Google released Panda 3.9, and more will follow.

While previous updates focused primarily on combating link schemes, Panda sought to improve the quality of results based on user experience. Google used signals such as visitors' time on site and click-through rates in the SERPs to remove low-quality results.
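Here is a toy sketch of that idea: pages are re-ranked by simple engagement signals, so thin pages with poor engagement drop. The pages, metrics, weights, and thresholds are invented; the real Panda signals and their weighting are not public.

    # Toy re-ranking by engagement signals; weights are illustrative assumptions.
    pages = [
        {"url": "in-depth-guide", "avg_time_on_site": 240, "serp_ctr": 0.12},
        {"url": "thin-ad-page", "avg_time_on_site": 15, "serp_ctr": 0.02},
    ]

    def quality_score(page):
        # Assumed weighting purely for illustration.
        return page["avg_time_on_site"] / 300 + page["serp_ctr"] * 5

    ranked = sorted(pages, key=quality_score, reverse=True)
    print([p["url"] for p in ranked])   # the in-depth guide outranks the thin page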

Websites such as AssociatedContent.com and Hubpages.com lost over 80 percent of their rankings after Panda. Most of the update's biggest losers had a few things in common: they carried large volumes of ad space relative to content, and that content was primarily non-expert and user-generated.

By using this user-experience data, Google was able to reduce the visibility of sites like these, which exist only to capture search traffic and generate ad revenue. Once more, Google pushed higher-quality sites to the top of the SERPs and forced webmasters to provide more value to searchers.

Takeaways

It's clearer than ever that Google can and will find out what searchers like and deliver it to them. The way to perform best in Google is to figure out how you can perform best for your visitors. And if you don't know how to do that, Google will find someone else who does.