A ranking algorithm in the context of search engines is an automated method that search engines use to rank websites and determine where they should appear within the search results.
Above is the definition of a ranking algorithm from Search Engine Land, a leading source for news and information about SEO.
This article will describe major updates to Google’s ranking algorithm and explain their overall effect on how website rankings work today.
The first recorded update to Google’s ranking algorithm was called ‘Caffeine.’ Caffeine was released on June 11, 2009, and its main purpose was to speed up indexing and querying by replacing periodic batch recrawls with continuous, incremental indexing, which kept search results fresher. This update was not expected but was still hugely beneficial to Google’s growth in daily internet searches. At the time, Google was estimated to handle about 40,000 searches per second, which is why speed in the rankings is so important.
One major update that has had a huge effect on Google results is called ‘Panda.’ This update was released on February 24, 2011. It aimed to prevent sites with poor-quality content from ranking high in the search results. The goal of this update was to return more accurate results and to keep spammy websites, such as link farms or article directories, from showing up first.
Sites with relevant information and original content that were not built around ad revenue saw their ranks increase, while sites that produced low-quality content saw a decline in rank. Because of this algorithm, many companies, particularly those focused largely on generating revenue through ad clicks, that were used to tricking Google into thinking they were creating quality content saw a decline in their ranks.
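The demotion idea behind Panda can be pictured with a toy re-ranking: pages flagged as low quality get pushed below higher-quality pages. The URLs, flags, and ordering rule below are invented for illustration; Google’s actual quality signals are far more complex and undisclosed.

```python
# Toy illustration of quality-based demotion in the spirit of Panda.
# Each entry is (url, is_quality_content); the flags are hypothetical.
pages = [
    ("link-farm.biz/article", False),
    ("original-research.org/post", True),
    ("article-directory.com/item", False),
]

# Sort quality pages first; Python's stable sort preserves each
# group's original order, so low-quality pages only move down.
ranked = sorted(pages, key=lambda p: not p[1])

print([url for url, _ in ranked])
# ['original-research.org/post', 'link-farm.biz/article', 'article-directory.com/item']
```

A real system would score quality on a continuous scale rather than a boolean flag, but the effect on ordering is the same in spirit.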
This next update was called ‘Penguin.’ Penguin was rolled out on April 24, 2012. It targeted spammy backlinks, which are webpages linking to your site that are not relevant or helpful. Not all links are considered equal; some carry more weight than others when it comes to the strength of your website’s rank. For example, a link from Wikipedia is generally considered stronger than a link from an obscure site, because Wikipedia contains highly reputable information. This algorithm aims to find sites with low-quality backlinks and devalue their content so that other websites deemed more valuable can rise higher in the ranks.
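The idea that links carry different weights can be sketched as a toy score in which each linking domain contributes according to a reputation weight. The domains and weights below are made up for illustration and bear no relation to Google’s actual link-evaluation formula.

```python
# Hypothetical reputation weights per linking domain (illustrative only).
REPUTATION = {
    "wikipedia.org": 0.9,     # highly reputable source, strong link
    "example-blog.net": 0.3,  # ordinary site, modest link
    "link-farm.biz": 0.0,     # spammy source contributes nothing
}

def link_score(backlinks):
    """Sum the reputation weights of the domains linking to a page.
    Unknown domains get a small default weight of 0.1."""
    return sum(REPUTATION.get(domain, 0.1) for domain in backlinks)

score = link_score(["wikipedia.org", "example-blog.net", "link-farm.biz"])
print(score)  # 1.2
```

Under a scheme like this, a pile of link-farm backlinks adds nothing, which captures Penguin’s intent: volume of links alone should not raise a site’s rank.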
The final update discussed in this article is called ‘Hummingbird.’ This update was released on August 20, 2013. It focuses more on semantic search, which returns results based on the meaning of your words or phrases instead of just keywords. This update gives Google a deeper understanding of each query and allows results to be returned faster, since large quantities of information can be processed at once. It is also used to recognize long-tail keywords – keywords that have multiple terms within them, such as “best places to live in Chicago.”
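A crude way to see what makes a query “long-tail” is simply its term count. The three-word threshold below is a common rule of thumb, not any official Google cutoff, and real semantic search goes far beyond counting words.

```python
# Toy long-tail check: a query with several terms is treated as long-tail.
# The min_terms threshold of 3 is an assumption, not an official definition.

def is_long_tail(query, min_terms=3):
    """Return True when the query contains min_terms or more words."""
    return len(query.split()) >= min_terms

print(is_long_tail("best places to live in Chicago"))  # True
print(is_long_tail("Chicago"))                         # False
```

Longer queries like the example above tend to express a specific intent, which is exactly what semantic search tries to capture beyond the individual keywords.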
In conclusion, Google’s ranking algorithms have evolved from focusing mostly on speed to processing more complex queries with better results. Google has done an amazing job of becoming smarter and faster at ranking websites and providing relevant search results.