I have been involved in the development and optimisation of websites for the past 20 years, and as with most things in that time, much has changed. In the beginning there were no real go-to search engines, and the indexing of sites was limited. Then came the likes of Bulldog, Yahoo, Ask Jeeves and eventually Google.
At the start of this shift in the way we interact with the internet, online tools were relatively simple, and the job of marketing websites was like the ‘wild west’: there were no real rules, and there wasn’t really anyone to enforce them anyway. It was this wild west analogy that gave rise to the terms ‘White hat’ and ‘Black hat’ marketing techniques.
Let’s start with the black hats. Many black hat techniques were developed simply by testing how search engines responded to different parameters, such as the number of times keywords appear on a page or the use of hidden text (e.g. white text on a white background). The black hat marketers tore their way through the internet, reducing page functionality and readability with their ‘quick fix’ solutions. This was unsustainable, and it left the search engines struggling to deliver the right content to their users.
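To make that concrete, here is a minimal sketch of the kind of hidden-text trick that was common at the time. The page and keywords are invented for illustration; the point is simply that the stuffed text is invisible to a human visitor but perfectly readable to a crawler, which is exactly why modern search engines penalise it.

```html
<!-- Illustrative black hat example only: do not use this on a real site -->
<body style="background-color: #ffffff;">
  <h1>Cheap Hotel Deals</h1>
  <p>Welcome to our hotel booking site.</p>

  <!-- Hidden keyword stuffing: white text on a white background,
       invisible to visitors but present in the HTML a spider indexes -->
  <p style="color: #ffffff; font-size: 1px;">
    cheap hotels cheap hotels best hotel deals hotel discount
    cheap hotels london cheap hotels paris best hotel deals
  </p>
</body>
```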
White hat marketers, in the meantime, were building sites with great content and curated imagery, delivering a quality product to their customers.
In order to remedy this free-for-all mess, and in the absence of any kind of enforcement body, the search engines began to enforce the idea of good citizenship across the internet. Having given well-publicised warnings about specific black hat techniques, they began enforcement in earnest. Search engine spiders began re-indexing and scoring sites against a new set of parameters, specifically designed to penalise bad-citizen sites by dropping them down the search engine results pages (SERPs). This was a real shift in the way search engines operated. Instead of making occasional minor changes to the core algorithm, search engine companies, and Google in particular, became more and more obsessive about relevance and quality, and today they update their algorithms hundreds of times each year. Only the big changes are given names.
In 2010 Google launched Caffeine. This was a change to how frequently the SERPs were refreshed: instead of updating once every few weeks or even months, the SERPs began updating several times a day.
Panda (first released in 2011) was a significant change to the way Google indexed and generated SERPs. Panda was designed to rain fire on the good old black hat cowboys, identifying, weeding out and penalising their shoddy work. This update focused on site quality, and would include the types of metrics identified on remux.net as indications of quality and therefore legitimacy. Correct page and site structure is the bedrock measure, but by no means the only measure of a site or page. Content itself is critical, and content quality was a further aspect of Panda. Usefully enough, Google released a list of considerations on their blog which site owners should look at when creating a page or article:
These are not the ranking signals Google uses; quite rightly, Google would never tell us what they are, to avoid being played by the black hats. But it’s a reasonable assumption that some ranking signals are buried amongst these points. Again, we see the search engines promoting good citizenship, encouraging good quality content from site owners and authors.
Google’s Penguin update was released in 2012 and was designed to focus on link structure. Links into your website are seen by search engines as a sign of credibility. In a way, you could think of inbound links as votes from other sites saying that your site is knowledgeable. Links are validations.
The problem was that unscrupulous site owners and SEO service providers created inbound links from link farms, or by posting cross-links from forums or just about anywhere that would let them type in an active URL. Penguin was Google’s response to this.
By looking for unnatural link structures (e.g. links coming in from low-quality sites or sites that have no relation to your offer), the search engines can build a picture of the types of inbound links you attract. For example, if you are a hotel and inbound links arrive from TripAdvisor, that’s just great. But if you are a hotel and you have inbound links from unrelated sites, or from a site which just contains lists of links to random sites, that’s bad. Sometimes those links are nothing to do with the site owner, so you can use Google Webmaster Tools, for example, to identify the inbound links often termed ‘toxic links’. From the full list of sites linking to you, all you need to do is ‘disavow’ the ones you don’t want associated with your site. Here’s the disavow tool
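For reference, the disavow tool accepts a plain text file listing the URLs or whole domains you want Google to ignore when assessing your inbound links. A minimal sketch (the domains below are invented for illustration):

```
# Example disavow file: one URL or domain per line, comments start with #
# Disavow a single spammy page
http://spam.example.com/links/page1.html
# Disavow everything from an entire link-farm domain
domain:shadylinkfarm.example.com
```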
Google’s Hummingbird update was released in 2013 and was perhaps the largest change to Google’s search algorithm to date. Although functionally very similar to its predecessor, Hummingbird was designed to deal with more real-world search queries such as ‘How do I’, ‘where can I buy’, ‘find me a local’ and so on. These types of search query are particularly relevant when users start talking to Google using their voice, which is obviously necessary for other Google products and services such as OK Google, Google Home and Google Assistant. Intelligent assistants and A.I. technologies do appear to be the route most search engine providers want to take, so it’s reasonable to assume the same types of ranking signals relevant to Hummingbird will be relevant to the other big players in that space.
In 2015 a new update was released in response to the changing technologies we use to search the internet. Across the many different content types available on the web, user interaction happens on different devices. For example, according to Comscore, instant messaging, social media, games and job search are generally accessed using mobile devices, while other categories such as banking remain primarily desktop activities. And the mobile trend continues to grow as users feel more secure with their platform choice. Google’s mobile update, known informally as ‘Mobilegeddon’, was designed to let mobile-first sites (those designed to work naturally on a mobile device) float up the SERPs to deliver a better user experience to mobile users.
Again, Google offers a free tool to test how mobile-friendly your site is, and a deeper report of errors can be found in Google Webmaster Tools.
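As a rough illustration of what ‘mobile-first’ means in practice, the snippet below shows the basic responsive pattern such tests look for: a viewport declaration plus layout rules that adapt to screen width. It is a minimal sketch rather than a complete template, and the class name is invented.

```html
<!-- Minimal responsive page skeleton (class name is illustrative) -->
<head>
  <!-- Tell mobile browsers to render at device width rather than a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .content { width: 100%; padding: 1em; }           /* default: single column for small screens */
    @media (min-width: 768px) {
      .content { max-width: 720px; margin: 0 auto; }  /* wider screens: centred, constrained column */
    }
  </style>
</head>
```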
SEO is really a reference to best practice. The only art is keeping up to date with the ever-evolving definition of ‘best practice’. But fundamentally, having a site that is well constructed and contains the best information will get you headed in the right direction.
In general terms there are no shortcuts to SEO. It’s about doing things right: choosing the right platform, information architecture, user experience, user interface and content strategy, and offering the best, most useful content out there.
As mentioned at the start of the page, these are not the only updates Google has made since 2010, and there will be many more to come. Not all are announced, but many do get noticed by SEO/SEM professionals, and if you are interested, MOZ keeps a good tracking list on their site.