We all hear the term thrown around, and everyone with a basic knowledge of the meta tag is now claiming to be an expert, so here’s a brief history of SEO practice. Pay attention; there’s a pop quiz at the end.
It all started in the mid-1990s, when webmasters began to optimise their sites for the first search engines. Way back then, all a webmaster had to do was submit a URL, and the search engines would use “spiders” to crawl the site. The spiders would extract links and return information about the pages so they could be indexed.
As search engines grew in popularity, site owners started to see the significance of having their pages rank higher in the search results. The actual term ‘Search Engine Optimisation’ was probably first coined in 1997 by the Multimedia Marketing Group.
Early search algorithms relied on information provided by webmasters through meta tags, which didn’t always contain accurate information; this often led to inaccurate, irrelevant and inconsistent results. These early search engines fell short because they could be easily manipulated: they relied on signals like keyword density, which rogue webmasters could easily inflate.
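To see why keyword density was so easy to game, here is a minimal sketch of the idea (an illustration only, not any engine’s actual formula): density is just the fraction of a page’s words that match a target keyword, so repeating the keyword inflates it directly.

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A hypothetical honest page versus a keyword-stuffed one:
honest = "a short history of search engine optimisation practice"
stuffed = "cheap flights cheap flights book cheap flights today"

keyword_density(honest, "cheap")   # 0.0
keyword_density(stuffed, "cheap")  # 0.375 — stuffing wins the old metric
```

Any ranking scheme leaning heavily on a figure like this rewards repetition rather than relevance, which is exactly the manipulation the article describes.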
“Backrub” was a search engine developed by Stanford University grad students Larry Page and Sergey Brin; it would later be renamed Google, and work its way into our everyday lives and vocabulary.
Backrub worked on a mathematical algorithm which calculated a site’s PageRank; in simple terms, PageRank estimates the likelihood that a given page would be reached by someone randomly surfing the internet, following links from one page to another. The higher the PageRank, the more likely it is that our random surfer would land on the page. This method is less open to manipulation, as it relies on inbound links. Backrub was renamed Google in 1997, a play on the word ‘googol’, the name for a very big number (a 1 followed by one hundred zeros).
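The random-surfer idea can be sketched in a few lines of code. This is a simplified illustration, not Google’s actual implementation: with probability `d` the surfer follows a link from the current page, and with probability `1 - d` they jump to a random page; iterating this until the scores settle gives each page’s rank.

```python
def pagerank(links, d=0.85, iterations=50):
    """Estimate PageRank for a dict of page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # Every page gets the "random jump" share up front.
        new = {p: (1 - d) / n for p in pages}
        for page, outbound in links.items():
            if outbound:
                # A page splits its rank evenly among the pages it links to.
                share = rank[page] / len(outbound)
                for target in outbound:
                    new[target] += d * share
            else:
                # A page with no links spreads its rank evenly (dangling page).
                for p in pages:
                    new[p] += d * rank[page] / n
        rank = new
    return rank

# Tiny example graph: B and C both link to A, so A ends up ranked highest.
graph = {"A": ["B"], "B": ["A"], "C": ["A"]}
ranks = pagerank(graph)
```

Note how the score comes from who links to you, not from anything written on your own page; that is what made it harder to game than meta tags and keyword density.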
1998 rolled around; The Spice Girls said “Goodbye”, Harry Potter first made an appearance on our bookshelves, and Google was born.
Even with the new algorithms, link manipulation continued: thousands of sites were created for the sole purpose of acting as ‘link farms’, spamming links by exchanging, buying and selling them. The Google algorithm was adapted to account for this, and by 2004 most search engines were using undisclosed factors to rank search results.
The accuracy of search results can make or break a search engine, as users often hold very little loyalty. If a search engine falls prey to unscrupulous webmasters pushing irrelevant sites, users will quite easily search elsewhere. By not disclosing their algorithms and the factors involved in ranking websites, search engines like Google, Yahoo, and Bing have prevailed and retained a loyal following.
So if search engines don’t disclose their algorithms, how is search engine optimisation possible?
SEO practitioners can study patents held by search engines to better determine what factors are involved in their algorithms. There are also many on-site factors which can influence search results; the key is in the content. Keywords are still highly relevant for SEO, as are inbound links, clear URLs, and accurate sitemaps. Although you could get a friend to do it for $50, it pays to hire a professional.
The web is constantly changing, and so SEO practitioners must adapt. With the introduction of social media and the expansion of blogging, search engines must allow fresh content to make its way into rankings. While some have hailed the introduction of personalised search as the death of PageRank, it could equally be argued that SEO is more important now than ever, as search results are more targeted and more relevant to individuals.
And there you have it, a short history of search engine optimisation. I lied about the pop quiz, but I hope you paid attention anyway.