“You have to know the past to understand the present.”
– Carl Sagan
Little is known of that obscure era that stretched from the 1950s to the end of the 1980s. Some findings point to the existence of a pre-internet network called ARPANET. ‘Network Historians’ speak of mysterious computer networks like Telenet, NPL, and Cyclades.
There was no Search Engine Optimization because there was no search engine to optimize.
These were some dark times.
If there was a Stone Age, then there had to be a human who took a stone from the ground for the first time and turned it into a human tool. That someone in our case was Sir Timothy John Berners-Lee. He was the inventor of the World Wide Web (WWW).
On August 6, 1991, and thanks to Sir Timothy, the world’s first website went live, running on a computer at CERN.
The first search engine ever, called Archie, had already been created. Since there was no WWW, Archie was exclusively indexing FTP (File Transfer Protocol) Archives, to locate a specific file and transfer it from one computer system to another. Due to its limited capabilities, the content of each site wasn’t available. Only the listings were.
The next year brought yet another search engine, named Veronica. Veronica maintained an updated database of the names of every menu item on the Gopher servers. Gopher, which ran over TCP/IP, was in effect the predecessor of the World Wide Web as we know it today, for HTTP wasn’t yet the dominant protocol.
Then came another search engine, called Jughead, which had many similarities to Veronica, except Jughead was searching one server at a time. During these archaic times, there was also the Wanderer, the first web crawler that could measure the size of the WWW.
If Archie was the first search engine that existed, then Aliweb is considered the first web search engine. Aliweb came out in 1993. It allowed users to submit the pages they wanted indexed, along with a custom description for Aliweb to index. That custom description was the first opening for Search Engine Optimization – 30 years ago. In effect, it empowered webmasters to define the terms that would lead users to their pages. However, not many of them submitted their sites, and Aliweb remained more or less underutilized.
During the same year, another search engine, JumpStation, launched. JumpStation behaved and appeared much the way today’s web search engines do. It used document titles and headings to index web pages and retrieved them with a simple linear search, but it didn’t provide any ranking of results.
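As a rough illustration (not JumpStation’s actual code), a title-only index scanned linearly might look like the minimal Python sketch below; the pages and URLs are invented for the example.

```python
# Minimal sketch of a JumpStation-style search: index only titles,
# scan them linearly, return matches with no ranking at all.
# The page data below is purely illustrative.

pages = [
    {"url": "http://example.edu/physics", "title": "Introduction to Physics"},
    {"url": "http://example.org/recipes", "title": "Bread Recipes"},
    {"url": "http://example.com/physics-news", "title": "Physics News Archive"},
]

def search(query: str) -> list[str]:
    """Return URLs whose title contains the query term, in crawl order (unranked)."""
    query = query.lower()
    return [page["url"] for page in pages if query in page["title"].lower()]

print(search("physics"))
# ['http://example.edu/physics', 'http://example.com/physics-news']
```

Notice there is no notion of relevance: a page either contains the words or it doesn’t.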
In every search engine of this era, it was practically impossible to find anything unless the user typed the exact title of what they were looking for into their query.
The only SEO technique possible at this point, if it could be considered as such, was Aliweb’s custom meta description.
The year is 1994. “Jerry and David’s Guide to the World Wide Web” was created. Maybe you’ve never heard of it, but you have heard of the name the company adopted a few months later: “Yahoo!”. Now that’s a more customer-friendly brand name.
What differentiated Yahoo! from everyone else was that it didn’t organize its directory of other web pages as a flat searchable index; rather, it organized the index in a hierarchy. This changed everything in the search engine landscape.
Yahoo! added informational sites for free but expanded to include commercial sites as well, for a price. In the next few years, Yahoo!’s presence wasn’t limited to ‘Yahoo! Search’. It built many different properties and created a variety of services, such as ‘Yahoo! Mail’, ‘Yahoo! News’, and ‘Yahoo! Finance’.
The main antagonist of Yahoo! at the time was AltaVista.
AltaVista had found the recipe for the precious metal of the era and entered the Bronze Age as well. Like mixing tin, arsenic, and copper to create bronze, AltaVista mixed different innovative elements: it offered unlimited bandwidth and allowed natural language queries. It also maintained an ‘Add URL’ page that let webmasters get key pages from their sites listed in the index quicker than ever; an index that was fully searchable and crawlable by its crawler, Scooter. AltaVista was one of the web’s top destinations, offering new features and search tips daily.
Enter 1996. Two students from Stanford University were working on Backrub, the first search engine to utilize backlinks. This would have a huge impact on SEO, as now the reliability of a site would come from how many people linked to that site, and how trustworthy the linking sites were. Any mention of a website would count as a vote of confidence towards the mentioned site.
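To make that idea concrete, here is a toy, hypothetical sketch in the spirit of Backrub’s backlink scoring (what later became PageRank): a page’s score depends on how many pages link to it and how strong those linking pages themselves are. The three-site link graph and damping value are invented for illustration.

```python
# Toy illustration of backlink-based scoring (PageRank-style), not Google's code.
# A page inherits score from the pages that link to it, split across their outlinks.

links = {                      # page -> pages it links to (invented graph)
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

damping = 0.85
scores = {page: 1.0 for page in links}

for _ in range(50):            # iterate until the scores settle
    new_scores = {}
    for page in links:
        incoming = sum(
            scores[src] / len(outgoing)
            for src, outgoing in links.items()
            if page in outgoing
        )
        new_scores[page] = (1 - damping) + damping * incoming
    scores = new_scores

print(scores)  # c.com ends up highest: it receives links from both other pages
```

The key design choice is that links are weighted by the strength of the page casting the “vote,” so a link from a trusted site counts for more than a link from an obscure one.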
In the same year, Ask Jeeves, known today as Ask.com, entered the search engine arena. The original idea behind Ask Jeeves was to let users get answers to everyday questions asked in natural language. Furthermore, it was created with another innovative characteristic in mind: ranking links by popularity. Ask Jeeves also used clustering to organize sites by subject-specific popularity.
In 1997, the Excite search engine was created and became the first search engine to provide only crawler-based listings.
Remember Backrub, which we mentioned earlier? It’s now called Google. Not exactly the Big Tech company we all know today; more like a Google that was struggling to keep up with the competition. But Google had an innovative idea: to sell search terms. This move had a significant effect on the search engine business. A bright future for the search engine world was about to come.
But then, the dot-com bubble burst.
It was a heavy blow to the search engine industry. But it survived – only to become stronger.
Like with the real Middle Ages, there was War. Our SEO Middle Ages, and what led up to them, were a time of digital violence. The SEO warfare was ON.
The search engine landscape was a free-for-all arena where anything went.
Major algorithm updates would take several months to complete, which only encouraged illicit tactics from webmasters, also known as Black-hat SEO.
Illegitimate practices and spam - a lot of it - spread across most of the internet. This practice was, and still is, known as spamdexing. Spamdexing was a way to manipulate a search engine’s understanding of a category so that a webpage could earn favorable rankings in the search engines.
Other Black-hat SEO tactics are covered in WordStream’s guide: https://www.wordstream.com/black-hat-seo
But battles were not only fought between webmasters and their pages.
The great monarchs of the search engine land, Yahoo!, Google, Microsoft’s MSN Search, AltaVista, and others, were in an ongoing digital conflict.
AltaVista soon lost ground to Google and was purchased by Yahoo! in 2003. Alas, their combined forces could not endure what Google had become, and Yahoo! lost a big portion of its market share.
Google, as a verb, was added to the Oxford English Dictionary and the eleventh edition of the Merriam-Webster Collegiate Dictionary. Google was now The Ruler. And an official word in our dictionaries.
As it usually goes with every new ruler, Google had big plans for its people. The name of this new plan: Florida Update.
The Florida Update was the first major Google algorithm update – the update that changed SEO forever. Its purpose was to fight keyword stuffing, so it penalized any website practicing such tactics.
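Google never published how Florida detected stuffing, but a crude, purely illustrative way to picture “keyword stuffing” as a number is keyword density. The function, texts, and values below are invented for this sketch and are not Google’s method.

```python
# Crude, illustrative keyword-density check - NOT Google's actual algorithm.
# It only shows what "keyword stuffing" looks like when expressed as a number.

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in the text that are exactly the keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

stuffed = "cheap shoes cheap shoes buy cheap shoes best cheap shoes online cheap shoes"
natural = "our store sells comfortable shoes at a fair price with free returns"

print(round(keyword_density(stuffed, "cheap"), 2))   # 0.38 - suspiciously high
print(round(keyword_density(natural, "cheap"), 2))   # 0.0
```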
Before the Florida update, many retailers relied almost entirely on affiliates to drive traffic to their websites. After the update, a lot of major and minor retailers saw significantly reduced traffic.
Many of the sites that previously populated the top 100 were cleared out. Webmasters reacted to that change with a lot of mumbling and some serious SEO. But mostly mumbling. With time, more and more site owners started focusing on their webpages and making them higher quality.
Google - The King - had ordered some music, and now everyone was dancing to the tune.
We are well into Web 2.0.
A new model for information exchange had risen, changing the internet experience as we knew it. Sites on the web stopped being static, and information no longer flowed in just one direction, from website to reader.
Internet users got access to faster internet speeds, and the interactivity of websites made a huge leap forward: The User could now be the Creator. Many new sites, such as Wikipedia, YouTube, Myspace, and Blogger, quickly became popular and introduced a new concept of user-submitted content.
In 2007 Google changed the Search Engine Results Page (SERP) for good. Until then, the Google results page was only listing 10 blue links. Now, with ‘Universal Search,’ Google results were listing videos, images, maps, and additional media, above, to the right, and amongst the organic search results.
The results were no longer exclusively keyword-dependent. Several factors, such as location, search history, and cookies, would drastically affect the results.
The addition of these factors opened a whole new chapter for SEO, as businesses and marketers now had more formats than just words with which to reach the user. This user-focused approach to SEO helped lay the foundation for a more captivating and personalized web. Webmasters began optimizing new content media to increase exposure.
Meanwhile, Spamdexing continued, but on a marginal level. Although there were many cases of Google bombing, the search engine reality was very different.
In 2008, Google Suggest launched, displaying suggested searches based on overall internet trends and the user’s own search history.
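Google hasn’t published Suggest’s internals, but a simplified, hypothetical sketch of the idea is to rank candidate queries that share the typed prefix by global popularity, with a boost for queries the user has searched before. All query data below is invented.

```python
# Hypothetical autosuggest sketch: prefix match, ranked by global popularity,
# with a flat boost for the user's own past searches. Data is invented.

global_counts = {"weather today": 9000, "weather tomorrow": 7000,
                 "web 2.0": 1200, "webmaster tools": 3000}
user_history = {"webmaster tools"}

def suggest(prefix: str, k: int = 3) -> list[str]:
    candidates = [q for q in global_counts if q.startswith(prefix)]
    # Popular queries first; queries the user has run before get a large boost.
    return sorted(candidates,
                  key=lambda q: global_counts[q] + (10_000 if q in user_history else 0),
                  reverse=True)[:k]

print(suggest("we"))   # ['webmaster tools', 'weather today', 'weather tomorrow']
```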
New user insights from keyword research tools, Google Trends, and Google Analytics were also major additions that brought digital marketers a new (big) box of tools.
SEO had become an industry of its own.
Our history of SEO continues in the 2010s. At the beginning of the decade, Google released the Panda Update, also known as the Farmer Update. This update punished websites with thin, non-original, low-quality content, like content farms and scraper sites. The new release forced SEO to focus on higher-quality content.
In 2012, Google continued to keep webmasters on their toes. Another update, the Penguin Update, penalized everyone who was buying links or obtaining them through link networks designed to boost search rankings. Google kept on leading the crusade against low-quality links, keyword stuffing, and web spam.
A year later, Google released another update. The Hummingbird Update was the largest algorithm update until then. The update gave Google Search the ability to analyze the intent behind a query, rather than just the language itself. Hummingbird placed greater emphasis on natural language queries, considering context and meaning over individual keywords.
The most significant change to Google’s algorithm in years forced a change of habits for webmasters. They had to optimize their sites with natural writing rather than forced keywords. They also had to make effective use of technical web development for on-site navigation.
Around that time in the history of SEO, Google’s Knowledge Graph rolled out to include panels in SERPs. Those panels presented additional information to the readers. The user could now get immediate answers without the need to dig through content.
Google also enhanced localized SEO, and results were listed directly in SERPs. All local information was organized, and businesses now had more advertising options than ever before.
Here’s a fact you probably know due to common sense.
The time users spend on social media is ever-growing.
Google’s algorithm is famously a closely guarded secret, and Google’s position was that social media isn’t a direct ranking factor.
However, according to Searchmetrics, this was not the case.
“The correlation between social signals and ranking position is extremely high, and the number of social signals per landing page has remained constant when compared with the values from last year’s whitepaper. … The top-ranked websites in Google’s rankings display vastly more social signals than all other pages … This is primarily due to the overlap between brand websites performing strongly in social networks and being allocated top positions by Google.” – 2016 Rebooting Ranking Factors White Paper
This simply means that, indeed, social media affect SEO.
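To see what a “correlation between social signals and ranking position” means in practice, here is a small, hypothetical sketch using Spearman’s rank correlation on invented data; it is not Searchmetrics’ methodology.

```python
# Hypothetical sketch: how strongly do social-signal counts track ranking position?
# The SERP positions and share counts below are invented for illustration.

from scipy.stats import spearmanr

positions      = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]                    # SERP positions
social_signals = [950, 800, 640, 500, 470, 300, 210, 150, 90, 40]   # shares/likes per page

rho, p_value = spearmanr(positions, social_signals)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.4f})")
# rho is -1.0 here: signal counts fall strictly as the position number rises,
# i.e. higher-ranked pages carry more social signals - correlation, not causation.
```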
Content shared throughout the web and social media created valuable backlinks and engagement that built authority. These trends led to the fast-paced, personalized, and more engaging web we know today.
Mobile devices are also gaining ground over desktops and laptops every day.
2016 is the year that mobile internet usage surpassed desktop usage in the US.
Google has become the ruler not only because it’s setting trends. It is also (successfully) following them to stay relevant and competitive.
So, naturally, Google wanted in on the mobile era, and it entered with a bang: ‘Mobilegeddon.’
Mobilegeddon was the unofficial name for the update. Its purpose was to benefit mobile-friendly pages in mobile search results and push things forward.
Now, SEO had yet another mission: to be mobile-friendly.
Through all this time, Panda and Penguin version updates kept rolling out. Every new version did more or less the same job: penalizing any site that wasn’t complying with the changes. Here are some of the most notable ones.
In 2017, Google released a search algorithm update. The update, named Fred, punished sites with low-quality backlinks and anyone who was prioritizing monetization over user experience.
In 2019, Google announced BERT (Bidirectional Encoder Representations from Transformers). Google itself called it the biggest change to its search engine in the past five years, as the update impacted both search rankings and featured snippets.
In 2021, Google announced an algorithm update aimed at identifying and nullifying spammy links. After this update, websites taking part in link spam tactics through their sponsored, guest, and affiliate content were forced to look for other approaches.
And in 2022, Google rolled out the Link Spam Update. The purpose of this particular update was to further neutralize the impact of unnatural links on search results. For the first time, Google used its spam-detection system, SpamBrain, against link spam. Needless to say, since then link-building has become more difficult - and more genuine - than ever.
Today, SEO is more multidimensional than ever. And it seems like there is no going back.
And with impactful updates rolling out every few months, can you be certain that what you do on your SEO today will hold up tomorrow?
In this life, there are no certainties, but with these 5 steps, you are as close to creating a Google-update-proof strategy as it gets:
So keep calm, and keep on Optimizing. The history of SEO is still in the making.
I write for GrowthRocks, one of the top growth hacking agencies. For some mysterious reason, I write on the internet yet I’m not a vegan, I don’t do yoga and I don’t drink smoothies.