At some point, we have all heard someone chatting about remarkable developments that took place before we were born and endure to this day.
While some might be astonished to hear that something has been around since before they were born, others are baffled to realise how much they have aged.
In truth, I place myself in the first class of individuals, lured by tales from between the end of the nineties and the early stages of the new Millennium. To put it into context, that was a time when people still gathered in large crowds. That innocent era when youngsters struggled to start a mobile chat and sorted it out with the magic of T9.
It was also the time when Jennifer Lopez's fashion style smashed the runway and, unwittingly, kick-started Google's fearsome tech escalation.
Though the history of Google Search is no tall tale, one of the stories I do appreciate is the one unlocking the mechanisms behind the search engine throughout the last two decades.
Hence, in this post, I am going to collect all the bits and pieces from the dawn of the algorithms at the roots of Google's search engine, and ultimately draw a few assumptions on the future of organic search.
Also, you're going to learn how JLO intertwines with Google's overall growth as a search engine.
How Google Search Evolved Through the Years
📃 PageRank – 1996
🌅 Google Images – 2000
🌎 Universal Search – 2007
🐼 Google Panda – 2011
🐧 Google Penguin – 2012
🐥 Hummingbird – 2013
🙋🏻♂️ People Also Ask – 2015
🧠 RankBrain – 2015
👅 BERT – 2019
🤰🏻 MUM – 2021
PageRank
Devised in 1996, the algorithm was designed to analyse the quantity and the quality of links pointing to websites in a bid to gauge the organic ranking they deserve on the SERPs.
PageRank boils down to a simple, and by now somewhat dated, mechanism to gauge rankings: the underlying assumption is that more important websites are likely to receive more links from other websites.
To expand a bit more on the algorithmic foundations, PageRank is a link analysis algorithm that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set.
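To make the idea concrete, here is a minimal sketch of the power-iteration method commonly used to compute PageRank scores. The four-page link graph is a made-up example, and the damping factor of 0.85 is the value suggested in the original paper, not necessarily what Google runs in production.

```python
# Minimal PageRank via power iteration on a toy four-page link graph.
# The graph and convergence tolerance are illustrative assumptions.

damping = 0.85  # damping factor suggested in the original PageRank paper

# page -> pages it links to (hypothetical mini-web)
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

pages = list(links)
rank = {p: 1 / len(pages) for p in pages}  # start with uniform scores

for _ in range(50):  # iterate until the scores stabilise
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # a page spreads its score over its outlinks
        for target in outlinks:
            new_rank[target] += damping * share
    converged = all(abs(new_rank[p] - rank[p]) < 1e-8 for p in pages)
    rank = new_rank
    if converged:
        break

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # "c" ends up most important
```

Note how "c" wins simply because three other pages link to it: that is the whole intuition behind "more links means more importance".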
The overall usage of PageRank is fairly limited today, as Google would rather rely on more semantic search algorithms to sort search results.
As a historical artifact, the original PageRank patent expired in 2019, and PageRank retains an authoritative first-result spot on the SERP as a Wikipedia source.
💡BONUS
Given the prominence of links in rankings, it is worth investigating how meta descriptions relate to rankings. You can do this using Python and leveraging Sentence Transformers as a cutting-edge NLP library.
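As a hedged sketch of what that investigation could look like, the snippet below embeds a made-up target query and a few made-up meta descriptions with the open-source sentence-transformers library (the all-MiniLM-L6-v2 model is just a convenient small default) and ranks the descriptions by cosine similarity:

```python
# Score hypothetical meta descriptions against a target query
# using sentence embeddings from the sentence-transformers library.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose model

query = "history of google search algorithms"  # illustrative target query
meta_descriptions = [
    "A walk through every major Google algorithm update since PageRank.",
    "Buy cheap sneakers online with free worldwide shipping.",
    "How Panda, Penguin and BERT reshaped organic search over two decades.",
]

query_vec = model.encode(query, convert_to_tensor=True)
desc_vecs = model.encode(meta_descriptions, convert_to_tensor=True)

# Cosine similarity: higher means semantically closer to the query
scores = util.cos_sim(query_vec, desc_vecs)[0].tolist()
for score, desc in sorted(zip(scores, meta_descriptions), reverse=True):
    print(f"{score:.2f}  {desc}")
```

Whether such similarity actually moves rankings is exactly the hypothesis to test; the snippet only gives you a measurable proxy to correlate against ranking data.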
The Dawn of the Disruption
In the early 2000s, an escalation of disruptive events pushed Google to the lead of the search engine market, whilst the company laid the foundations for the skyrocketing growth that would later feed a wide diversification strategy.
It didn't take long before the Mountain View-based company turned the tide in the competition race by outclassing the basic algorithms powering AltaVista, Yahoo and Lycos. Although those engines satisfied the modest searcher expectations of the time, Google seemed to have an edge in providing an enhanced search experience.
That assumption came true as soon as the Californian company released Google Images following a splashy, now-anecdotal event.
Google Images
What the heck does Jennifer Lopez have to do with Google climbing the ladder of innovation?
Nothing. And yet this is not just clickbait; it conceals a kernel of truth.
In fact, Jennifer Lopez attended the 2000 Grammy Awards in a flamboyant print dress by Versace. The gown she so classily wore became an instant fashion legend, as well as likely one of the first instances of virality on the Web.
As the event prompted an unfathomable ton of queries on Google, it left users high and dry, since the search results could not return any kind of visual excerpt.
Hence, Google acted as a smart first mover on the market and buckled up to extend its search algorithms to the interpretation of visual elements.
It goes without saying that Google showed extraordinary business acumen for the time: in a matter of a few months, the search engine advanced its market share and started positioning its brand as a cutting-edge, innovative tech company.
Google's CSR Stance
Following the shocking events of September 2001, Google, as a corporation intent on having a positive influence on the world, put together a number of Corporate Social Responsibility (CSR) practices and policies.
In this regard, Google launched a tenacious campaign to crack down on the spread of misinformation.
To serve this purpose, it is interesting to note that, for the first time, the company approached cluster analysis tasks aimed at grouping authoritative articles by topic.
This was a crucial turning point for Google, as it marked the kick-off of the multitude of unsupervised machine learning models that would see the light from then on.
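Google's internal pipelines have never been public, but the flavour of such a task can be sketched with standard tooling. Below, a minimal example that groups a handful of made-up headlines by topic using TF-IDF vectors and k-means from scikit-learn; the headlines and the cluster count are illustrative assumptions:

```python
# Group toy article headlines by topic: TF-IDF features + k-means clustering.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

headlines = [
    "Rescue teams reach flooded towns after the hurricane",
    "Hurricane relief donations surge as the storm moves inland",
    "Central bank raises interest rates to curb inflation",
    "Stock markets rally after the interest rate decision",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(headlines)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, headline in sorted(zip(labels, headlines)):
    print(label, headline)  # hurricane stories vs. finance stories
```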
Help in Time of Crisis
As part of Google's CSR engagement, the company found itself compelled to devise additional SERP features dealing with life-and-death matters.
To put this into context, following the dramatic Hurricane Katrina, Google responded by providing temporary SERP features and by rewarding YMYL websites supplying vital information about natural catastrophes with the potential to harm users' ordinary lives.
Since 2005, Google has pledged to surface trusted and authoritative resources revolving around people's health, financial stability and safety in the above-the-fold spot of the search results page.
👀 Do you recall anything about recent developments? 👀
Universal Search
Circling back to search algorithm developments, the Californian company advanced its search engine evolution with another outstanding release.
In 2007, Google launched Universal Search with the ultimate goal of blending news, images, videos and more into a single results page.
In SEO terms, the definition of this search feature might revolve around the following gist:
“Universal search is the ability to search all content across multiple databases through a single search box. Although content sources might reside in different locations, such as a different index for specific types or formats of content, they appear in a single, integrated set of search results.”
This might sound old-fashioned nowadays, but in 2007 the ability to browse such a wide range of "subfolders" on Google was something that came out of the blue.
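Google's implementation is proprietary, but the gist of federated search fits in a few lines: one query fans out to several toy "vertical" indexes and the hits come back as a single blended list. Everything below, from the indexes to the scores, is a made-up illustration:

```python
# Toy federated search: one query, several vertical indexes, one blended result list.

verticals = {
    "news":   [("Election results live", 0.9), ("Local weather alert", 0.4)],
    "images": [("Election night photo gallery", 0.8)],
    "videos": [("Election results explained", 0.7), ("Cooking tutorial", 0.1)],
}

def universal_search(query: str, min_score: float = 0.5):
    """Query every vertical, keep relevant hits, merge into one ranked list."""
    merged = [
        (score, title, vertical)
        for vertical, docs in verticals.items()
        for title, score in docs
        if query.lower() in title.lower() and score >= min_score
    ]
    return sorted(merged, reverse=True)  # highest-scoring results first

for score, title, vertical in universal_search("election"):
    print(f"[{vertical}] {title} ({score})")
```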
The trailblazing move enabled users to browse UGC from additional boxes on the top bar of the SERP.

Organic discovery took an unexpected leap forward at that time, as unclear search intent was parsed and refined in a loop. This helped Google hone in on search intent based on wider data samples.
Interestingly, Universal Search might resonate with the most recent MUM algorithm (which we'll discuss later). In fact, Universal Search brought to the search results a brand-new wave of media formats joining the traditional blue links. Naturally, the move resulted in an overall enrichment of the SERPs, stressing responsiveness and UX.
A radical overhaul that changed the search experience forever.

Google Panda
After raising the bar on security with the mass introduction of the HTTPS protocol, Google released one of the best-known search algorithm updates in the SEO industry, revolving around a content quality threshold.
Launched in 2011, Google Panda is an algorithm update that rewarded higher-quality sites and downgraded the presence of lower-quality sites in the search results. The main features of the algorithm are:
- Downgrading duplicate content
- Downgrading thin content
- Downgrading low-quality UGC
In a nutshell, Google Panda taught webmasters and SEOs that more is not necessarily better, as content should genuinely serve an underlying purpose.
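Panda's actual signals were never disclosed, but two of the ideas above can be roughly approximated in a few lines: a word-count floor for thin content and word-level Jaccard overlap for near-duplicates. The thresholds are made-up illustrations, not Google's values:

```python
# Rough Panda-flavoured checks: thin content and near-duplicate detection.

THIN_WORDS = 300       # pages shorter than this get flagged as "thin"
DUP_SIMILARITY = 0.8   # Jaccard word overlap above this flags a near-duplicate

def is_thin(text: str) -> bool:
    return len(text.split()) < THIN_WORDS

def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two documents."""
    set_a, set_b = set(a.lower().split()), set(b.lower().split())
    return len(set_a & set_b) / len(set_a | set_b)

page_a = "cheap flights to rome book cheap flights today"
page_b = "book cheap flights to rome today cheap flights"

print(is_thin(page_a))                           # True: far fewer than 300 words
print(jaccard(page_a, page_b) > DUP_SIMILARITY)  # True: near-duplicate pages
```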
Google Penguin
Building on the quality threshold targeted by Google Panda, the search engine released another major algorithm update shortly after.
Launched in 2012, Google Penguin is an algorithm update aimed at downgrading sites that engaged in manipulative link schemes and keyword stuffing.
The main features of the algorithm are:
- Combating link manipulation
- Combating PBNs and link farms
In short, Google Penguin was launched in a bid to downgrade websites housing low-quality links, sleazy link farms and suspicious PBNs. As a result, webmasters and SEOs started to scrutinise their sites' link profiles and think twice about the extent of building a link network.
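Again, Penguin's internals were never published. Purely as an illustration, one classic heuristic in link audits is checking whether a backlink profile carries an unnaturally high share of exact-match commercial anchors; the sample anchors and the threshold below are invented:

```python
# Toy Penguin-flavoured heuristic: flag a suspiciously high share of
# exact-match anchor text in a backlink profile.
from collections import Counter

TARGET_ANCHOR = "cheap flights"   # hypothetical money keyword
MAX_EXACT_SHARE = 0.3             # above this share, the profile looks manipulated

backlink_anchors = [
    "cheap flights", "cheap flights", "cheap flights", "cheap flights",
    "example.com", "this article", "homepage",
]

counts = Counter(backlink_anchors)
exact_share = counts[TARGET_ANCHOR] / len(backlink_anchors)

print(f"exact-match share: {exact_share:.0%}")  # 57%
print("suspicious!" if exact_share > MAX_EXACT_SHARE else "looks natural")
```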
Hummingbird, the rise of entity search
Moving beyond the search quality threshold of Google Search, the next algorithm update represents a milestone for semantic search and for how the search engine leaned on natural language processing (NLP) to better understand conversational queries.
Launched in 2013, Hummingbird is an algorithm released by Google to return better search results by emphasising the meaning of search queries over individual keywords. This means the search engine was equipped with a cutting-edge algorithmic skill set that would turn it into a rough early form of a human assistant.
Against the odds, though, Hummingbird was released as an underdog with SEOs struggling to fully grasp the gist of the algorithm and its potential impact on search.
In a nutshell, Hummingbird succeeded in expanding Google’s semantic search capabilities.
People Also Ask
As time went by, search patterns evolved, with users getting accustomed to composing more complex and colloquial queries.
On the wavelength of the recent advancements in NLP and NLU (Natural Language Understanding), as of 2015 Google increasingly rolled out frequently asked questions embossed into the SERPs as People Also Ask boxes.

Notwithstanding, you can still sense a bit of fuzziness in the search intent detection, as testified by the above example.
From then on, Google would focus its efforts on developing powerful algorithm fixes aimed at minimising outliers in search intent detection.
RankBrain
In the same year, Google dived deeper into AI by engineering an AI-based system that empowered the search engine to deliver results in line with the meaning behind the search query.
RankBrain is an AI-based algorithm that enabled Google to return relevant pages even when they do not contain the exact words used in a search. As a result, Google was henceforth able to handle a larger proportion of "never-seen-before" searches.
RankBrain taught webmasters and SEOs that Google was starting to leverage semantic search to connect web pages to concepts rather than strings. This means Google made an incredible step forward in working out the underlying search intent behind a query.
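RankBrain itself is a black box, but the "concepts, not strings" idea can be illustrated with sentence embeddings. In the hedged sketch below (again using the open sentence-transformers library, which is not what Google runs), a query matches a page it shares no keywords with:

```python
# "Concepts, not strings": a query matching a page with zero shared keywords.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "how do I fix a flat tyre"
pages = [
    "A step-by-step guide to repairing a punctured bicycle wheel.",  # no shared words
    "Best pizza restaurants in downtown Chicago.",
]

scores = util.cos_sim(model.encode(query, convert_to_tensor=True),
                      model.encode(pages, convert_to_tensor=True))[0].tolist()
for page, score in zip(pages, scores):
    print(f"{score:.2f}  {page}")  # the puncture-repair page scores far higher
```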
As with the launch of Hummingbird, though, SEOs were slow to find a grip on optimization methods suited to the rise of semantic search, which opened room for experimentation with a new approach to implementing structured data.
You don’t optimize for RankBrain says @methode #smx
— Barry Schwartz (@rustybrick) June 23, 2016
Except for drawing the common-sense conclusion of always optimizing for users first.
@methode @JohnMu Would both of you still agree that you cannot optimize for Rank Brain, and that what you discussed in 6/2016 is still the case today — that there is no way to optimize for Rank Brain? https://t.co/qQg4tGPRhS
— BrianHarnish (@BrianHarnish) September 26, 2019
That Neural Machine Learning Escalation
Approaching the second half of the 2010s, users experienced a prosperous age for search.
With Hummingbird and RankBrain covering a crucial spot within Google's search algorithm arsenal, a few advancements in machine learning and virtual reality (VR) cemented Google's market share as a search engine and let the company join the competition in AI.
In fact, Google started devising proficient strategies to kick off advertising automation with programmatic ads as well as experimenting with voice search.
Despite the effort put into this new diversification strategy, Google did not immediately bring additional value to the search experience. Nonetheless, the company worked around the clock to lay the foundations for engineering even more powerful algorithms to apply to the realm of Search.
BERT
After four years of AI experiments yielding an awful lot of lessons learned, the company decided it was time to revamp Google Search once more.
Taking advantage of its recent AI projects, Google doubled down on entity search and issued an impactful search algorithm that consolidated the progress trailed by Hummingbird and RankBrain.
Launched in 2019, BERT is a search algorithm that helps Google better understand the nuances and context of words in searches, and better match those queries with more relevant results.
Google shifted from a unidirectional to a bidirectional understanding of language, so that the words in a query can appear in a different order without derailing the search results.
Let's take an example.
Query: [the car is red]
Before BERT, the "red" attribute was not captured and tied to "car"; now you can type either "red is the car" or "the car is red" and receive the same results, as the model is trained to understand words within their surrounding context.
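You can poke at this bidirectionality yourself with the open-source bert-base-uncased checkpoint via Hugging Face's transformers library. The sketch below shows masked-word prediction, the task BERT was pre-trained on, where context on both sides of the gap constrains the guess; it is not how Google Search invokes BERT in production:

```python
# Masked-word prediction with the open bert-base-uncased checkpoint.
# BERT reads context on BOTH sides of the [MASK] token to guess the word.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The left context ("the car") and the right context ("red") both constrain the guess.
for candidate in fill("the car [MASK] red.")[:3]:
    print(f"{candidate['score']:.2f}  {candidate['token_str']}")
# Likely completions: "is", "was", "turned" ...
```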

In short, this algorithm update helped Google better understand things, not strings, and gave SEOs a massive prompt to switch from traditional keyword research automation to entity research.
AI and Multimodal Search
But Google didn’t ditch experiments in AI and deep learning.
Not at all.
In the years that separated the releases of Hummingbird and RankBrain from the launch of BERT, Google garnered enormous amounts of data from its internal machine learning models. It was all part of a long training session that would bring to light a new wave of search algorithms fully empowered by AI.
The main feature of this new generation of search algorithms is multimodality, meaning they can retrieve information from multiple sources and formats at once.
This means that the search engine is becoming more comprehensive and human-like in retrieving information from unstructured data, yielding annotations and parsing search queries regardless of their format.
MUM
In May 2021, Google introduced MUM (Multitask Unified Model) as a multimodal AI-powered search algorithm delivering a wide range of machine learning-driven SERP features.
By unlocking the ability to extract information from multiple sources, MUM is in the driving seat of the switch to semantic search along the search evolution.
Packing more multitasking capabilities than BERT, MUM is a multimodal search algorithm, meaning it "understands" different content formats like text, images, video and audio at once. This gives it the power to gain information from multiple modalities, as well as to respond across them.
An important note on MUM is that the model not only understands content but can also produce it. So rather than passively sending users a list of results, it can collect data from multiple sources and provide the answer (page, voice, etc.) itself.
Amazing, right?
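MUM itself is not publicly available, but a taste of text-image multimodality can be had with the open CLIP model, a different and far smaller system released by OpenAI that embeds text and images in the same vector space. The sketch below scores a local image (the file path is a placeholder) against candidate captions:

```python
# A taste of text-image multimodality with the open CLIP model.
# CLIP is NOT MUM; it is shown purely to illustrate the concept.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("hiking_boots.jpg")  # placeholder path to a local image
captions = ["a pair of hiking boots", "a red evening dress", "a mountain trail"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)[0]

for caption, prob in zip(captions, probs.tolist()):
    print(f"{prob:.2f}  {caption}")  # the matching caption gets the highest score
```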
As a general rule of thumb, when individuals see the rise of a new technology, they tend to over-welcome the rollout just because it is perceived as new and disruptive.
The bad news is that this boils down to a well-documented human bias: new releases are often perceived as the best available choice, regardless of what the novelty actually delivers.
Whether or not you welcome AI rewriting content and auto-parsing information on Google, in August 2021 Google started rewriting meta tags at its own convenience, whilst automated SEO copywriting became a thing.
Noticeably, all this fuss brought an unfathomable wave of dissatisfaction in the industry, as SEOs' carefully crafted meta tags were yanked out and replaced with a presumably better set of AI-powered ones. Needless to say, this left SEOs high and dry.
Did Google succeed in better satisfying users’ needs?
As always happens with brand-new technology, the first setup showcased a few teething troubles, as SEOs chased down Google on the lookout for further clarity on how to handle the massive title tag rewriting wave.
Whether AI will take over content optimisations, only time will tell.
To wrap up, among the thousands of SEO tips that can be drawn from Google MUM, the most actionable one is probably to start considering the optimisation of images, audio and video on the same wavelength as traditional text.
💡BONUS
You can find out how Google decided to handle title tag rewriting by using SerpApi with a Python framework.
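As a hedged sketch of the idea, assuming a SerpApi account (the api_key is a placeholder) and its official Python client: pull the titles Google actually displays for a query and compare each one with the page's own <title> tag to spot rewrites.

```python
# Compare the title Google displays on the SERP with the page's own <title> tag.
# Assumes a SerpApi account; "SERPAPI_KEY" is a placeholder.
import requests
from bs4 import BeautifulSoup
from serpapi import GoogleSearch

results = GoogleSearch({"q": "hiking boots", "api_key": "SERPAPI_KEY"}).get_dict()

for result in results.get("organic_results", [])[:5]:
    serp_title = result["title"]
    html = requests.get(result["link"], timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    page_title = title_tag.get_text(strip=True) if title_tag else ""
    if serp_title != page_title:  # Google rewrote (or truncated) the title
        print(f"SERP : {serp_title}\nPAGE : {page_title}\n")
```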
What comes Next?
The long journey throughout the evolution of Google search sounds like an ascending symphony and looks like a successful business case study.
Before Google popped up, crawling the entire web and sorting information by entities, search competition was at a high, with Yahoo, AltaVista and Lycos being the main online destinations. But then a breeze of relentless innovation cracked the competition, with Google constantly training algorithms to keep pace with ever-changing search patterns.
Without even sifting through this long article, you can feel for yourself how the search experience has changed. Just search for a product and you'll probably find yourself trawling through multiple paid-product carousels and paid-link ads. Not to mention the various sponsored "buying guides" and Maps widgets showing stores selling the product near your location.
In a nutshell, you’ll easily grow aware that much of the internet today is monetized to death.
I cover Google for a living so I am obviously aware how the results page has evolved over the years. Today, I was searching for “hearing aids” for my dad on my phone and I was stunned by the number of ads, and non-link results. It’s pretty stunning pic.twitter.com/jZZzDWRzdO
— Daisuke Wakabayashi (@daiwaka) March 13, 2022
As a result, I personally support the idea that the state of Google Search nowadays revolves around four main pillars:
1️⃣ Google is and will always be an advertising player
Sometimes we forget that the Mountain View-based company is part of one of the biggest advertising conglomerates in the world, Alphabet. It's not by chance that the search engine has always paid tribute to its advertising heritage, with top ads shown for 63.8% of the 20K e-commerce keywords.
This implies that SEOs and webmasters shouldn't freak out about earning the first spot on the SERP at all costs. That is, in fact, the sacred spot of paid content, so you're not going to snatch it unless you pay.
2️⃣ Maps will populate SERPs as a weapon for competitive advantage
Although we haven’t touched on that in this post, no one would be surprised to hear of the fight between Amazon and Google to earn the biggest e-commerce market share.
One of the few features Amazon is not equipped with by default is the juicy opportunity to customize users' search experiences based on their geographic location. Hence, it comes as no surprise that Google is currently flooding the market by displaying as many maps as possible, especially for e-commerce-related queries.
Google seems to weaponize Maps as a competitive advantage, at least when the search engine is unsure about the gist of a user's search intent.
3️⃣ Image Carousels will invade the SERPs as Search Intent is Unclear
A similar treatment is reserved for Image Carousels, except this apparently has nothing to do with tackling Amazon’s rankings.
Notwithstanding, we are once again all aware of how much Google cares about filling the SERPs with highly visual content. Over the past few years, we have observed the rise of such a trend, which finds its reasons in the need to align the search experience with current social media consumption patterns. Not to mention the apocalyptic rise of TikTok as the chosen one to replace Google as the major search engine of the future.
Despite these apocalyptic scenarios, Google appears to show Image Carousels when the user intent is more fragmented or entirely tied to an inspirational intent.
4️⃣ Quality belongs to EAT
According to the July Quality Raters' Guidelines update, YMYL sites are those with the potential to cause harm to users on topics concerning people's health, financial stability and safety. By these means, Google encouraged its Quality Rater teams to evaluate content depending on the harm it could cause to users.
Then, what is Quality all about in the current state of Google Search?
In truth, I reckon the definition of Low-Quality pages is strictly intertwined with EAT.
If a page is deemed low quality, the content's topic and purpose likely represent the issue. In layman's terms, this means that regardless of whether you run a YMYL website or not, you should push your SEO toward considering the needs of your users.
Conclusion
It's been a mad walk through the history of Google Search, and I must admit that it's not easy to draw a definite conclusion.
What I feel is that Google as a search engine is not dying yet; rather, it's the way people search and browse the Internet that has changed over the years. And this is not going to stop.
Following the outstanding work around AI and machine learning models, Google has fast-tracked the race to semantic search which, in turn, will throttle PageRank's impact and dampen link building strategies.
One final note worth pointing out is the potential threat to Google's leadership as a search engine. TikTok is cracking the social media market and will soon break into the search engine niche. Should it dispose of a more efficient core algorithm, the ByteDance company could win the confrontation with Google hands down.
But once again, that would come as no surprise.
When Goliath has gone unchallenged for more than 20 years, one day a smart David will have his chance to reshape the market.