When we think of search engines, the first one that comes to mind is Google. But Google was not the first search engine; many others existed before it emerged and became dominant.
Here’s a brief chronology of various prominent developments in the field of search engine evolution:
The history of search engines dates back to 1990, when the first search engine, "Archie", appeared on the scene. It downloaded directory listings of the files located on FTP (File Transfer Protocol) sites and created a searchable database of file names. However, it did not index the contents of those sites; the data was limited enough to be searched manually.
There were numerous search engines after Archie. One of the prominent ones was the first web robot, whose index of the web was called Wandex. Initially, however, it only measured the size of the web rather than indexing its content.
The world’s first search engine as we know it appeared in September 1993: W3Catalog. It did not depend on a crawler and indexer; instead, it drew on existing high-quality, curated lists of websites.
JumpStation, launched in December 1993, was the first WWW resource to combine the three essential features of a web search engine: crawling, indexing, and searching.
April 1994 saw the release of the WebCrawler search engine, which allowed users to search for any word on any web page. This went on to become standard practice for search engines.
In April 1994, the Yahoo! web directory was launched. But Yahoo! did not build its own search technology until years later; until then, it outsourced search to other companies.
In 1998, MSN launched the MSN Search portal using results from Inktomi. Microsoft later developed its own search technology, rolled out in 2005, and rebranded the service as Bing in 2009.
The domain name Google.com was registered in September 1997, and the search engine became available to the public in 1998.
In 2003, Yahoo! started its own web crawler to power Yahoo! Search.
2004–05 saw Microsoft switch MSN Search to its own crawler and indexer.
In 2010, Google Instant was launched, bringing a ‘search-before-you-type’ feature: Google predicts the query as the user types and displays results immediately. It uses the same technology as Google Suggest, Google’s autocomplete.
2011 saw Google, Yahoo!, and Microsoft announce Schema.org, a joint initiative supporting a rich set of tags that enable websites to convey structured information to search engines.
After this, Google kept enhancing the user experience by launching various algorithm updates, which made it the most popular of the search engines.
In the present day and age, Google and other search engines have become smarter than before. Processing and ranking of information are done using machine learning, and search engines can even understand natural human speech.
But this is only a recent development. Navigating the internet was not so simple a few years ago. To find a website, you had to know its exact wording, because search results were mixed with spam. In fact, indexing of new content by a search engine could take weeks!
How do search engines work?
Retrieval of Information
When a user submits a query, search engines return results ranked hierarchically based on trust and relevance signals.
Search engines browse the web in a methodical, automated manner.
Search engines analyze pages according to their titles, headings, and specific fields. Searching an index is far faster than scanning pages at query time.
Google changes its search algorithms 500-600 times annually. Although most of these changes are minor, some are major algorithmic updates that have a significant impact on search results.
For search marketers, knowing the dates of these updates is important, because it helps explain changes in rankings, organic website traffic, and so on. This data helps improve search engine optimization efforts.
Let us see how search engine algorithms have evolved.
Algorithmic updates that have impacted search results:
1. Panda
It was launched on 24 February 2011. It penalizes duplicate or plagiarized content, user-generated spam, and keyword-stuffed content.
Panda assigns a quality score to web pages, based on which they are ranked. To start with, Panda was only a filter; however, since January 2016 it has been part of the core algorithm. Panda rollouts have also become more frequent, speeding up both penalties and recoveries.
How to avoid penalties
- Carry out regular checks of your site for duplicate content, thin content, and keyword stuffing with tools such as SEMrush.
- For an e-commerce site, use original images and create unique product descriptions based on customer reviews.
- Avoid trying to trick search engines.
- Increase social signals by adding social sharing buttons.
- Clean up internal linking, especially anchor-text-rich footer links.
- Avoid cross-linking between sites that are under your control.
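The duplicate-content check in the first tip can be illustrated with a toy script. This is only a sketch of the general idea, not how SEMrush or Google actually detects duplication; the 3-word shingle size and the 0.8 similarity threshold are arbitrary assumptions:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def looks_duplicated(a, b, threshold=0.8):
    """Flag two pages as near-duplicates above an (assumed) similarity threshold."""
    return jaccard_similarity(a, b) >= threshold
```

In practice you would run a comparison like this across pairs of pages extracted from your own site, then rewrite or canonicalize the pages it flags.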
2. Penguin
It was launched on 24th April 2012. It penalizes spammy or irrelevant links and links with over-optimized anchor text. The aim of this algorithm is to down-rank manipulative sites, and since Penguin 4.0 in 2016 it works in real time.
How to avoid penalties
- Run regular audits with a backlink-checking tool such as SEMrush to evaluate the growth of links in your profile. These tools chart your link profile’s growth over time; any unusual spike in the graph indicates unexpected backlinks you may have gained. You can also use Google Search Console to review all backlinks under the “Links to your site” option.
- Avoid unnatural links, such as:
  - Numerous links from irrelevant sources pointing to the website
  - Purchased links
  - Links used randomly, ignoring their context
  - Link schemes
  - Keyword stuffing
  - Excessive SEO (over-optimization)
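The “unusual spikes” mentioned above can also be spotted programmatically. Here is a simple statistical sketch (mean plus two standard deviations, an arbitrary rule of thumb, not any SEO tool’s actual method) over hypothetical monthly counts of new backlinks:

```python
from statistics import mean, stdev

def spike_indices(monthly_new_links, factor=2.0):
    """Return indices of months whose new-backlink count exceeds the mean
    by more than `factor` standard deviations (an assumed threshold)."""
    if len(monthly_new_links) < 2:
        return []
    m, s = mean(monthly_new_links), stdev(monthly_new_links)
    return [i for i, n in enumerate(monthly_new_links) if n > m + factor * s]
```

A flagged month is worth investigating in your backlink tool to see where the unexpected links came from.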
3. Hummingbird
It was launched on August 22, 2013, with the purpose of minimizing low-quality content. It penalizes keyword-stuffed or poor-quality content.
Hummingbird provides results based on the searcher’s complete query rather than on individual words. Although keywords remain important, Hummingbird enables a page to rank for a query even when the page does not contain the exact words the searcher entered. This is possible because of natural language processing that relies on latent semantic indexing, co-occurring terms, synonyms, and so on.
How to avoid penalties
You will need to broaden your keyword research. Your focus should be on concepts rather than keywords: research related searches, synonyms, and co-occurring terms.
You will also need to orient your content around your audience. The content should address the needs, concerns, and queries of the audience; this type of content provides better engagement.
- Long-tail keywords
Long-tail keywords help a page answer the underlying query rather than simply rank for it.
- Conversational content
A conversational tone is preferred, since users increasingly phrase their searches in natural language.
- Mobile optimization
Conversational content that is optimized for mobile performs especially well.
- Proper structured data markup
Structured data markup helps search engines return better results.
- Strong inbound and outbound links
These ensure better authority and credibility.
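To illustrate what structured data markup looks like, here is a sketch that builds a minimal Schema.org Article object as JSON-LD using Python’s standard json module. The field values are hypothetical; real pages embed the resulting JSON inside a `<script type="application/ld+json">` tag:

```python
import json

def article_jsonld(headline, author_name, date_published):
    """Build a minimal Schema.org Article object as a JSON-LD string.
    (Hypothetical example values; extend with your page's real properties.)"""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author_name},
        "datePublished": date_published,
    }
    return json.dumps(data, indent=2)
```

Google’s testing tools can then validate the markup and show which rich results the page is eligible for.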
4. Pigeon
It was launched on 24th July 2014 in the US and on 22nd December 2014 in Canada, the UK, and Australia.
It penalizes sites with poor On-page and Off-page SEO.
Searches in which the location of the user plays a part are most affected by Pigeon. It creates closer ties between the core algorithm and the local algorithm, and local results are ranked based on traditional search engine optimization criteria.
How to avoid penalties
- Use localized names and addresses for better results.
- Maintain consistency in Name, phone number, and address.
- Ensure listing in directories.
- Get good reviews from customers.
5. Mobilegeddon
It was launched on 21st April 2015.
It penalizes pages that lack a mobile version or have poor mobile usability.
Pages are checked by Mobilegeddon, Google’s mobile update, which ranks mobile-optimized pages higher while non-optimized pages are down-ranked or filtered out of mobile SERPs.
How to avoid penalties
- Check the speed and usability of your site. Google’s Mobile-Friendly Test will highlight the grey areas of your page’s mobile version, and tools like SEMrush indicate how responsive your page is on mobile.
- Monitor the Website Traffic.
- Make sure that your website is mobile-friendly; update it if it is not.
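One basic signal of mobile-friendliness is a responsive viewport meta tag in the page’s HTML. This toy check (far simpler than Google’s actual Mobile-Friendly Test, which also evaluates tap targets, font sizes, and more) merely looks for that tag:

```python
import re

def has_viewport_meta(html):
    """Check whether the page declares a viewport meta tag, e.g.
    <meta name="viewport" content="width=device-width, initial-scale=1">."""
    pattern = r'<meta[^>]*name=["\']viewport["\']'
    return re.search(pattern, html, re.IGNORECASE) is not None
```

A page failing even this check is almost certainly not mobile-optimized; passing it is necessary but not sufficient.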
6. RankBrain
It was launched on 26th October 2015.
It penalizes shallow content that lacks query-specific features, as well as poor user experience (UX).
It is said to be the third most important ranking factor. RankBrain is the machine-learning part of Google’s Hummingbird algorithm; it identifies query-specific ranking factors, helping Google understand queries and provide the best-matching search results.
How to avoid penalties
The content should be optimized for relevance and comprehensiveness using competitive analysis. Various online tools help you find the relevant terms and concepts used by your top-ranking competitors, which enables you to enhance the quality of your content and diversify it.
7. Possum
It was launched on September 1, 2016.
It helps businesses cope with intense competition in their target location.
After the Possum update, you are more likely to see businesses whose addresses are closer to your physical location in local search results; in other words, results depend on the location of the searcher. It also gives a boost to businesses located just outside the city limits.
How to avoid penalties
You need to carry out location-specific rank tracking. Possum made local SERPs more volatile, which means local businesses should target more keywords and check rankings from their target location. Tools like SEMrush help with location-specific tracking.
8. Fred
It was launched on 8th March 2017.
It penalizes thin, affiliate-heavy, or ad-centered content.
It targets websites that do not follow Google’s Webmaster Guidelines, penalizing sites whose blogs contain poor-quality content created only for the purpose of generating ad revenue.
How to avoid penalties
Basically, avoid trying to trick Google into thinking your page is about something else when it actually contains mostly affiliate links. Earning ad revenue is completely legitimate as long as you are not deceiving users to get it. Ads should be displayed only on high-quality pages with relevant and ample content.
After Fred, Google released some more quality updates in 2017 and 2018.
9. Maccabees Update
It was launched in December 2017. It comprised several minor changes to the “core algorithm,” with the main objective of enhancing relevancy.
10. Broad Core Update
This update was launched on March 12th, 2018. Its objective is to rank content that provides answers to the varied search queries of users. The algorithm benefits pages that were previously under-rewarded, while Google continues to insist on rewarding quality, relevant content.
The continuous evolution of Google has made it the most dominant search engine. With the various algorithms it has launched, Google has ensured that searchers get query-specific results; it seems to work with real insight into what searchers are looking for. Users no longer have to cope with spammy, irrelevant content, and results are faster and more relevant to their keywords. With these fast-changing algorithmic updates, it becomes important to hire a professional search engine optimization company that is aware of the updates and can adjust your SEO strategy accordingly.