In this tutorial, I help viewers understand how search engines work.
Search engines crawl and index webpages using sophisticated software programs called bots. These bots visit websites, parse each page's content one by one, follow internal links, and also note where external links lead. The crawl runs on huge clusters of computers and is steered by a complex set of rules known as an algorithm.
Search engines don't crawl password-protected content.
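The link-following step above can be sketched in a few lines. This is a minimal illustration, not how any real search engine is built: it parses a hardcoded page (a real bot would fetch it over HTTP) and collects the links a crawler would visit next. The URLs are hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, the way a crawler discovers new pages."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page body; a real bot would download this over HTTP.
page = ('<p>Read the <a href="/about">about page</a> '
        'or <a href="https://example.org/">an external site</a>.</p>')
parser = LinkExtractor("https://example.com/index.html")
parser.feed(page)
print(parser.links)
```

Here the internal link `/about` is resolved against the page's own address, while the external link is kept as-is; a crawler would queue both kinds.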
After crawling websites, search engines interpret the content and store that understanding in their databases; this step is called indexing.
Remember, search engines can only understand text. Visuals that have no text description are stored but not understood: they sit in the search engine's memory and can only be matched against a similar visual, not found through a text search.
Like crawling, indexing is also a complex process that requires a huge database hosted on thousands of computers.
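The core idea behind indexing can be shown with a tiny "inverted index": a map from each word to the pages that contain it. This is a toy sketch with made-up URLs, assuming the page text has already been extracted by the crawl; real indexes are vastly larger and more sophisticated.

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page URLs containing it (an inverted index)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# Hypothetical crawled pages: URL -> extracted text.
pages = {
    "https://example.com/a": "search engines crawl pages",
    "https://example.com/b": "engines index text",
}
index = build_index(pages)
print(sorted(index["engines"]))
```

Answering a query then amounts to looking the words up in this map, which is why text the engine never saw (such as an undescribed image) can never come back as a result.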