Bots, spiders and crawlers
Web search engines rely on link-following programs called bots, spiders, crawlers or agents, which visit sites at regular intervals and operate automatically (without human intervention, which distinguishes them from human-edited directories) to discover new addresses (URLs). They follow the hyperlinks (links connecting pages to each other) found on each page they reach. Each page is identified, and its content is indexed in a database that users can query over the Internet with keywords.
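The link-following step above can be sketched in a few lines. The snippet below is a minimal illustration, not a production crawler: it only shows how a crawler extracts the hyperlinks from one fetched page and resolves them against the page's URL, using Python's standard library (`html.parser` and `urllib.parse`). The example page and URLs are made up for demonstration.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links like "/about" are resolved to full URLs.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# A real crawler repeats this on every newly discovered page, keeping a
# queue of URLs to visit and a set of URLs it has already indexed.
page = '<p><a href="/about">About</a> <a href="https://example.org/news">News</a></p>'
print(extract_links("https://example.com/", page))
# → ['https://example.com/about', 'https://example.org/news']
```

Everything the crawler then stores about each page (keywords, titles, link structure) feeds the index that answers user queries.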
Search engines are everywhere
Search engines are not limited to the Internet. Some are software installed on personal computers: so-called desktop search engines, which combine searching through files stored on the PC with searching the web. Examples include Exalead Desktop, Google Desktop and Copernic Desktop Search.