Robots.txt Generator


Default - All Robots are: (Allowed / Refused)

Crawl-Delay: (optional; seconds a crawler should wait between requests)

Sitemap: (leave blank if you don't have one)

Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch

Restricted Directories: Each path is relative to the root and must end with a trailing slash "/".



Now create a 'robots.txt' file in your root directory, then copy the generated text above and paste it into that file.
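A generated file might look like the sketch below. The directories, delay, and sitemap URL are placeholders for illustration, not output of this specific tool:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/

User-agent: Googlebot-Image
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Rules under "User-agent: *" apply to all robots by default, while a named User-agent group overrides them for that specific crawler.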


About the Robots.txt Generator

Robots.txt is a file uploaded to a website's root directory to ensure that pages are crawled by spiders and crawlers according to your needs. It is an essential part of developing a website, because it lets you direct whether each page should be crawled or not, for example, allowing Google's crawler to fetch a page and send it for indexing. This Robots.txt generator helps you create a correct file for your website, producing an uploadable robots.txt that instructs search engine robots on how to crawl and index your site and its pages.
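To see how a crawler actually interprets these directives, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and URLs are hypothetical examples, not output of this tool:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules resembling what a generator might produce.
rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Pages outside the disallowed directories may be fetched;
# anything under /private/ is refused for all robots.
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False
print(parser.crawl_delay("*"))                                         # 10
```

A well-behaved crawler performs exactly this check before requesting a page, which is why a correctly formed robots.txt is enough to steer indexing without any server-side blocking.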