Robots.txt Generator


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must end with a trailing slash "/"
Now, create a 'robots.txt' file in your site's root directory, then copy the text above and paste it into that file.


About Robots.txt Generator

Set up a custom robots.txt file for Blogger and any other website you run. Use our free online SEO tools to generate a custom robots.txt file, free of charge!

Robots.txt is a file that tells search engines how to crawl a website. It is also known as the robots exclusion protocol, and sites use this standard to tell bots which parts of the website should be indexed. You can also specify which areas you do not want these crawlers to process; such areas may contain duplicate content or still be under development.
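To see how a compliant crawler interprets these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The rules and the `example.com` URLs are hypothetical placeholders:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks /admin/ for every crawler
rules = """\
User-agent: *
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved bot checks each URL against the rules before fetching it
blocked = parser.can_fetch("Googlebot", "https://example.com/admin/login")  # False
allowed = parser.can_fetch("Googlebot", "https://example.com/blog/post")    # True
print(blocked, allowed)
```

Search engine crawlers perform the same check against your site's /robots.txt before requesting a page, which is why one wrong Disallow line can keep a whole section out of the index.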

Bots such as malware scanners and email harvesters do not follow this standard. They will probe for weaknesses in your defenses, and there is a considerable chance that they will start examining your site from exactly the areas you do not want indexed. 

A complete robots.txt file starts with a "User-agent" line, and below it you can write directives such as "Allow", "Disallow", "Crawl-delay", and so on. Written by hand this can take some time, and you can enter many lines of directives in a single file. If you want to exclude a page, write "Disallow:" followed by the path you don't want bots to visit; the same format applies to the "Allow" directive.
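Putting those directives together, a generated file might look like the following sketch. The paths and sitemap URL are placeholders; substitute your own:

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /tmp/
Allow: /blog/

Sitemap: https://example.com/sitemap.xml
```

Each "User-agent" block applies to the named crawler ("*" means all of them), and the optional "Sitemap" line points crawlers to your sitemap.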

If you think that is all there is to a robots.txt file, be careful: one wrong line can exclude your page from the indexing queue. So it is better to leave the task to the pros; let our robots.txt generator handle the file for you.