Robots.txt Generator



Default - all robots are:

Crawl-delay:

Sitemap: (leave blank if you don't have one)
Search robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted directories: The path is relative to the root and must contain a trailing slash "/"

Now create a 'robots.txt' file in your root directory. Copy the text above and paste it into the text file.
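For illustration only, a file built with the options above might look like the following sketch; the crawl-delay value, sitemap URL, robot name, and directory paths here are placeholder assumptions, not the output of this specific tool:

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Disallow: /tmp/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml

Note that each restricted directory ends with the trailing slash "/" mentioned above, and that Crawl-delay is a non-standard directive that only some search engines honor.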


About Robots.txt Generator

Robots.txt is a plain-text file used in SEO (Search Engine Optimization). It contains directives that tell search-engine robots which parts of a website may and may not be crawled. robots.txt was created to avoid problems with search engines crawling your site unintentionally. The main purpose of the robots.txt file is to let crawlers know which pages they can and cannot index on your site. Robots.txt acts as a table of contents for your site, helping search engines understand how best to crawl it.
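As a minimal sketch (the path here is hypothetical), a rule set that blocks one directory while leaving the rest of the site open to all crawlers could look like this:

    User-agent: *
    Disallow: /private/
    Allow: /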

The robots.txt file is part of the REP (Robots Exclusion Protocol), which governs how robots crawl a site, index its content, and serve that content to users. The REP also covers directives such as meta robots tags, as well as instructions for how search engines should treat links (such as "follow" or "nofollow") on a page, subdirectory, or site-wide basis.
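For instance, the meta robots and link-level directives mentioned above live in a page's HTML rather than in robots.txt; a hypothetical page might use them like this (the URL is a placeholder):

    <meta name="robots" content="noindex, nofollow">
    <a href="https://www.example.com/untrusted-page" rel="nofollow">Example link</a>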