What Is Robots.txt?

Robots.txt is a plain-text file, created by a webmaster and placed at the root of a site, that tells search engine crawlers which pages or files they can and can't request from the site.
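
As an illustration, a minimal robots.txt might look like the sketch below; the domain, bot rules, and paths are placeholders, not recommendations for any particular site:

    # Applies to all crawlers
    User-agent: *
    # Keep crawlers out of a (hypothetical) admin section
    Disallow: /admin/
    # But allow one public subfolder within it
    Allow: /admin/public/
    # Point crawlers at the sitemap (assumed location)
    Sitemap: https://www.example.com/sitemap.xml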

The file is typically used to keep crawlers from overloading a site with requests, and it is part of the Robots Exclusion Protocol (REP), the group of web standards that regulates how robots:

  • Crawl the web
  • Gain access to content
  • Index content
  • Provide content to users
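
Well-behaved crawlers fetch and honor this file before requesting any other page. As a sketch of that check, Python's standard-library urllib.robotparser can parse a site's robots.txt and report whether a given user agent may fetch a URL; the domain and bot name below are assumed for illustration:

    from urllib.robotparser import RobotFileParser

    # Point the parser at the (hypothetical) site's robots.txt,
    # which by convention lives at the root of the domain.
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # download and parse the file

    # Ask whether a crawler named "ExampleBot" may request each URL.
    print(rp.can_fetch("ExampleBot", "https://www.example.com/admin/"))
    print(rp.can_fetch("ExampleBot", "https://www.example.com/index.html"))

Note that robots.txt is advisory: it regulates cooperative crawlers, but it is not an access-control mechanism and does not stop a client that ignores it.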