robots.txt

The robots.txt file lets website operators indicate which paths of a site search engine crawlers should or should not fetch. For example, admin pages can be excluded from crawling with the Disallow directive, while exceptions within an excluded path can be granted with the Allow directive. Compliance is voluntary: well-behaved crawlers honor the file, but it is not an access-control mechanism.
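A minimal sketch of such a file, assuming the site keeps its administration interface under /admin/ (the path names are illustrative). The file must be served from the site root, e.g. https://example.com/robots.txt:

    # Applies to all crawlers
    User-agent: *
    # Block the admin area ...
    Disallow: /admin/
    # ... but allow its public subdirectory
    Allow: /admin/public/

Here the Allow rule takes precedence over the Disallow rule because crawlers apply the most specific (longest) matching path.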