What is robots.txt?
Website owners use the /robots.txt file to give instructions about their site to web robots; this convention is called the Robots Exclusion Protocol.
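As a minimal sketch, a robots.txt file placed at the root of a site (the domain and paths here are hypothetical) might look like this:

```
# Applies to all robots
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/

# A specific robot can be given its own rules
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line names which robot the following rules apply to, and each `Disallow` line names a URL path prefix that robot is asked not to visit.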
In practice, robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website.
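To see how a crawler might interpret these rules, here is a short sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt content, parsed inline instead of fetched from a site
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks can_fetch() before requesting a URL
print(rp.can_fetch("MyBot", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("MyBot", "https://example.com/private/data.html"))  # False
```

In real use, a crawler would call `rp.set_url(...)` and `rp.read()` to download the site's actual /robots.txt before checking URLs.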