Robots.txt is a text file that allows a website to pass information and instructions to crawling bots. Google uses such web crawlers to analyze and categorize websites, and the robots.txt file contains technical instructions that these crawlers read and follow.
The robots.txt file is particularly relevant for SEO, because it allows you to exclude specific files and folders from being crawled.
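As a minimal sketch, a robots.txt file placed at the root of a domain might look like the following. The directory names here (`/private/` and `/tmp/`) are hypothetical examples, not paths from any real site:

```
# Rules for Google's main crawler
User-agent: Googlebot
Disallow: /private/

# Rules for all other crawlers
User-agent: *
Disallow: /tmp/
```

Each `User-agent` line names which crawler the following rules apply to, and each `Disallow` line names a path that crawler should not visit; an asterisk matches all crawlers.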