robots.txt is a text file that follows the Robots Exclusion Standard and contains rules specifying which files on a website a search engine crawler may access. In other words, a robots.txt file lets you control which parts of your website crawlers can visit.
The robots.txt file lives at the root of your site. For instance, www.example.com's robots.txt file is located at www.example.com/robots.txt.
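For example, a minimal robots.txt file might look like the following. This is only an illustrative sketch; the /admin/ path is a placeholder, not a required value:

    User-agent: *
    Disallow: /admin/

Here "User-agent: *" means the rule applies to all crawlers, and "Disallow: /admin/" tells them not to access anything under that directory.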
A Robots.txt Generator is a tool that allows you to easily create a robots.txt file based on the rules you specify for your website.
The SEOCoreTools Robots.txt Generator helps you generate a robots.txt file in just a few seconds.
How to use it?
> In the first menu, all robots are allowed by default, but you can set each one to Allow or Refuse in the given list.
> Next, "Crawl-delay" is set to "No delay" by default, but you can choose a delay from the available options if you want.
> In the third field, you can add your XML sitemap file. If you have one, simply enter its location in this field. (You can also generate an XML sitemap here for free.)
> After that, you can block certain pages or directories from being crawled by search engines. This is usually done for pages that don't provide useful information to search engines or users, such as admin, login, and parameter pages (see the sample file after this list).
> Finally, click the "Create and Save as Robots.txt" button to download your robots.txt file.
> Upload the downloaded file to the root directory of your domain.
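Put together, a file generated with these settings might look something like the example below. The crawl-delay value, the disallowed paths, and the sitemap URL are illustrative placeholders; your own values will differ:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/
    Disallow: /login/
    Sitemap: https://www.example.com/sitemap.xml

Once uploaded, you can confirm the file is live by opening www.yourdomain.com/robots.txt in a browser.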