HOW TO GENERATE A ROBOTS.TXT FILE
Learn how robots.txt syntax works
Working with a robots.txt file is straightforward. A robots.txt file uses two keywords: User-agent and Disallow. A user-agent is a search engine robot; most user-agents are listed in the Web Robots Database. Disallow is a command that tells the user-agent not to access a specific URL. Conversely, to give Google access to a specific URL that sits in a child directory of a disallowed parent directory, you can use a third keyword, Allow.
Google uses many user-agents, such as Googlebot for Google Search and Googlebot-Image for Google Image Search. Most Google user-agents follow the rules you set up for Googlebot, but you can override this and write specific rules for certain Google user-agents as well.
The syntax for using these keywords is as follows:
User-agent: [the name of the robot the following rule applies to]
Disallow: [the URL path you want to block]
Allow: [the URL path of a subdirectory, within a blocked parent directory, that you want to allow]
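Putting these keywords together, a robots.txt file that blocks a private directory for all robots, re-opens one page inside it, and blocks Google Image Search from an images directory might look like this (the directory and file names here are only placeholders):

```
User-agent: Googlebot
Allow: /private/public-page.html
Disallow: /private/

User-agent: Googlebot-Image
Disallow: /images/

User-agent: *
Disallow: /private/
```

Each User-agent line starts a new group of rules, and a blank line separates one group from the next.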
Here is how to generate a robots.txt file for a website or a Blogger blog:
1. Open the Google search box and search for a robots.txt generator tool.
2. On the same page, find the create robots.txt option and click on it.
3. Below it, you will get the robots.txt file for your website.
This is how we can generate the robots.txt file.
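Once the file is generated, you can sanity-check its rules before uploading it. Here is a minimal sketch using Python's standard urllib.robotparser module; the file contents and paths below are made-up examples, not output from any particular generator:

```python
from urllib import robotparser

# A hypothetical robots.txt: Googlebot may not crawl /private/,
# except for one explicitly allowed page; other robots may crawl anything.
ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /private/public-page.html
Disallow: /private/

User-agent: *
Disallow:
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Disallow blocks the whole directory for Googlebot...
print(parser.can_fetch("Googlebot", "/private/secret.html"))       # False
# ...but Allow re-opens one page inside it.
print(parser.can_fetch("Googlebot", "/private/public-page.html"))  # True
# Other user-agents fall under the * group, which allows everything.
print(parser.can_fetch("SomeOtherBot", "/private/secret.html"))    # True
```

Note that Python's parser applies the first rule that matches a path, which is why Allow appears before Disallow in the example; Google's own matcher instead uses the most specific (longest) matching rule.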