How do I use a robots.txt file?
Applies to: Grid System
The robots.txt convention is universal and doesn't depend on any server configuration.
All you have to do is put a robots.txt file in your DocumentRoot (normally /htdocs/www/, as you see it when you log in with ftp/telnet/ssh). Each of your subdomains has its own DocumentRoot, so if you want to control robots there as well, you'll need to create a separate robots.txt file in each /htdocs/[subdomain]/ directory.
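For example, a minimal robots.txt might look like the following. The directory name /private/ is just a placeholder; substitute whatever paths you want to keep crawlers out of:

```
# Applies to all robots
User-agent: *
# Keep robots out of this directory
Disallow: /private/
```

An empty `Disallow:` line (or no robots.txt at all) means robots may crawl everything.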
When a robot arrives at your site, it requests a robots.txt file from the top level directory (the DocumentRoot) just the same as it requests index.html.
This file is the first thing a well-behaved robot asks for when it arrives. If the file exists, the robot is supposed to read and obey the directives in it; if it doesn't exist, the robot assumes the whole site may be crawled.
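You can see how a robot interprets these directives using Python's standard-library `urllib.robotparser`, which implements the same rules crawlers are expected to follow. This sketch parses the example rules inline rather than fetching them over HTTP; the paths are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse the directives a robot would receive from /robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant robot checks each URL against the rules before fetching it
print(rp.can_fetch("*", "/private/page.html"))  # blocked by the Disallow rule
print(rp.can_fetch("*", "/public/page.html"))   # allowed
```

In a real crawler you would call `rp.set_url("http://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing a literal list.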
More information about the directives you can use in a robots.txt file is available here:
Last update: 2010-10-03 16:26
Author: FAQ Admin