Other Software

ID #197

How do I use a robots.txt file?

Applies to: Grid System

The robot guidelines are universal and don't depend on any server configuration.

All you have to do is put a robots.txt file in your DocumentRoot (normally /htdocs/www/ as you see it when you log in with ftp/telnet/ssh). Each of your subdomains has its own DocumentRoot, so you'll need to create a separate robots.txt file in /htdocs/[subdomain]/ for every subdomain whose robots you want to control.
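As a sketch, placing the files might look like this from a shell on the server (the directory layout follows the description above; "blog" is just a hypothetical subdomain, and the Disallow paths are placeholders):

```shell
# Create robots.txt in the main DocumentRoot.
mkdir -p htdocs/www htdocs/blog

cat > htdocs/www/robots.txt <<'EOF'
User-agent: *
Disallow: /private/
EOF

# The subdomain needs its own copy -- robots never look in the
# parent site's DocumentRoot when they visit the subdomain.
cat > htdocs/blog/robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF
```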

When a robot arrives at your site, it requests a robots.txt file from the top level directory (the DocumentRoot) just the same as it requests index.html.
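A well-behaved robot then checks the rules before fetching any page. The check can be sketched with Python's standard urllib.robotparser module (the rules are fed in directly here; a real robot would download them from your site, and the domain, bot name, and paths are only examples):

```python
import urllib.robotparser

# Build a parser and feed it the contents of a robots.txt file.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Pages outside the disallowed path may be fetched...
print(rp.can_fetch("MyBot", "http://www.example.com/index.html"))      # True

# ...but anything under /private/ is off limits.
print(rp.can_fetch("MyBot", "http://www.example.com/private/a.html"))  # False
```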

The robot basically goes (roughly):

    GET /robots.txt HTTP/1.0
    Host: www.yourdomain.com
This file is the first thing a robot asks for when it arrives, and if the file exists, the robot is supposed to read and obey the directives in that file.
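For example, a robots.txt file with typical directives might look like this (the directory names and the "BadBot" user-agent are only placeholders):

```
# Keep all robots out of the cgi-bin and tmp directories.
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

# Block one specific robot from the entire site.
User-agent: BadBot
Disallow: /
```

An empty `Disallow:` line (or no robots.txt file at all) means the whole site may be crawled.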

More information about how to direct robots with directives in the robots.txt file is here:


Last update: 2010-10-03 16:26
Author: FAQ Admin
Revision: 1.2
