#
# Robots file
#
# $Id$
#
# User-agent
#
# The value of this field is the name of the robot the record is
# describing access policy for.
#
# If more than one User-agent field is present, the record describes an
# identical access policy for more than one robot. At least one field
# needs to be present per record.
#
# The robot should be liberal in interpreting this field. A
# case-insensitive substring match of the name without version
# information is recommended.
#
# If the value is '*', the record describes the default access policy
# for any robot that has not matched any of the other records. It is
# not allowed to have two such records in the "/robots.txt" file.
#
# Disallow
#
# The value of this field specifies a partial URL that is not to be
# visited. This can be a full path or a partial path; any URL that
# starts with this value will not be retrieved. For example,
# "Disallow: /help" disallows both /help.html and /help/index.html,
# whereas "Disallow: /help/" would disallow /help/index.html but allow
# /help.html.
#
# An empty value indicates that all URLs can be retrieved. At least
# one Disallow field needs to be present in a record.
#
# The presence of an empty "/robots.txt" file has no explicit associated
# semantics; it will be treated as if it were not present, i.e. all
# robots will consider themselves welcome.
#
User-agent: *        # all agents
Disallow: /forms_doc.zip
Disallow: /forms_pdf.zip
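#
# Illustrative example only (not part of this site's policy): the robot
# names below are hypothetical, and the record is commented out so it
# has no effect. It sketches how a record with several User-agent
# fields applies one access policy to all of the named robots.
#
#   User-agent: ExampleBot
#   User-agent: OtherExampleBot
#   Disallow: /private/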