

Disallow doesn't work

From: Suzanne Hallam <hallam(at)not-real.lh-systems.com>
Date: Thu Feb 01 2001 - 18:40:12 GMT
I tried to disallow the swishspider in the robots.txt file (sitting under
htdocs), but it didn't work.
 
I used 
 
User-agent: swishspider
Disallow: /some/directory/subdirectory
 
Can you explain what factors might be affecting this? 
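[A quick way to sanity-check rules like these, independent of swish-e itself, is Python's standard urllib.robotparser, which applies the same prefix-matching logic most crawlers use. The hostname and page name below are just placeholders:]

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the robots.txt above
rules = """
User-agent: swishspider
Disallow: /some/directory/subdirectory
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The named agent should be blocked from the subdirectory...
print(rp.can_fetch("swishspider",
                   "http://example.com/some/directory/subdirectory/page.html"))  # → False
# ...while any other agent is unaffected.
print(rp.can_fetch("otherbot",
                   "http://example.com/some/directory/subdirectory/page.html"))  # → True
```

[If the parser says the path is blocked but the spider still fetches it, the problem is likely that the spider is not reading robots.txt at all (or is announcing a different User-agent string), rather than the rule syntax.]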
 
I had swish-e configured so that it would only crawl this one main
directory, which is a password-protected (.htaccess) directory. The
structure for swish-e to index is htdocs/some_protected_directory/, and it
does index this directory. Then I decided I wanted to keep it out of a
subdirectory there, so I added the Disallow to robots.txt. Since this did
not work, I am now questioning whether robots.txt is protecting any of the
directories at all, and moreover I am trying to figure out how to keep the
contents of the index from displaying until the requester has first entered
a password.
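[On the last point, one common approach is to put the search CGI itself inside an .htaccess-protected directory, so the index results can only be reached after Basic authentication. A sketch, with hypothetical paths:]

```apache
# .htaccess in the directory holding the swish-e search CGI
AuthType Basic
AuthName "Protected search"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

[Note that robots.txt only asks well-behaved crawlers to stay away; it does not restrict browsers, so server-side authentication like this is what actually keeps the results private.]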

Suzanne Hallam
Marketing Communications Coordinator and Webmaster
LH Systems, LLC 
10965 Via Frontera 
San Diego, CA 92127 
TEL 858-675-3335 extension 136 
FAX 858-675-3345
hallam@lh-systems.com 

 
Received on Thu Feb 1 18:44:15 2001