
Narrowing http spidering

From: <sbower(at)not-real.mcls.rochester.lib.ny.us>
Date: Fri Jan 28 2000 - 19:31:43 GMT
I've recently installed SWISH-E with no problems; I can index my local
site and have used the spidering feature to index other sites.

My only problem is that it seems I can only limit the number of links
the spider follows. What I really need is to narrow the spidering to
specific subdirectories on other servers. Since these are not my
servers, I can't use a robots.txt file. I've searched the archives and
found this topic discussed, but didn't really find a good solution.
The latest message I found on this was dated August 1999, so I'm
hoping that maybe something has changed since then.
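To illustrate the kind of narrowing I mean (this is not SWISH-E code, just a sketch of the per-URL test I'm after, with made-up hosts and paths): follow a link only when it falls under one of a few allowed subdirectories on specific servers.

```python
from urllib.parse import urlparse

# Hypothetical allow-list of (host, path-prefix) pairs -- the
# subdirectories on other servers I want the spider confined to.
ALLOWED = [
    ("www.example.org", "/library/catalog/"),
    ("docs.example.net", "/manuals/"),
]

def should_follow(url):
    """Return True only if the URL lies under an allowed subdirectory."""
    parts = urlparse(url)
    return any(
        parts.netloc == host and parts.path.startswith(prefix)
        for host, prefix in ALLOWED
    )

print(should_follow("http://www.example.org/library/catalog/index.html"))  # True
print(should_follow("http://www.example.org/admin/"))                      # False
```

A spider that applied a test like this to every candidate link before queuing it would stay within the listed subdirectories without needing a robots.txt on the remote servers.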

Is there a simple way that I can do this with the SWISH-E spider?

Thanks.

Shirley Bower
Rochester Regional Library Council
Received on Fri Jan 28 14:37:26 2000