Many thanks for helping with the problem of naming the swishspider Perl
script. The spider now finds the index page of our site and starts
spidering - but for some reason, after finding one more link, it then fails!
All the links on the index page point to static HTML pages on the same
server, all residing in the same directory, without even any subdirectories
involved.
Secondly, just out of interest: can swishspider spider pages created
dynamically by CGI scripts, or is it confined to static HTML pages, like
many spidering programs?
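For the archives, here is a minimal sketch of one possible workaround for the naming problem, assuming a Unix-like server: keep the .pl-suffixed file that the server requires, and give swish-e the unsuffixed "swishspider" name it looks for via a symbolic link. The directory and the stand-in script below are hypothetical; substitute wherever your real swishspider.pl lives.

```shell
# Hedged sketch: make the name swish-e expects ("swishspider") resolve
# to the .pl-suffixed script the web server requires.
cd "$(mktemp -d)"                    # stand-in for the real script directory
printf '#!/usr/bin/perl\n' > swishspider.pl   # stand-in for the real script
chmod +x swishspider.pl

# swish-e invokes "swishspider"; the link forwards to the executable .pl file.
ln -sf swishspider.pl swishspider
ls -l swishspider
```

A symlink avoids patching the swish-e source at all; if your platform lacks symlinks, a one-line wrapper script that execs swishspider.pl would serve the same purpose.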
Andrew Cadman wrote:
> Dear all,
> we have configured swishspider on our system, but when we run an indexing
> test through a makefile routine, a "not found" error is returned. The
> paths given are definitely correct - we know this because, if we change
> the access permissions on the script, an "access denied" error is
> returned instead. The most likely cause of the "not found" error
> therefore seems to be that our server is configured to execute Perl
> scripts only if they have the .pl extension, whereas swish-e
> automatically looks for "swishspider" rather than "swishspider.pl".
> We therefore just need to know where the name "swishspider" is defined
> in the swish program, so that we can add the .pl extension to it.
> Many thanks,
> Andrew Cadman
> NetLondon webmaster
Received on Fri Oct 30 08:25:06 1998