On Mon, Oct 31, 2005 at 06:34:59AM -0800, J Robinson wrote:
> Any tips on how I can debug this? Is there a debug
> flag for spider.pl that shows robots.txt being parsed
> and/or urls being matched against it, or anything like
Set the debug flag to "skipped" and the spider will tell you when a file
is skipped due to robots.txt.
Then run the spider against just one of the URLs they say is being skipped.
When I've debugged this in the past, I found that the robots.txt file was
not set up correctly.
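One quick way to check whether the robots.txt itself is the problem is to run the suspect URLs through an independent parser and compare the result with what the spider reports. A minimal sketch using Python's stdlib parser (the robots.txt content and URLs below are hypothetical placeholders; substitute the real file and the URLs the spider says it is skipping):

```python
from urllib import robotparser

# Hypothetical robots.txt content -- paste in the real file from the site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check the URLs the spider reports as skipped, using the spider's agent name.
print(rp.can_fetch("swish-e spider", "http://example.com/private/page.html"))
print(rp.can_fetch("swish-e spider", "http://example.com/public/page.html"))
```

If this parser disallows the same URLs the spider skips, the robots.txt rules really do match those paths and the fix belongs in robots.txt, not in the spider configuration.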
Received on Mon Oct 31 06:43:59 2005