I have used spider.pl to grab over 100,000 documents over a period of 16
hours. At other times, I have run into issues where spider.pl starts
consuming a huge chunk of memory; I was never able to determine why it
happened in some runs and not others. You may want to try spidering the
pages again while keeping an eye on the resources used. Your server might
be killing the process off because it is using too much memory.
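To watch the resources as suggested above, something like the following could be run alongside the spider. This is only a sketch, not from the original post: `watch_mem` is a hypothetical helper name, and the process pattern would need to match however spider.pl appears in your process list.

```shell
# Print the resident memory (RSS, in KB) of the first process whose
# command line matches the given pattern. Run it periodically (e.g. in
# a loop with sleep, or from cron) to see whether spider.pl's memory
# keeps growing before the server kills it.
watch_mem() {
  pattern="$1"
  # pgrep -f matches against the full command line
  pid=$(pgrep -f "$pattern" | head -n 1)
  if [ -n "$pid" ]; then
    # rss= suppresses the header; tr strips ps's leading padding
    ps -o rss= -p "$pid" | tr -d ' '
  fi
}

# Example: log spider.pl's memory every 30 seconds (adjust to taste)
# while true; do watch_mem spider.pl; sleep 30; done
```

If the logged RSS climbs steadily until the process disappears, that would support the theory that the server (or the kernel's out-of-memory handling) is killing it.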
[mailto:firstname.lastname@example.org] On Behalf Of Ander
Sent: Wednesday, January 07, 2004 10:46 AM
To: Multiple recipients of list
Subject: [SWISH-E] Server or document limit on spider.pl
I'm using spider.pl to index a list of servers, which I create dynamically
(from a database). When roughly 2,500 documents have been indexed,
spidering (and indexing, of course) stops.
It looks like spider.pl has a limit and it simply stops. I've looked for
any limit that might apply, but I can't find anything suspicious.
Can anyone help me?
Received on Wed Jan 7 18:14:14 2004