
RE: Server or document limit on

From: Aaron Bazar <aaronb(at)>
Date: Wed Jan 07 2004 - 18:14:06 GMT

I have used to grab over 100,000 documents over a period of 16
hours. At other times, I have run into issues where starts
using up a huge chunk of memory. I was never able to determine why it
happened sometimes and not others. You may want to try spidering the
pages again and keep an eye on the resources used. Your server might be
killing the process off because it is using too much memory.
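The suggestion above (watch the spider's memory while it runs) can be sketched as a small shell loop. This is a hypothetical helper, not part of swish-e; `spider.pl` as the process you launch is an assumption, and the 60-second interval is arbitrary:

```shell
#!/bin/sh
# Hedged sketch: sample a process's resident memory (RSS) until it exits.
# $1 = PID of the spider process, $2 = sampling interval in seconds.
watch_mem() {
    pid=$1
    interval=${2:-60}
    # kill -0 only tests that the process still exists; it sends no signal.
    while kill -0 "$pid" 2>/dev/null; do
        # RSS in kilobytes, one number per sample (ps -o rss=, no header).
        ps -o rss= -p "$pid"
        sleep "$interval"
    done
}
```

A possible invocation, assuming the crawl is started from the same shell: `./spider.pl default http://example.com/ & watch_mem $! >> mem.log`. A steadily growing column of numbers in `mem.log` right before the spider dies would support the out-of-memory theory.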

Best regards,

Aaron Bazar

-----Original Message-----
[]On Behalf Of Ander
Sent: Wednesday, January 07, 2004 10:46 AM
To: Multiple recipients of list
Subject: [SWISH-E] Server or document limit on

Hi all:

I'm using to index a list of servers, which I create dynamically
(from a database). When we have roughly 2500 documents indexed,
spidering (and indexing, of course) stops.

It looks like there is a limit and it stops there. I've been looking for
any limitation that could cause this, but I can't find anything suspicious.
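One place such a cutoff can hide is in the operating system's per-process resource limits rather than in swish-e itself. A quick, hedged check (standard shell builtins, nothing swish-e-specific) is to inspect the limits of the account running the spider:

```shell
#!/bin/sh
# Hedged sketch: show the per-process resource limits in effect for
# the current user. Run this as the same account that runs the crawl.
ulimit -a          # all limits: memory, open files, CPU time, etc.
ulimit -v          # virtual memory limit in KB ("unlimited" if none)
ulimit -n          # max open file descriptors
```

If `ulimit -v` or `ulimit -n` reports a finite value, the spider may be hitting it around the 2500-document mark; the server's error logs would typically show the kill.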

Can anyone help me?
Received on Wed Jan 7 18:14:14 2004