At 02:33 AM 06/04/02 -0700, Cristiano Corsani wrote:
>I'm finally indexing my whole DB (1,200,000 records) by means of a Java
>filter that converts record fields to XML2 for swish. Due to a buffering
>problem I work in steps of 10,000 records at a time, garbage-collecting
>memory after each step. I verified that my filter works by generating all
>1,200,000 XML2 "virtual files" correctly. But when I use the filter with
>swish, I see that some memory is lost at every step. This is a problem for
>indexing performance ... do you have any idea of possible memory leaks in swish?
Can you explain a bit more about the process? You say you work in chunks of
records: are you using -S prog, building a collection of docs in memory, and
then passing those to swish?
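For reference, a "-S prog" external program writes each document to stdout as a small header block (Path-Name and Content-Length, at minimum), a blank line, and then the raw document bytes. The following is a minimal Python sketch of that shape; fetch_batch() is a hypothetical stand-in for the database reads done by the original Java filter, not anything provided by swish-e:

```python
import sys

def format_doc(path, content):
    """Render one document in the swish-e -S prog format:
    headers, a blank line, then the raw document bytes."""
    data = content.encode("utf-8")
    header = f"Path-Name: {path}\nContent-Length: {len(data)}\n\n"
    return header.encode("utf-8") + data

# Hypothetical stand-in for the poster's database reads; adjust to taste.
def fetch_batch(offset, size):
    return [(f"db/record-{offset + i}.xml",
             f"<doc><id>{offset + i}</id></doc>") for i in range(size)]

# Tiny demo: 2 batches of 3 docs (1,200,000 records in steps of 10,000
# in the real run). swish-e would read this stream via -S prog.
out = sys.stdout.buffer
for offset in range(0, 6, 3):
    for path, xml in fetch_batch(offset, 3):
        out.write(format_doc(path, xml))
```

Streaming documents one at a time like this avoids holding a whole chunk in the filter's memory; only swish itself decides what to keep.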
The short answer is that if you look at mem.h you will see:
/* MEM_TRACE checks for unfreed memory, and where it is allocated */
#define MEM_TRACE 0
Set that to 1 and rebuild swish. Any memory still unfreed at the end of
indexing will then be reported, along with where it was allocated.
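A sketch of that edit-and-rebuild step, assuming a make-based source tree (demonstrated here on a scratch copy of the relevant lines rather than the real mem.h):

```shell
# Make a scratch copy of the lines we want to flip.
cat > mem_h_demo.h <<'EOF'
/* MEM_TRACE checks for unfreed memory, and where it is allocated */
#define MEM_TRACE 0
EOF

# Flip the flag from 0 to 1 (GNU sed in-place edit).
sed -i 's/#define MEM_TRACE 0/#define MEM_TRACE 1/' mem_h_demo.h
grep MEM_TRACE mem_h_demo.h

# In the real tree: apply the same one-line change to mem.h,
# then rebuild, e.g.:  make clean && make
```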
There are a few things that are not cleaned up after parsing a config file,
but as far as I've seen those should not grow with indexing. I haven't been
able to test every combination of config settings, though.
Of course, if you see a leak, please post a small example and I'll track it down.
Received on Tue Jun 4 14:34:52 2002