
Re: Exact same 'Ran out of memory' error on 2 different

From: José Manuel Ruiz <jmruiz(at)>
Date: Thu Apr 29 2004 - 08:13:11 GMT

It's difficult to say which is the reason without knowing the exact point 
in the program.

Trying to guess... the error happens when worddata (worddata contains all 
the positions, files, etc.) is being written. So, perhaps, it is 
allocating memory to build the buffer containing all the data for a very 
common word (e.g. "a", "the") in a very big collection of files.


Bill Moseley wrote:

>On Wed, Apr 28, 2004 at 02:03:42PM +0100, William Bailey wrote:
>>Writing index entries ...
>>~  Writing word text: Complete
>>~  Writing word hash: Complete
>>~  Writing word data:   9%err: Ran out of memory (could not allocate
>>30315957 more bytes)!
>Ya, try a newer version.  Seems like Jose fixed a signed integer
>overflow, although I'm not sure if it's related. 
>Might as well run the indexer under gdb and try to get a backtrace.
>Set a breakpoint on "progerr" -- that's called to display that error
>message -- and then start indexing.
>How much are you trying to index?  Are there very large files or is it
>just a lot of small files?
>[Jose, can you imagine any reason why it would be trying to malloc
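The gdb workflow Bill suggests might look like the session below. The binary path and config file name are assumptions; substitute your own:

```
$ gdb /usr/local/bin/swish-e     # path to the indexer binary is an assumption
(gdb) break progerr              # stop where the error message is printed
(gdb) run -c swish.conf          # start indexing with your usual config
...                              # index until the breakpoint hits
(gdb) backtrace                  # show the call chain that led to the failed malloc
```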
Received on Thu Apr 29 01:13:12 2004