On Fri, Apr 29, 2005 at 02:13:40AM -0700, S C wrote:
> Processing words in index '/swishe/new_rose.index':
> 20,985,685 words
> Writing word hash: ...err: Ran out of memory (could
> not allocate 251,803,608 more bytes)!
I added the commas because I was wondering if you had an integer
overflow. Doesn't seem like it. Still, it's a very large allocation.
> A memory problem .. :)
> The server I am running this on has 2 Gb RAM and is a
> dual processor.
> Guess this info should be of some help.
Well, some. Do you know how to run gdb? Place a breakpoint on
"progerr" and then get a backtrace when the program stops. That will
at least tell us which allocation is failing, and whether an
allocation that large is reasonable at that point. But it's really
hard to debug something like this without the same files and
hardware, so you may be in the best position to figure out the
problem via gdb.
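Roughly, the gdb session I have in mind looks like this (untested
sketch; substitute whatever configuration and arguments your failing
indexing run actually uses, and a binary built with debug symbols
will give a much more useful backtrace):

```
$ gdb swish-e
(gdb) break progerr
(gdb) run -c swish.conf        # hypothetical args: use your real indexing command line
...program runs until the error...
(gdb) backtrace
```

The backtrace at that point should show which caller requested the
251 MB allocation.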
Received on Fri Apr 29 07:11:29 2005