The indexing and the subsequent merge of the index files are
handled in a single Perl script. So, once the index files
are generated, the merge command is executed via the
This works fine for a smaller set/size of index files.
However, the merge process ends abruptly when trying
to merge larger index files.
Since the merge stopped, I have now executed the
merge command again separately, redirecting the output and
error messages into separate files.
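The post doesn't show the exact command used; for illustration, a manual re-run with the two streams split might look like the sketch below, assuming swish-e's -M merge option and example index paths:

```shell
# Hypothetical re-run of the merge step by hand (paths are examples,
# not taken from the post). stdout carries the progress messages,
# stderr carries errors, so each goes to its own file.
swish-e -M /swishe/new_rose.index /swishe/other.index /swishe/merged.index \
    > merge.out 2> merge.err
```

Keeping stdout and stderr apart makes it obvious how far the merge got (merge.out) before the error (merge.err) appeared.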
Here is the last bit from the output file:
Processing words in index '/swishe/new_rose.index':
Removed 2051 words no longer present in docs for
Writing main index...
Sorting words ...
Sorting 20983634 words alphabetically
Writing header ...
Writing index entries ...
Writing word text: ...
Writing word text: 10%
Writing word text: 20%
Writing word text: 30%
Writing word text: 40%
Writing word text: 50%
Writing word text: 60%
Writing word text: 70%
Writing word text: 80%
Writing word text: 90%
Writing word text: 100%
Writing word text: Complete
Writing word hash: ...err: Ran out of memory (could
not allocate 251803608 more bytes)!
A memory problem .. :)
The server I am running this on has 2 GB RAM and is a
Guess this info should be of some help.
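One thing that may be worth ruling out on a 2 GB box is a per-process
limit rather than the machine actually being out of memory. A quick
check (standard sh/Linux commands, not taken from the post):

```shell
# Check whether the process is constrained by a per-process resource
# limit rather than by physical RAM.
ulimit -v    # max virtual memory per process, in KB ("unlimited" if unset)
ulimit -d    # max data segment size, in KB
free -m      # physical RAM and swap actually in use, in MB
```

A 32-bit swish-e binary would also cap a single process at roughly
2-3 GB of address space regardless of installed RAM, which could
explain a failed ~252 MB allocation late in the merge.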
Received on Fri Apr 29 02:15:42 2005