At 08:09 PM 05/18/02 -0700, Haapanen, Tom wrote:
>The second one, though ... running on an Alpha/533 with 320 MB, indexed
>about 90,000 files, and memory usage (using -e) was about 30 MB while
>swish-e was reading the files. Memory usage jumped up to about 90 MB once
>the common word removal started.
What OS are you running?
I just tested indexing about 1300 files on an Alpha running Debian Linux 2.2.
Using -e worked. Using IgnoreLimit worked. Using both at the same time
caused a segfault.
I would try without using IgnoreLimit.
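In case it helps, here's a minimal sketch of what I mean (the paths and the IgnoreLimit values below are made up for illustration; use your own config):

```
# swish.conf -- hypothetical example config
IndexDir /path/to/docs        # directory to index (made-up path)
# IgnoreLimit 50 100          # comment this out to rule it out

# then index in economy mode:
#   swish-e -e -c swish.conf
```

If -e alone gets through cleanly, the IgnoreLimit/-e combination is the likely culprit.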
Your memory usage actually seems low, but I don't know what you are
indexing. On my machine I see about 65 MB for 25,000 files. I'd expect to
see more memory used on the Alpha due to the size difference of longs and
pointers. 320 MB of RAM is not that much.
>Still, things looked to be OK, but something got stuck when it got to
>writing index entries. CPU usage stayed maxed out. The files were not
>being updated, and yet the process stayed running. I finally killed it an
When a process is stuck it's helpful to run strace or truss on it to see
what it's doing. It's even more helpful if you can run it under gdb, hit
^C when it hangs, and get a backtrace.
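For example (12345 is a made-up PID; substitute the PID of the stuck swish-e process, and adjust the binary path to match your install):

```
strace -p 12345                      # truss -p 12345 on Solaris/Tru64
gdb /usr/local/bin/swish-e 12345     # attach to the running process
# at the (gdb) prompt: bt to get a backtrace, then detach to let it go
```

If strace shows no system calls at all, the process is spinning in user space and the gdb backtrace is the thing to look at.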
The other thing I'd keep an eye on when indexing that many files without a
huge amount of RAM is whether the machine starts to swap. Indexing will
slow way down if it does, of course.
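A quick way to watch for that while indexing (these are the Linux tools; column names vary by OS):

```
free -m        # swap "used" climbing over time means you're paging
vmstat 5       # watch the si/so columns; nonzero means active swapping
```

If si/so stay nonzero, it's probably worth shrinking the job or leaning on -e rather than letting the machine thrash.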
Received on Sun May 19 03:37:03 2002