
RE: Problem with dev-25?

From: Haapanen, Tom <tomh(at)not-real.metrics.com>
Date: Sun May 19 2002 - 14:38:45 GMT
Bill,

Things seem to be working fine on Alpha for the most part, using NetBSD
1.4.1.

The indexed (text) files are generally not large, and I expect that there
are relatively fewer unique words.  (I'm indexing all the news articles on
our web site -- but in their pre-HTML form.)

As for memory, I'm planning to bump it up as soon as I can find a good time
to bring the server down, but the motherboard is unfortunately limited to
384 MB, so I can only add 64 MB.  But even with the current config, swap
usage was minimal while swish was running -- there's no X (nor any other
big apps besides Apache) on this machine.

Your other message said that you were able to reproduce the problem on your
Athlon, which is great news.  Let me know if you need me to gdb it to find
out what's going on on the Alpha.

But I'll give it another try with IgnoreLimit removed ... I don't really
need that, just left it in from the sample.
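(That just means commenting it out of the config -- the directive and
values below are from the sample config, nothing site-specific:)

    # in the swish-e config file:
    #IgnoreLimit 50 100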

Tom Haapanen
tomh@motorsport.com


-----Original Message-----
From: Bill Moseley [mailto:moseley@hank.org]
Sent: Saturday 18 May 2002 23:37
To: Multiple recipients of list
Subject: [SWISH-E] Re: Problem with dev-25?


At 08:09 PM 05/18/02 -0700, Haapanen, Tom wrote:

Hi Tom,

>The second one, though ... running on an Alpha/533 with 320 MB, indexed
>about 90,000 files, and memory usage (using -e) was about 30 MB while
>swish-e was reading the files.  Memory usage jumped up to about 90 MB once
>the common word removal started.

What OS are you running?  

I just tested indexing about 1300 files on Alpha running Debian Linux 2.2.

Using -e worked.  Using IgnoreLimit worked.  Using both at the same time
caused a segfault.

I would try without using IgnoreLimit.
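To spell out the three cases (the config file name here is just a
placeholder):

    # with IgnoreLimit commented out of swish.conf:
    swish-e -c swish.conf -e     # economy mode alone -- OK
    # with IgnoreLimit enabled in swish.conf:
    swish-e -c swish.conf        # IgnoreLimit alone -- OK
    swish-e -c swish.conf -e     # both together -- segfault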

Your memory usage seems low, actually, but I don't know what you are
indexing.  On my machine I see about 65MB for 25,000 files.  I'd expect to
see more memory used on the Alpha due to the size difference of longs and
pointers.  320M of RAM is not that much.
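(Rough arithmetic: a quick, generic C check -- nothing swish-specific --
shows why.  On the Alpha both print 8, vs. 4 on ix86, so pointer-heavy
in-memory structures roughly double in size:)

    /* print the sizes of long and pointer types; 8 bytes each on a
       64-bit Alpha, vs. 4 bytes on a 32-bit ix86 box */
    #include <stdio.h>

    int main(void)
    {
        printf("sizeof(long)   = %lu\n", (unsigned long)sizeof(long));
        printf("sizeof(void *) = %lu\n", (unsigned long)sizeof(void *));
        return 0;
    }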

>Still, things looked to be OK, but something got stuck when it got to
>writing index entries.  CPU usage stayed maxed out.  The files were not
>being updated, and yet the process stayed running.  I finally killed it an
>hour later.

When a process is stuck it's helpful to run strace or truss on the process
to see what it's doing.  And even more helpful if you can run it under gdb
and ^C when it hangs and get a backtrace.
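For example (the PID and config name are placeholders):

    strace -p 12345              # Linux: show the syscalls it's making
    truss -p 12345               # the Solaris equivalent

    $ gdb swish-e
    (gdb) run -e -c swish.conf   # reproduce the indexing run
      ...wait for the hang, then hit ^C...
    (gdb) bt                     # backtrace of where it's stuck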

The other thing I'd keep an eye on when indexing that many files without
much RAM is whether the machine starts to swap.  Indexing will slow way
down if it's swapping, of course.
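(e.g. leave vmstat running in another window while indexing:)

    vmstat 5     # on Linux, watch the si/so (swap in/out) columns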


-- 
Bill Moseley
mailto:moseley@hank.org
Received on Sun May 19 14:38:51 2002