
Re: fhash.c bug?

From: Bill Moseley <moseley(at)>
Date: Wed Apr 27 2005 - 21:52:38 GMT
I think Jose has this fixed now.

Here's how I was reproducing the bug; with the fix applied it no longer segfaults:

    ~/swish-e/configure --enable-incremental --enable-psortarray --disable-docs --prefix=`pwd` CFLAGS='-O0 -g' > /dev/null && make install > /dev/null
    configure: WARNING: ** Buidling with developer only incremental indexing code **
    configure: WARNING: ** And using ARRAY presorted tables **

Create the initial index ("man" contains documents supplied by Peter):

    segfault$ bin/swish-e -i man/*.html -W0 -v0

Now index in a loop using the -u flag:

    while touch man/*.html && bin/swish-e -i man/*.html -W0 -v0 -u; do echo "Made pass"; done
    Made pass
    Made pass
    Segmentation fault

This now no longer segfaults.
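For anyone else chasing this, the touch-and-reindex loop above can be wrapped in a small script that also reports which pass crashed. This is just a sketch: the binary path and document glob are placeholders for your own build tree, not necessarily the exact paths used in the commands above.

```shell
#!/bin/sh
# Sketch of the repro loop as a reusable function. The swish-e binary
# path and document glob passed in are assumptions -- substitute the
# locations from your own build.
run_until_crash() {
    swish="$1"
    docs="$2"
    pass=0
    while :; do
        pass=`expr $pass + 1`
        # touch the docs so -u has something to re-examine
        touch $docs || return 1
        # -u does an incremental (update) index; a crash here is the bug
        if ! $swish -i $docs -W0 -v0 -u; then
            echo "crashed on pass $pass"
            return 1
        fi
        echo "Made pass $pass"
    done
}

# Example invocation against a local build (hypothetical paths):
# run_until_crash bin/swish-e 'man/*.html'
```

The glob is passed quoted and expanded inside the function, so the loop re-touches every document before each incremental pass, the same way the one-liner above does.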

Dobrica, does this solve the problem you have been having?

On Mon, Mar 28, 2005 at 08:45:30AM -0800, Dobrica Pavlinusic wrote:
> On Mon, Mar 28, 2005 at 07:02:24AM -0800, Peter Karman wrote:
> > > What is your smallest fileset on which you can demonstrate problem?
> > 
> > I can name that tune in 12 files. I have put a tar at 
> >
> > 
> > can you duplicate the error with that set? for me it consistently hangs on 
> > 'acl_size.3c.html'.
> Strange. It works just fine for me. I used the latest CVS version, compiled
> with:
> $ ./configure --disable-docs --with-pcre --enable-incremental --disable-shared
> I indexed it using:
> $ swish-e -S fs -f 1/test -i karman/
> And it worked for me. Then I tried
> $ swish-e -S fs -f 1/test -i karman/ -u
> which also worked. However, then I remembered that I had problems with
> repeated indexing of the same data, so I wrote the following script (inlined
> so that the list doesn't strip it):
> #!/bin/sh
> ../swish-e -S fs -i karman -f 1/bug 2>/dev/null
> nr=0
> while true ; do
> 	nr=`expr $nr + 1`
> 	echo "LOOP: $nr"
> 	touch karman/* && ../swish-e -S fs -i karman -f 1/bug -u 2>/dev/null || exit
> done
> After about 500 loops it always dumps core on me. It doesn't dump
> core on the same loop, however.
> I also noticed that the "unique words indexed" count gets incremented from
> time to time. Recompiling with --enable-psortarray didn't help.
> Well, now we have a semi-reproducible bug. I need to finish various
> other things for tomorrow, so further investigation is pending for now.
> -- 
> Dobrica Pavlinusic               2share!2flame  
> Unix addict. Internet consultant.   

Bill Moseley

Received on Wed Apr 27 14:52:42 2005