Roy Tennant <email@example.com> wrote:
> This means when a search is performed, I must *parse every hit* in order
> to extract the information for display. And this is using Perl, not a
> compiled language. To see the response time for a search that retrieves
The only real problem with Perl not being a compiled language is that
you have to deal with a slowdown at startup; it is fairly efficient
after that. To eliminate that delay, there is mod_perl for Apache and
NSAPI for Netscape servers, which allow you to have a constant Perl
process resident in the server.
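As a hedged illustration of the mod_perl route (directive names are from
mod_perl 1.x with Apache; the paths are hypothetical, adjust for your
setup), existing CGI scripts can be run under the server's persistent
interpreter with Apache::Registry:

```apache
# Hypothetical httpd.conf fragment: scripts under /perl/ are compiled
# once and kept in the server's resident Perl interpreter, so each
# request skips the interpreter-startup cost of plain CGI.
Alias /perl/ /usr/local/apache/perl/
<Location /perl>
    SetHandler  perl-script
    PerlHandler Apache::Registry
    Options     +ExecCGI
</Location>
```

The scripts themselves need little or no change; the per-request win is
simply not forking and initializing a fresh perl each time.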
> So as much as I may work to increase efficiencies, I too often run into
> those who would not do something *at all* because it is too inefficient.
> In my opinion, good enough is often just that -- good enough. And CPU
> cycles don't do you one bit of good until you burn them.
I don't worry about the Perl scripts on my system too much. During peak
usage it is swish itself that limits performance. When I get the money I
intend to quadruple memory (to 256MB) and maybe add a second CPU (I'm
using Linux on Intel hardware). It is not unusual for me to see three
copies of swish running at once, one of them holding 50% of memory,
while Apache has reached the maximum number of children I set for it,
150 processes. The whole machine is pretty slow at times like that.
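For reference, that child-process ceiling lives in httpd.conf; a minimal
sketch, assuming the pre-forking Apache 1.3 server described above (the
second value is illustrative, not from the original post):

```apache
# Hypothetical httpd.conf fragment: MaxClients caps simultaneous child
# processes (the 150-process ceiling mentioned above). Lowering it trades
# queued requests for less memory pressure when swish is also running.
MaxClients          150
# Recycling children periodically can also bound per-process memory growth.
MaxRequestsPerChild 1000
```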
I have found through some experimentation, though, that it is a real
slowdown for Perl to take the results from swish and then go open other
files to add things to the output. When I coerce all of the data into
the swish database so it comes out with the matches, things get a lot
faster. For me this was document titles for news-spool-style documents:
I use the Subject: header contents as the title, whereas the standard
swish build would just use the filename.
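The win above comes from parsing everything out of the result line
itself instead of reopening each hit file. A minimal sketch, assuming
the classic swish result-line format of rank, path, quoted title, and
size (check your build's actual output; the sample line here is made
up):

```perl
#!/usr/bin/perl -w
# Pull rank, path, title, and size straight from a swish result line,
# so no per-hit file needs to be opened just to display a title.
use strict;

sub parse_swish_line {
    my ($line) = @_;
    # Assumed format: rank path "title" size
    if ($line =~ /^(\d+)\s+(\S+)\s+"(.*)"\s+(\d+)$/) {
        return { rank => $1, path => $2, title => $3, size => $4 };
    }
    return undef;    # comment lines, errors, etc.
}

# Hypothetical result line, as swish might print it:
my $hit = parse_swish_line('1000 /news/article.42 "Re: indexing speed" 2543');
print "$hit->{title}\n" if $hit;    # prints: Re: indexing speed
```

With titles (or any other display field) stored in the index, the
per-hit work is one regex match rather than an open, read, and parse of
the original file.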
an individual running a popular personal webserver
Received on Fri Jun 5 07:47:03 1998