I had never used the HTTP feature before, but I finally used it to
check Bryan's problem with swishspider (see previous posts).
I have noticed that this option is slow, and I am wondering why. As
you know, an external Perl program is called to fetch each page from
the server. Obviously, each time swishspider is called, a Perl
interpreter must be loaded into memory, and it also needs to load the
program and the required modules. Installing the required Perl
modules (Digest-MD5, libnet, libwww-perl, HTML-Parser, HTML-Tagset,
MIME-Base64, URI) is also tedious, or perhaps I did not do it the
right way.
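To get a feel for how much of the slowness is just process and
interpreter startup (as opposed to the network fetch itself), here is
a rough sketch that times launching a fresh interpreter once per
"page", the way swish-e launches swishspider once per URL. It uses
Python only because that is what is at hand; swishspider itself is
Perl, and the number of iterations and the interpreter used are
arbitrary choices for illustration:

```python
import subprocess
import sys
import time

# Launch a fresh interpreter process several times, doing no real
# work, to estimate the fixed per-call startup overhead that a
# one-process-per-URL spider pays on every single page.
N = 5
start = time.time()
for _ in range(N):
    subprocess.run([sys.executable, "-c", "pass"], check=True)
per_spawn = (time.time() - start) / N
print(f"interpreter startup per call: {per_spawn:.3f}s")
```

Multiply that per-call figure by the number of pages indexed and it
can easily dominate the run time; loading extra modules (as
swishspider must for LWP and the HTML parsers) only makes each
startup more expensive.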
I am wondering if there is a way to avoid the use of swishspider. I
saw a reference to libwww in the discussion list (from Mark Gaulin). I
do not know if the effort is worth it.
Received on Mon Sep 25 15:22:47 2000