I have to say I feel Douglas' pain. A good while ago I had a colleague
write a CGI to automatically create a SWISH-E configuration file, then
run SWISH-E to index based on the configuration file. We called it
"AutoSwish", but had to pull it due to security concerns with the CGI.
But I would definitely prefer a more shrink-wrapped,
out-of-the-box setup that makes some beginning assumptions
that would work for most people, but then allows those who want to
tinker to tinker. Having said that, since I'm not the one to write it,
I must promptly close my mouth and end with my sincere appreciation for
all that has been done for swish over the years to make it into what it
is today.
On Tuesday, April 15, 2003, at 03:48 PM, Bill Moseley wrote:
> On Tue, 15 Apr 2003, Douglas Smith wrote:
>> Most people want to have a search index on the web where
>> users can input a search string and get back a formatted
>> list of links and highlighted contents. You have this and
>> it works very well, but the quick start to get this has you
>> getting swish-e working, writing a config for it, getting the
>> spider working, writing a config for that, installing the cgi
>> and getting that working, and then writing a config for that.
>> Which in the end is what I wanted to do anyway. But it was
>> much easier to test Inktomi on the system, and Google, where
>> we could just fill out a few web form entries, and off it
>> goes (although with very little control), and ht-dig was
>> easier to set up (although much more difficult than Google
>> or Inktomi). It would be nice to come up with (or at least
>> try) a simpler install for this use case.
> I agree. One problem is swish-e doesn't have its own web server.
> It would be great if someone with more time could create such a
> setup. I'm actually working on the build system today -- so things are
> getting installed in more sensible places. Still, it's not point and click.
> Inktomi, Google, and HtDig are applications. Swish-e is a tool. ;)
> Really, the only excuse is that nobody has done it yet. I don't need a
> fancy installation program, so I haven't needed to create one.
>> Like a wrapper script which only needs the url, one regex
>> filter to limit contents indexed, depth of spidering and place
>> to install cgi scripts, and perhaps a couple other things
>> but not much. Then let it go, and look at results.
>> I mean there is a real gem here of a program, but it will
>> get skipped over for Google and Inktomi because I can test
>> those by the end of the day, and this one languished for
>> months, getting kicked around by people until it was finally
>> set up correctly.
>> I mean, I like the multiple stages and the flexibility; in the
>> end I will probably use it all. It might be nice to be able
>> to have something working in one day without having to read
>> through all the config options.
> There are two problems. One is making something that will install and run
> on a bunch of platforms. Without a web server built in, that's a problem
> from the start. The second problem is the docs. I posted some steps to
> get a site running and it wasn't that bad. Probably the best steps are
> listed in the swish.cgi docs. Still, it's not an easy installation by any
> means. If it was too easy then too many people would use it and I'd spend
> too much time on this list...
> Bill Moseley firstname.lastname@example.org
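For what it's worth, the one-shot wrapper Douglas describes above could be
sketched roughly as follows. This is only a sketch, not part of swish-e: the
script, the function names, and the exact config directives (IndexDir,
MaxDepth, IndexFile) are illustrative from memory, so check the swish-e
documentation for the directives your version actually supports. The regex
content filter and the CGI install step are left out here for brevity.

```python
"""Hypothetical one-shot swish-e wrapper: take a start URL and a spider
depth, generate a config file, and kick off an index run.  Directive
names are illustrative -- verify them against the swish-e docs."""

import subprocess
from pathlib import Path

CONFIG_TEMPLATE = """\
# Generated config -- edit by hand once the defaults stop being enough.
IndexDir {url}
MaxDepth {depth}
IndexFile {index_file}
"""

def write_config(url, depth, workdir):
    """Render a minimal spidering config and return its path."""
    workdir = Path(workdir)
    workdir.mkdir(parents=True, exist_ok=True)
    cfg = workdir / "swish.conf"
    cfg.write_text(CONFIG_TEMPLATE.format(
        url=url, depth=depth, index_file=workdir / "index.swish-e"))
    return cfg

def run_index(cfg):
    """Spider and index in one shot via swish-e's HTTP access method."""
    subprocess.run(["swish-e", "-c", str(cfg), "-S", "http"], check=True)
```

A front end (web form or command line) would only need to collect the URL
and depth, call write_config(), then run_index(), giving the point-and-click
start while leaving the generated config file there for later tinkering.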
Received on Wed Apr 16 00:51:21 2003