A few questions:
I will be maintaining an archive spanning 2 to 5 years. Each day will
produce a single HTML file of approximately 25 MB, or about 6.6 GB per
year. Is the swish-e index database regenerated from scratch each time new
HTML files are added to the index? How long does it take to process 6.6 GB
of data? If I have to recreate the index from scratch each time, this will
not be feasible.

Therefore I am also trying to write a perl script that calls the grep
system function, passing it a date range and multiple search criteria; at
this time only 'and' logic will be implemented. I have got it working for
a single search criterion, but when I expand the logic using a foreach
loop it no longer returns any values. I am in the process of just
learning perl and need HELP!
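(For what it's worth, one common way to get 'and' logic out of grep without
any Perl at all is to chain grep -l calls, narrowing the file list on each
pass. The sketch below uses made-up file names and search terms purely for
illustration; adapt the paths and the date-range glob to your archive.)

```shell
# Hypothetical sketch: AND logic across multiple search terms by chaining
# grep -l calls -- each pass keeps only the files that also match the
# next term. Demo files and terms are invented for illustration.
mkdir -p /tmp/archive_demo
printf 'apple banana\n' > /tmp/archive_demo/2000-01-05.html
printf 'apple only\n'   > /tmp/archive_demo/2000-01-06.html

# Start with every file in the date range, then narrow per term.
files=$(ls /tmp/archive_demo/2000-01-0[56].html)
for term in apple banana; do
    files=$(grep -l "$term" $files)
done

# Only files containing ALL terms survive the loop.
echo "$files"
```

The same narrowing idea carries over to a Perl foreach: filter the list of
matching files term by term, rather than testing each term independently.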
Received on Thu Jan 6 15:15:13 2000