I rather like the spider function, not so much because I want to index a copy
of our site, but because the pages I need to make indices for are more
logically sorted by page functionality than by directory structure. But I
have a few questions.
1) Is it possible not to index certain files when using the spider function?
We want to create an index of all March and April events, which in essence
are all the links on the March and April pages. The only problem is that we
don't want to index the 10 or 12 links in the ubiquitous navigation bar.
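To make the exclusion in question 1 concrete, here is a minimal sketch of the kind of filter we have in mind (the patterns and function names are illustrative only; we don't know what the spider's own configuration syntax would look like): any link whose URL matches an exclude pattern is skipped before indexing.

```python
import re

# Hypothetical exclude patterns for the navigation-bar links; the real
# URLs on our site would differ, these are placeholders.
EXCLUDE_PATTERNS = [re.compile(p) for p in (r"/navbar/", r"nav_")]

def should_index(url):
    """Return True unless the URL matches one of the exclude patterns."""
    return not any(p.search(url) for p in EXCLUDE_PATTERNS)
```

With something like this, the event links on the March and April pages would pass through while the dozen navigation links would be dropped.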
2) Is it possible to vary the depth to which we index the individual pages?
That is, we wish to follow page A one link deep but page B two links deep.
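What question 2 amounts to is a crawl where each starting page carries its own depth limit. A minimal sketch of that idea, assuming a `get_links` function that returns a page's outgoing links (a stand-in, not the spider's actual API):

```python
from collections import deque

def crawl(seeds, get_links):
    """Breadth-first crawl where each seed has its own depth limit.

    seeds: list of (url, max_depth) pairs,
           e.g. [("pageA.html", 1), ("pageB.html", 2)]
    get_links: callable returning a page's outgoing links (assumed here).
    """
    seen = set()
    queue = deque(seeds)
    while queue:
        url, depth = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        if depth > 0:  # only expand pages that still have depth budget
            for link in get_links(url):
                queue.append((link, depth - 1))
    return seen
```

So page A would contribute itself plus its direct links, while page B would be followed two links deep.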
3) Is it possible to index pages on other servers? When I try to index
(depth 2) the page http://www.heat.net/events/index.html, which has a link to
http://www.segasoft.com/heat/20days/index.htm, I get the message that it
cannot index http://www.segasoft.com/heat/20days/index.htm because the server
or method is wrong. If I add the line "Equivalent Server: http://www.heat.net
http://www.segasoft.com", everything works fine. But the two servers are in
absolutely no way equivalent!
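My guess at what the spider is doing in question 3, sketched below purely as an assumption (the `same_server` function and the equivalence set are mine, not the tool's): a link is followed only if its scheme and host match the start page, unless both hosts appear in a declared equivalent-servers list, which would explain both the "server or method is wrong" message and why the Equivalent Server line makes it go away.

```python
from urllib.parse import urlparse

# Hypothetical equivalence list mirroring the "Equivalent Server:" line above.
EQUIVALENT = {"www.heat.net", "www.segasoft.com"}

def same_server(start_url, link_url, equivalents=EQUIVALENT):
    start, link = urlparse(start_url), urlparse(link_url)
    if link.scheme != "http":           # the "method is wrong" case
        return False
    if link.netloc == start.netloc:     # same server: always allowed
        return True
    # Otherwise allow only if both hosts are declared equivalent.
    return start.netloc in equivalents and link.netloc in equivalents
```

If the check really is host-based like this, "equivalent" would just mean "treat as the same site for crawling purposes", not that the content is equivalent.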
4) Does "equivalent server" mean just servers, or pages as well? Is there a
way to indicate that http://www.heat.net/events and
http://www.heat.net/events/event-3-2000.htm are the same page?
Heat.Net, SegaSoft Networks Inc.
Received on Thu Mar 23 20:41:54 2000