
dynamically generating @servers for spider.pl

From: Bill Conlon <bill(at)not-real.tothept.com>
Date: Tue Nov 23 2004 - 00:42:50 GMT
The docs nicely explain how to index several sites into a single index:

     my %serverA = (
         base_url    => 'http://swish-e.org/',
         max_depth   => 0,
         email       => 'my@email.address',
     );
     my %serverB = (
         ...
         ...
     );
     @servers = ( \%serverA, \%serverB, );

What I would like to do is generate the hashes on the fly, pulling them 
from a database.  For example, I can use DBI to execute some SQL that 
gives me a resultset where each row contains the desired URI and depth:

my $sth = $dbh->prepare("SELECT uri, depth FROM links WHERE spider='yes'");
$sth->execute();

Would some real perl programmers take pity on me, and give me some 
clues about how to approach this for an arbitrary number of rows?  
Possibly:

while there are rows in the resultset
	generate a hash from the current row
	push the hash onto @servers

I'm trying it along these lines:

while (@ary = $sth->fetchrow_array())
{
    %hash = (
        base_url  => $ary[1],
        max_depth => $ary[2],
    );
    push @servers, %hash;
}

but I get errors like:

    Can't use string ("max_depth") as a HASH ref while "strict refs" in
    use at /usr/local/lib/swish-e/spider.pl line 104.
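
Rereading the docs example above, I notice it pushes references 
(\%serverA, \%serverB) rather than the hashes themselves, so presumably 
my loop should do the same, and fetchrow_array returns columns 
zero-indexed in SELECT order.  A sketch of what I am guessing should 
work (untested; the connection details are placeholders):

    use strict;
    use warnings;
    use DBI;

    # Placeholder DSN -- substitute the real connection details.
    my $dbh = DBI->connect('dbi:mysql:dbname', 'user', 'password',
                           { RaiseError => 1 });

    my $sth = $dbh->prepare("SELECT uri, depth FROM links WHERE spider='yes'");
    $sth->execute();

    my @servers;
    while (my @ary = $sth->fetchrow_array()) {
        # Columns arrive zero-indexed, in SELECT order:
        # $ary[0] is uri, $ary[1] is depth.
        my %hash = (
            base_url  => $ary[0],
            max_depth => $ary[1],
            # email and other keys from the docs example could go here too
        );
        # Push a *reference*; "push @servers, %hash" flattens the hash
        # into plain key/value strings, which is presumably why spider.pl
        # dies trying to use the string "max_depth" as a HASH ref.
        push @servers, \%hash;
    }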