If you're looking at writing a spider, the WWW::Robot module deserves some airtime. I used it with great success a while back to slurp the contents of a rather large and complex zoo of static pages into a dynamic engine. It's especially cool in my eyes because it uses HTML::TreeBuilder, which I also happen to like. A rough sketch of how it hangs together is below.
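Something like this is the general shape (an untested sketch: the hook names and callback arguments are from memory, so check the WWW::Robot docs before leaning on them, and the robot name, email address, host, and start URL are just placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Robot;

    # Minimal crawl skeleton. Hook names and callback arguments are from
    # memory -- verify them against the WWW::Robot documentation.
    my $robot = WWW::Robot->new(
        NAME    => 'SiteSlurper',        # illustrative robot name
        VERSION => '0.01',
        EMAIL   => 'you@example.com',    # placeholder contact address
    );

    # Decide which links to follow: stay on the one site being slurped.
    $robot->addHook('follow-url-test', sub {
        my ($robot, $hook, $url) = @_;
        return $url->host eq 'www.example.com';
    });

    # Called with each fetched page; this is where you'd hand the contents
    # off to your own engine. Printing is enough for a demo.
    $robot->addHook('invoke-on-contents', sub {
        my ($robot, $hook, $url, $response) = @_;
        printf "fetched %s (%d bytes)\n", $url, length($response->content);
    });

    $robot->run('http://www.example.com/');

The nice part is that the robot handles the fetch queue, robots.txt, and link extraction for you; your code only sees the hooks.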
perl -pe '"I lo*`+$^X$\"$]!$/"=~m%(.*)%s;$_=$1;y^`+*^e v^#$&V"+@( NO CARRIER'
In reply to Re: Spidering websites by Chmrr
in thread Spidering websites by Whitchman