in reply to Automating downloads with Perl
If I understand you correctly, then the answer is a most definite yes! This is typically called screen scraping or spidering. The best tool I've found for the job is WWW::Mechanize. It's pretty self-explanatory.
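To give you a feel for it, here's a minimal sketch of fetching a page and downloading a linked file. The URL and link text are made up; substitute the real ones for your site:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# autocheck => 1 makes every request die on failure,
# so you don't have to check each response by hand.
my $mech = WWW::Mechanize->new( autocheck => 1 );

# Hypothetical starting page -- replace with the real one.
$mech->get('http://example.com/downloads');

# Follow a link by its visible text and save the response to disk.
$mech->follow_link( text_regex => qr/report/i );
$mech->save_content('report.pdf');
```

You can also fill in and submit forms with `$mech->submit_form(...)`, which covers most login pages.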
If you want to create WWW::Mechanize scripts automatically, and you know that the sites you are using are not dependent on JavaScript to function, you can use HTTP::Recorder to create a proxy that will record your actions with a website so that you can repeat them.
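Setting that up looks roughly like this (the output path is just an example). Point your browser at the proxy, click through the site, and the recorded steps are written out as a WWW::Mechanize script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Proxy;
use HTTP::Recorder;

# Start a local recording proxy on the default port.
my $proxy = HTTP::Proxy->new();

# HTTP::Recorder acts as the proxy's agent and logs each
# request as an equivalent WWW::Mechanize call.
my $agent = HTTP::Recorder->new();
$agent->file('/tmp/mech-script.pl');    # example output path

$proxy->agent($agent);
$proxy->start();
```
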
As far as code samples go, the above linked documentation provides lots of examples, but WWW::Mechanize::Examples is another good resource if you need it.
In Section: Seekers of Perl Wisdom