I've recently started dabbling in Perl, having started up a wee hobby project. It involves logging in to an https site and grabbing a page of data, then parsing out a snippet of info relevant to my needs.
This is all working hunky-dory, with only one small caveat: it uses a lot of data volume. I've set it to grab the page every ten seconds, which, given the nature of the data I'm after, is about the minimum useful update rate. Each time it grabs the page it pulls down about 130-150kB of data, which adds up to a significant amount over any extended period, especially given Australia's archaic download-volume limits.
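For context, this is roughly the shape of the script. The URL, form fields, and pattern are placeholders rather than the real site, but the structure is the same:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Placeholder login details -- the real site obviously differs.
    my $mech = WWW::Mechanize->new( autocheck => 1 );
    $mech->get('https://example.com/login');
    $mech->submit_form(
        form_number => 1,
        fields      => { username => 'me', password => 'secret' },
    );

    while (1) {
        # Each get() pulls the full ~130-150kB page just to extract one snippet.
        $mech->get('https://example.com/data');
        if ( $mech->content =~ /some pattern I care about/ ) {
            # ... act on the snippet ...
        }
        sleep 10;
    }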
What I'm seeking is a way of minimising the amount of data that $mech->get() will grab off a site. I want to disregard images, .css, and essentially anything that isn't plaintext data on the page. I've found ways of changing what is provided by $mech->content(), but that only reformats the data after it has already been $mech->get()'d.
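For reference, this is the sort of thing I mean (I believe format => 'text' needs HTML::TreeBuilder installed): it tidies the content nicely, but only after the whole page has already come down the wire.

    # Plain-text rendering of the page, but only AFTER get() has
    # already transferred the whole ~140kB response.
    my $text = $mech->content( format => 'text' );

I've wondered whether asking the server for a compressed response (e.g. setting an Accept-Encoding header via $mech->default_header) or using LWP's max_size() is the right direction, but I'm not sure either actually reduces what the server sends in a useful way, hence this post.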
Your guidance in my time of need would be greatly appreciated.