Scythe has asked for the wisdom of the Perl Monks concerning the following question:
I've recently started dabbling in Perl, having started up a wee hobby project. It involves logging in to an https site and grabbing a page of data, then parsing out a snippet of info relevant to my needs.
This is all working hunky-dory, with only one small caveat: it uses a lot of data volume. I've set it to grab the page every ten seconds; due to the nature of the data I'm grabbing, this is about the minimum useful update rate. Each time it grabs the page it pulls down about 130-150kB of data, which adds up to significant quantities over any extended period of time, especially given Australia's archaic volume limits.
What I'm seeking is a way of minimising the amount of data that $mech->get() will grab off a site. I want to disregard images, .css, essentially anything that isn't plaintext data on the site. I've found ways of changing what is provided by $mech->content(), but that only reformats the data after it has already been $mech->get()'d.
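For context, the script does roughly the following (a simplified sketch; the URLs, form field names, and the parsing regex are placeholders rather than the real site's details):

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 1 );

# Log in once over https (placeholder URL and form fields).
$mech->get('https://example.com/login');
$mech->submit_form(
    with_fields => { username => 'me', password => 'secret' },
);

while (1) {
    # Each get() pulls down the full ~130-150kB page.
    $mech->get('https://example.com/data');
    my $html = $mech->content();

    # Parse out the snippet of interest (placeholder regex).
    if ( $html =~ m{<span id="status">([^<]+)</span>} ) {
        print "$1\n";
    }

    sleep 10;    # ten-second update rate
}
```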
Your guidance in my time of need would be greatly appreciated.
Replies are listed 'Best First'.

- Re: Conserving bandwidth with WWW::Mechanize's get() by ikegami (Patriarch) on Jun 05, 2008 at 02:31 UTC
- Re: Conserving bandwidth with WWW::Mechanize's get() by pc88mxer (Vicar) on Jun 05, 2008 at 01:26 UTC
  - by Scythe (Initiate) on Jun 05, 2008 at 01:34 UTC
  - by pc88mxer (Vicar) on Jun 05, 2008 at 01:52 UTC
- Re: Conserving bandwidth with WWW::Mechanize's get() by Gangabass (Vicar) on Jun 05, 2008 at 04:02 UTC
- Re: Conserving bandwidth with WWW::Mechanize's get() by Scythe (Initiate) on Jun 05, 2008 at 04:44 UTC
  - by Anonymous Monk on Jun 05, 2008 at 04:54 UTC