Seeking to avoid reinventing the wheel: can anyone point me toward anything pre-rolled that grabs a web page and returns its size in bytes, both the source alone and the total (e.g. including the referenced images)? If I need to roll my own, I'm thinking of a mix of WWW::Mechanize, HTML::Parser, and Image::Size, but I'd rather use something existing, either in Perl or a Perl wrapper around a Linux command-line app. Thanks for any ideas.
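In case nothing pre-rolled turns up, here's a minimal sketch of the roll-your-own approach. It swaps plain LWP::UserAgent in for WWW::Mechanize (either would do) and uses HTML::Parser plus URI to collect and resolve `<img src>` references; it only counts images, not CSS or scripts, and fetches each distinct image once. Treat it as a starting point, not a finished tool.

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::Parser;
use URI;

# Return (source_bytes, total_bytes) for a URL, where total
# adds the byte size of every distinct <img src=...> on the page.
sub page_sizes {
    my ($url) = @_;
    my $ua   = LWP::UserAgent->new( timeout => 15 );
    my $resp = $ua->get($url);
    die "GET $url failed: ", $resp->status_line unless $resp->is_success;

    my $source_size = length $resp->content;    # raw bytes of the HTML
    my $total_size  = $source_size;

    # Collect src attributes from <img> tags.
    my @img_urls;
    my $p = HTML::Parser->new(
        api_version => 3,
        start_h     => [
            sub {
                my ( $tag, $attr ) = @_;
                push @img_urls, $attr->{src}
                    if $tag eq 'img' && defined $attr->{src};
            },
            'tagname,attr'
        ],
    );
    $p->parse( $resp->decoded_content );
    $p->eof;

    # Resolve relative URLs against the page and fetch each image once.
    my %seen;
    for my $src (@img_urls) {
        my $abs = URI->new_abs( $src, $url )->as_string;
        next if $seen{$abs}++;
        my $img = $ua->get($abs);
        $total_size += length $img->content if $img->is_success;
    }
    return ( $source_size, $total_size );
}

my ( $src, $tot ) = page_sizes( $ARGV[0] // 'http://example.com/' );
print "source: $src bytes, total with images: $tot bytes\n";
```

A HEAD request per image (checking `Content-Length`) would avoid downloading the image bodies, at the cost of trusting the server's header. On the command-line side, `wget --page-requisites URL` followed by `du` on the download directory gets you the "total" number with no Perl at all.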
In reply to determine web page size, w/ and w/o images by water