I think LWP is probably better suited to the type of work you're looking for. :) WWW::Mechanize is built on top of LWP, to save you the work of learning how to build WWW::Mechanize out of LWP yourself, because 99 out of 100 newcomers who think "I need to scrape this" don't know anything about HTTP and can't wrap their minds around LWP ("I need to save a request/response object? Whaaat?").
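(For what it's worth, the request/response cycle they trip over is only a few lines of bare LWP. A minimal sketch, with a placeholder URL:)

use strict;
use warnings;
use LWP::UserAgent;

my $ua       = LWP::UserAgent->new;                    # the agent that makes requests
my $response = $ua->get('http://www.example.com/');    # returns an HTTP::Response object

if ( $response->is_success ) {
    print $response->decoded_content;                  # the body, decoded per its headers
}
else {
    die 'GET failed: ' . $response->status_line . "\n";
}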
Greetings.
In my humble defense:
I wrote an entire web page that would issue HEAD, and every other request method available in the HTTP 1.0/1.1 spec, up to and including downloading the entire page. That included sanitizing INPUT, creating the form fields, and adding graphics and CSS. I completed the whole page in under 5 minutes, and I chose LWP, and only LWP. Why? Because, in spite of your assertion, WWW::Mechanize adds complexity and overhead in this scenario. The OP's request is a dead-simple one, exactly the kind of thing LWP was made for.
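For instance, a bare-LWP HEAD request is only a handful of lines. A minimal sketch (the URL is a placeholder):

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new( timeout => 10 );
my $res = $ua->head('http://www.example.com/');    # HEAD: headers only, no body

if ( $res->is_success ) {
    print $res->headers_as_string;
}
else {
    die 'HEAD failed: ' . $res->status_line . "\n";
}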
In fact, completing the OP's request would have required only one additional module, HTML::Restrict (and there are others). The module I listed will STRIP the HTML tags of your choice, leaving the OP with an easily controlled/formatted document to display however the OP wishes.
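Something along these lines would do it. A sketch only, assuming the OP wants every tag stripped; HTML::Restrict's rules option can whitelist chosen tags instead (the URL is a placeholder):

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::Restrict;

my $ua  = LWP::UserAgent->new( timeout => 10 );
my $res = $ua->get('http://www.example.com/');
die 'GET failed: ' . $res->status_line . "\n" unless $res->is_success;

# With no rules, HTML::Restrict strips every tag;
# pass rules => { ... } to keep the tags of your choice.
my $hr = HTML::Restrict->new;
print $hr->process( $res->decoded_content );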
I hope this provides some insight for the OP.
--Chris
#!/usr/bin/perl -Tw
use strict;
use warnings;

my $perl_version = '5.12.5';    # quoted: a bare 5.12.5 is a v-string, not a number
print "$perl_version\n";