in reply to LWP versus Wget

Most of the time when I have to snarf a web page, I have to extract some data from it afterwards. I think that's *way* easier to do with the tools in perl than with a
cat | sed | awk | sort | sed | diff | sed | sed | awk | sed
chain. In perl, I can assign the $response to a variable, walk through it, strip the html, pull out the tabular data, verify it against what I expected, and stuff it into a db - all in one program. AND I can check for errors occurring at any of those steps.
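A minimal sketch of that all-in-one-program idea (the table layout and cell values here are made up for illustration; in real use the HTML would come from LWP rather than a literal):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# In real use the HTML would come from LWP, e.g.:
#   use LWP::Simple;
#   my $html = get($url) or die "couldn't fetch $url\n";
# A literal stands in here so the extraction steps are visible on their own.
my $html = <<'HTML';
<table>
<tr><td>widget</td><td><b>42</b></td></tr>
</table>
HTML

# Walk the response: pull out each <td>...</td> cell of the tabular data
my @cells = $html =~ m{<td[^>]*>(.*?)</td>}gis;

# Strip any markup left inside the cells
s{<[^>]+>}{}g for @cells;

# Verify against what we expected before stuffing anything into a db
die "expected 2 cells, got ", scalar @cells, "\n" unless @cells == 2;

print "@cells\n";    # prints "widget 42"
```

Each step can die with its own message, which is exactly the error checking a sed/awk chain makes hard.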

I've known people who spend all their time in sed/awk and can whip up scripts to do everything there - and I'm sure people can do it in emacs and make and C. I choose perl. Whatever works for you.

Replies are listed 'Best First'.
Re^2: LWP versus Wget
by Anonymous Monk on Mar 04, 2005 at 10:03 UTC
    True, but $response = `wget -O- URL`; is shorter than use LWP::Simple; $response = get "URL";, while still enabling you to use the full power of Perl to parse it.
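    A hedged sketch of that trade-off (the URL is a placeholder): with backticks the exit status in $? is the only error signal, while LWP hands back a response object you can interrogate.

    ```perl
    use strict;
    use warnings;
    use LWP::UserAgent;

    # Backtick version: short, but the only error signal is the exit status in $?
    my $response = `wget -qO- http://example.com/`;   # placeholder URL
    warn "wget failed with exit status ", $? >> 8, "\n" if $?;

    # LWP version: a few lines longer, but you get a full response object
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get('http://example.com/');        # placeholder URL
    die "GET failed: ", $res->status_line, "\n" unless $res->is_success;
    $response = $res->decoded_content;
    ```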