What's the point in writing Perl to drive a static wget call? Either write bash or use LWP::Parallel::UserAgent.
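For comparison, here is a minimal pure-Perl sketch of what that wget call does. It uses plain LWP::UserAgent rather than LWP::Parallel::UserAgent (the parallel module just adds concurrency on top of the same idea); the host name is a placeholder, and this only fetches a single page rather than mirroring recursively.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $host  = 'localhost';    # placeholder -- point this at your own server
my $count = 0;

my $ua = LWP::UserAgent->new(
    agent   => 'custom user-agent string goes here',
    timeout => 10,
);

# Keep requesting until one fetch succeeds, counting the attempts,
# much like retrying wget in a shell loop.
until ( $ua->get("http://$host/")->is_success ) {
    $count++;
}
print "took $count failed attempts before success\n";
```

Whether this beats a one-line wget invocation is exactly the question under discussion.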
At the risk of spawning a whole new discussion: I don't really see why people are so much against calling system programs from scripts that are intended to be run once or twice, on one machine, as quick hacks.
For instance, I once needed to pull some article numbers out of a database. Instead of pulling down DBI and the appropriate drivers (which can really be a pain) and reviewing how to use DBI, I just opened a pipe to the command-line tool, wrote raw SQL to it, and parsed the raw answer. Done in five minutes.
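A sketch of that quick-hack approach, under some loudly labeled assumptions: the client (`psql`), database name, table, and SQL are all illustrative stand-ins, not the actual ones from the story.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical query -- table and column names are made up for illustration.
my $sql = 'SELECT article_no FROM articles WHERE in_stock > 0;';

# Open a read pipe from the vendor's CLI tool instead of loading DBI.
# psql's -t flag suppresses headers/footers, -c runs a single command.
open my $db, '-|', 'psql', '-t', '-c', $sql, 'shopdb'
    or die "can't run psql: $!";

my @numbers;
while ( my $line = <$db> ) {
    push @numbers, $1 if $line =~ /(\d+)/;    # parse the raw answer
}
close $db;

print "$_\n" for @numbers;
```

The list-form `open` with `-|` avoids shell quoting headaches, which is half the battle when firing raw SQL at a command-line client.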
For a long-term, distributable, or reusable solution, I'd of course have used CPAN, but sometimes there is no point; Perl is designed to interact with the system and to be, among other things, a more powerful 'bash' (or DOS, or what have you). It can be a lot of wasted time to find the appropriate module, install it, and figure out how to use it, when all you need is a quickie.
As for using bash (or whatever) instead, I don't really know the syntax; I would have to look that up first, and if I need something parsed... better to combine strengths, and keep it simple.
Of course, in this case, it does assume that the user has wget... so for cross-platform solutions, you should of course go pure Perl. :)
Better yet, find something that's more worth your while to code.
This I wholeheartedly agree on. :)
You have moved into a dark place.
It is pitch black. You are likely to be eaten by a grue.
I'm not against using Perl to drive external programs. I just brought it up in this case since this is a task that's rather easily done in bash. Not knowing the syntax is an argument, I guess, although apart from a number of quirks regarding interpolation, it's pretty simple. Firing commands at an SQL server through its tool, on the other hand, and especially processing the output, is probably not something one can easily do with the shell. As such, it's really about choosing the right tool for the right job. In your case, Perl was probably it; in this case, bash would be the ticket.
Just to illustrate the point, although I really didn't want to do so with this piece of code...
#!/bin/bash

end_it_all () {
    echo
    echo "did some funny things some $COUNT times... naughty, naughty"
    exit
}

trap end_it_all HUP QUIT INT TERM

COUNT=0
HOST=localhost

while ((1)) ; do
    wget -r -l100 -U "Stop sending me SPAM! This is the only way to make you understand." "http://$HOST" && end_it_all
    ((COUNT++))
done
Makeshifts last the longest.