in reply to cron/perl interaction gotchas i am missing?

We would need to see the script. Many people use cron and Perl without a problem; I use wget from cron myself.

BTW, I've never seen the username put before the perl script.
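For context, the system-wide /etc/crontab has an extra user field that per-user crontabs (edited with crontab -e) do not; a hypothetical pair of entries (paths invented):

```shell
# /etc/crontab (system-wide): minute hour day-of-month month day-of-week USER command
# 0 2 * * * root /usr/bin/perl /usr/local/bin/fetch.pl

# per-user crontab (crontab -e): same fields, but NO user column
# 0 2 * * * /usr/bin/perl /usr/local/bin/fetch.pl
```

Mixing the two formats up is a classic gotcha: a user-field entry pasted into a per-user crontab makes cron try to run a command literally named after the username.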


Play that funky music white boy..

Re: Re: cron/perl interaction gotchas i am missing?
by schweini (Friar) on Jan 30, 2004 at 13:17 UTC
    i can't post the whole script (passwords, IPs and overall shame), but here're the basics:
    @locations = qw/1 8 9/;
    foreach $loc (@locations) {
        $dir = getDir($loc);
        $url = getUrl($loc);
        chdir($dir);
        Log("starting retrieval of '$url' to file '$dir/$fn'");
        system("wget --timeout=180 $url");
    }

    ...and as i said, it does finish the first wget perfectly...
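One cron-specific gotcha worth ruling out: system() can fail without any visible error, e.g. when wget isn't on cron's much shorter PATH. A minimal sketch of checking system()'s return value - run_checked is a hypothetical helper, and a harmless perl one-liner stands in for the real wget command so the sketch runs anywhere:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run a command and report how it ended. In the real script @cmd would
# be ('/usr/bin/wget', '--timeout=180', $url); here a perl one-liner
# stands in so the sketch is runnable without network access.
sub run_checked {
    my @cmd = @_;
    my $rc  = system(@cmd);
    if ($rc == -1) {
        warn "could not start $cmd[0]: $!\n";    # e.g. not on cron's PATH
        return 0;
    }
    if ($rc != 0) {
        warn "$cmd[0] exited with status ", ($rc >> 8), "\n";
        return 0;
    }
    return 1;
}

print run_checked($^X, '-e', 'exit 0') ? "ok\n" : "failed\n";
```

Under cron those warn lines end up in the mail cron sends you, which is exactly where a silent failure would otherwise vanish.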

      How are you sure that wget has finished and the wget process has terminated? I recommend using wget's logging options, plus adding some logging/syslog output to your Perl script as well - print statements should be enough, since cron mails those to you.
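To make that concrete, a minimal sketch of a timestamped Log() helper like the one the script calls - the log path and line format are assumptions, not the original's:

```perl
use strict;
use warnings;
use POSIX qw(strftime);

# Hypothetical Log() helper: append a timestamped line to a file so
# every cron run leaves a trail. The path /tmp/fetch-demo.log is
# invented for illustration.
sub Log {
    my ($msg) = @_;
    open my $fh, '>>', '/tmp/fetch-demo.log'
        or do { warn "cannot open log: $!\n"; return };
    print {$fh} strftime('%Y-%m-%d %H:%M:%S', localtime), " $msg\n";
    close $fh;
}

Log("starting retrieval");
```

Pairing this with wget's own log (-a appends to a file) shows exactly how far each run got before it stopped.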

        How are you sure that wget has finished and the wget process has terminated?
        because the first file (from store 1) is exactly where it's supposed to be.
        i do use basic logging in my script (see above), but after the first wget, i don't get anything....
        (and running the script manually works splendidly, so i think i didn't mess up URLs and such)
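One way to chase a works-in-a-terminal-but-not-under-cron difference is to dump the environment from both contexts and diff the two outputs; cron's environment is far sparser (a minimal PATH, no shell customizations). A tiny sketch - run it once from a terminal and once from cron, each redirected to its own file:

```perl
use strict;
use warnings;

# Print every environment variable, one KEY=value per line, sorted so
# the two dumps diff cleanly.
for my $k (sort keys %ENV) {
    printf "%s=%s\n", $k, $ENV{$k} // '';
}
```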
      I'm guessing it's your second url that's the problem. Have you tried simply using the first url more than once to see if it's not a problem between the cron'd machine and the target?

        ...but some logging i try to do after the first wget, and before the next iteration, never gets done, and in a terminal it all runs fine, and the urls are completely environment-independent....
      just wondering...why could this one have gotten a -- ?
      XP isn't (that) important to me, but i find this curious...