DrWho_100 has asked for the wisdom of the Perl Monks concerning the following question:

With a bit of your sage advice, I have written a simple Perl script that FTPs files from my web server to a remote server. I need to repeat this process every 5 minutes, 24 hours a day. I currently call the script through a server-side include (SHTML). With an automatic page refresh or a JavaScript countdown timer, the process works until a page refresh fails with a timeout error. (No error appears in my error log when this happens.)

Perhaps this isn't the place to ask, but are there alternate ways to run the script that avoid the page-refresh problem? I thought of PHP, but virtual calls in PHP scripts appear to be locked out on my server. I also don't have shell or crontab access on the server.

Replies are listed 'Best First'.
Re: Executing Perl
by kyle (Abbot) on Aug 05, 2008 at 19:07 UTC

    I'd be inclined to write a cron job to trigger the web page that you've set up to do the job you want. On some reliably connected computer, have something that does a 'wget' or 'curl' on the right URL.
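    On the reliably connected machine, a crontab entry along these lines would do it. This is only a sketch: the URL is a placeholder for wherever your SSI page actually lives, and the `-m 60` timeout is an arbitrary choice.

    ```shell
    # m    h dom mon dow  command
    # Every 5 minutes: fetch the page that runs the FTP job.
    # -f  fail (non-zero exit) on HTTP errors, -sS  silent except real errors,
    # -m 60  give up after 60 seconds so a hung request can't pile up.
    */5 * * * *  curl -fsS -m 60 http://example.com/run-ftp-job.shtml >/dev/null 2>&1
    ```

    `wget -q -T 60 -O /dev/null <url>` would serve equally well if `curl` isn't installed.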

    For extra fun, use LWP::UserAgent, retry failures, log problems, and send emails.
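    A minimal sketch of that LWP::UserAgent approach, with retries on failure. The script name, URL, and the retry/delay numbers are all assumptions to adapt; logging and mail are left as comments.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;

    # Retry a code ref up to $tries times, sleeping $delay seconds between
    # attempts. Returns the first defined result, or undef if all attempts fail.
    sub with_retries {
        my ( $tries, $delay, $code ) = @_;
        for my $attempt ( 1 .. $tries ) {
            my $result = $code->();
            return $result if defined $result;
            sleep $delay if $attempt < $tries;
        }
        return undef;
    }

    # Pass the URL of the page that runs your FTP job on the command line,
    # e.g.  perl trigger.pl http://example.com/run-ftp-job.shtml
    if ( my $url = shift @ARGV ) {
        my $ua = LWP::UserAgent->new( timeout => 30 );

        my $response = with_retries( 3, 60, sub {
            my $res = $ua->get($url);
            return $res->is_success ? $res : undef;    # retry on any failure
        } );

        if ($response) {
            print "Triggered OK: ", $response->status_line, "\n";
        }
        else {
            warn "All attempts failed for $url\n";     # log and/or send mail here
        }
    }
    ```

    Run it from cron on the helper machine every 5 minutes and you get the retry and logging behaviour the SSI page can't give you.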

Re: Executing Perl
by pjotrik (Friar) on Aug 05, 2008 at 19:12 UTC
    http://webcron.org (and probably others) provides a service where you set up a cron job on their server that triggers a script you make available through a web server.
      I would advise against using an external service to interact with a production system, since the maintenance, uptime and availability of such a service are beyond your control. I would second the advice given by kyle and set this up to run from a properly supported machine on your own network.

      Cheers

      Martin