Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi All

I have written a script to send out a mass mailing (to an opt-in list), and I have it throttled to send only a certain number of emails at a time, to stop the server from overloading.

The trouble is, I am executing the script from an SSH window, and it normally takes more than 10-15 minutes to run - by which time my terminal has timed out. Will the Perl script finish running, or will it stop when my session ends?

If there is a better way for me to do this that ensures it runs to completion, please let me know.

Thanks

Replies are listed 'Best First'.
Re: File Running
by Abigail-II (Bishop) on Jan 14, 2003 at 14:59 UTC
    nohup your_command arguments &

    Abigail
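A minimal sketch of Abigail-II's suggestion. The `sh -c 'sleep 1; ...'` command here is only a stand-in for the actual mailer invocation (e.g. `perl yourscript.pl`); swap in the real command and log file name:

```shell
# nohup makes the job ignore the hangup signal sent when the SSH
# session closes, and the redirect captures its output in a log.
nohup sh -c 'sleep 1; echo "mailing finished"' > run.log 2>&1 &
wait $!           # in real use you would simply log out instead
cat run.log       # the output survived in the log file
```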

Re: File Running
by Aristotle (Chancellor) on Jan 14, 2003 at 15:56 UTC
    man screen

    Makeshifts last the longest.

Re: File Running
by vek (Prior) on Jan 14, 2003 at 15:27 UTC
    You could nohup it and place it in the background as Abigail-II suggests, or just run it as a cron job. To ensure that it ran successfully, have the program write to a log when it finishes.
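A hypothetical crontab entry for the cron-job route (edit with `crontab -e`); the script path, interpreter path, and log path below are placeholders:

```shell
# Run the mailer at 3 a.m. daily, appending all output to a log so
# you can confirm afterwards that it ran to completion.
0 3 * * * /usr/bin/perl /home/you/mailer.pl >> /home/you/mailer.log 2>&1
```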

    Update: Thanks for pointing out my mistake dug, not enough caffeine yet :-)

    -- vek --
      Just a bit of clarification. Running the process in the background won't help by itself. The nohup is the meat of Abigail-II's suggestion: it makes the process immune to hangup signals and sends its output to a non-tty.

      -- dug
Re: File Running
by osama (Scribe) on Jan 14, 2003 at 17:31 UTC

    Just a note: some hosts, such as Pair Networks (my first host; I no longer use them), have a "reaper" process that kills any process not belonging to a logged-in user and/or running for a very long time...

    There are so many ready-made mailing list managers; why reinvent the wheel by programming your own and running it by hand over SSH?