in reply to Re: exec command taking ages
in thread exec command taking ages

You are correct!

I added a close STDOUT and a close STDERR before the exec command, and now the webpage displays 'Done' immediately.
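
Roughly, the change looked like this (the exec line is the one shown later in this thread):

close STDOUT;   # close the handles connected back to the web server
close STDERR;
exec "nohup ./error_summariser.cgi $identifier > $default_files_dir/$identifier/processing-log.txt &" or die "Can't exec: $!\n";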


Thanks for your quick reply.


Paul McIlfatrick

Re^3: exec command taking ages
by pmcilfatrick (Initiate) on Nov 24, 2010 at 15:38 UTC

    I was too quick with my reply, basing it on how quickly the browser displayed 'Done'. I had not checked the processed files - there had been no processing of the files at all.

    Adding a close STDOUT and a close STDERR before the exec command prevented the exec command from running, so no files were processed.

    Putting a close STDOUT and a close STDERR after the exec command results in the same long delay before the browser displays 'Done' (code placed after a successful exec never runs, since exec replaces the current process).

    Paul McIlfatrick

      If the long-running process needs those handles (as it apparently does), try re-opening them to some file (before running the subprocess).
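
      For example, a minimal sketch (the log file name is just an illustration):

      # re-open the server-facing handles to a file before the exec, so the
      # forked child inherits these instead of the pipes to the web server
      close STDOUT;
      open STDOUT, '>', '/tmp/background-job.log' or die "Can't re-open STDOUT: $!";
      close STDERR;
      open STDERR, '>&', \*STDOUT or die "Can't re-open STDERR: $!";

      (Print your 'Done' response before closing STDOUT, of course.)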

      In more detail, the underlying problem is that the implicitly forked subprocess running under nohup gets duplicates of the file handles of the parent process (your main script) at the time of the fork. All "instances" of those file handles - which are connected via pipes to the web server - must be closed (or re-opened elsewhere) before the web server considers the CGI job complete.
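
      Equivalently, the redirection can be done on the shell side of the exec, so the nohup'd child never holds a dup of the server pipes at all (script and log names illustrative):

      # redirect all three standard streams in the child; taking STDIN from
      # /dev/null as well is the usual full-detach idiom
      exec "nohup ./long_job.pl > /tmp/job.log 2>&1 < /dev/null &"
          or die "Can't exec: $!\n";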

      -The other Anonymous Monk

        Your comments helped resolve the issue!

        All I had to do was to change the exec line from:

        exec "nohup ./error_summariser.cgi $identifier > $default_files_dir/$i +dentifier/processing-log.txt &" or die "Can't exec: $!\n";

        to:

        exec "nohup ./error_summariser.cgi $identifier > $default_files_dir/$i +dentifier/processing-log.txt 2>/dev/null &" or die "Can't exec: $!\n" +;

        Redirecting standard error to the null device as well lets the webpage complete immediately.

        Thanks for your help.

        Paul McIlfatrick