in reply to Re: Synchronizing Multiple System Processes
in thread Synchronizing Multiple System Processes

Sorry, I don't quite get your statement above.

You mean my case above should not happen? i.e., script2.pl by default should wait until script1.pl finishes?

In particular, it gives this error message when running the main script:
script2.pl : failed to open input file output.txt : No such file or directory
Please advise. What do you think is the cause of the symptom above?
Hope to hear from you again.

---
neversaint and everlastingly indebted.......

Re^3: Synchronizing Multiple System Processes
by ikegami (Patriarch) on Apr 13, 2007 at 05:44 UTC

    by default should wait until script1.pl finish?

    Yup. In fact, the only exception is on Windows when using the special system(1, ...); syntax.

    Of course, if script1.pl launches a child and exits, the main script won't wait for the grandchild to complete. By the way, that means system("command &"); will cause system to return "immediately". (The child, sh, launches command and exits without waiting for command to exit.) Just remove that & if you have it.
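    A minimal sketch of the blocking behaviour described above (a perl one-liner stands in for script1.pl; the filename is illustrative):

    ```perl
    use strict;
    use warnings;

    my $file = 'demo_output.txt';
    unlink $file;

    # system() blocks: the parent waits for the child to exit, so the
    # file is guaranteed to exist by the time system() returns.
    system($^X, '-e',
        'open my $fh, ">", $ARGV[0] or die $!; print $fh "done\n";',
        $file) == 0 or die "child failed: $?";

    open my $fh, '<', $file or die "failed to open input file $file: $!";
    print scalar <$fh>;            # prints "done"
    close $fh;
    unlink $file;
    ```

    With a trailing & in the command string, the intermediate shell would exit immediately and the open above could race the child, which is exactly the symptom in the original post.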

    Then what do you think is the cause of my symptom above?

    Based on the limited information at hand, all I can say with reasonable certainty is that output.txt does not exist.

      Just remove that & if you have it.
      No, I don't have &.
      all I can say with reasonable certainty is that output.txt does not exist.
      Yes, that is true. But it is because script1.pl hasn't finished creating output.txt yet.

      ---
      neversaint and everlastingly indebted.......
        If script1.pl returned without having finished creating output.txt, then it didn't create that file.

        What happens if you explicitly close the filehandle for output.txt in script1.pl?

        close OUT or die "Can't close filehandle OUT properly: $!\n";

        --shmem

Re^3: Synchronizing Multiple System Processes
by TOD (Friar) on Apr 13, 2007 at 05:27 UTC
    Yes, it should not happen. But maybe this stanza from perldoc -f system can shed some light on your problem:
    Beginning with v5.6.0, Perl will attempt to flush all files opened for output before any operation that may do a fork, but this may not be supported on some platforms (see perlport). To be safe, you may need to set $| ($AUTOFLUSH in English) or call the "autoflush()" method of "IO::Handle" on any open handles.
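    For reference, a small sketch of the $| flag and the autoflush() method that stanza mentions (filename is illustrative):

    ```perl
    use strict;
    use warnings;
    use IO::Handle;              # provides the autoflush() method

    STDOUT->autoflush(1);        # equivalent to setting $| = 1 for STDOUT

    open my $out, '>', 'flush_demo.txt' or die $!;
    $out->autoflush(1);          # flush this handle after every print
    print $out "flushed\n";      # data hits the file immediately

    # Another reader can see the data without waiting for close():
    open my $in, '<', 'flush_demo.txt' or die $!;
    print scalar <$in>;          # prints "flushed"
    close $_ for $in, $out;
    unlink 'flush_demo.txt';
    ```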
      $| won't help. script1.pl flushed everything it had to flush on exit. (And if it didn't because of some crash, that data is gone forever. It was in the process, which no longer exists. Waiting for it won't help.)
Re^3: Synchronizing Multiple System Processes
by doob (Pilgrim) on Apr 13, 2007 at 09:06 UTC
    Well, I've had this problem countless times in many languages. The problem is that while a new output file is being written, it may not (and probably does not) exist on the drive yet. What's more interesting is that it can take quite some time for the file to be saved to the drive. Waiting for a set amount of time *may* work, but more often than not I haven't seen it make a difference, no matter how long you wait.

    What you can do is combine the two processes into one. That would rectify the situation, but I have a feeling there's a reason for not having just one process. One solution to your problem is writing the output of process 1 to a file (if you need it) AND/OR simultaneously using Expect.pm to grab runtime data from process 1 and use it.
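    One way to sidestep the race entirely is to read the first process's output through a pipe instead of a temporary file, so the consumer sees each line as it is produced. A minimal sketch (a perl one-liner stands in for script1.pl):

    ```perl
    use strict;
    use warnings;

    # Open a read pipe from the child; in the real setup this would be
    # something like: open my $pipe, '-|', $^X, 'script1.pl'
    open my $pipe, '-|', $^X, '-e', 'print "line $_\n" for 1..3'
        or die "cannot start child: $!";

    while (my $line = <$pipe>) {
        print "got: $line";      # consume output as it is produced
    }

    close $pipe or die "child failed: $?";
    ```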