Re: Synchronizing Multiple System Processes
by ikegami (Patriarch) on Apr 13, 2007 at 04:59 UTC
Sorry, I don't quite get your statement above.
You mean my case above should not happen? i.e. script2.pl by default should wait until script1.pl finishes?
In particular, it gives this error message when running main:
script2.pl : failed to open input file output.txt : No such file or directory
Please advise. What do you think is the cause of my symptom above?
Hope to hear from you again.
---
neversaint and everlastingly indebted.......
by default should wait until script1.pl finishes?
Yup. In fact, the only exception is on Windows when using the special system(1, ...); syntax.
Of course, if script1.pl launches a child and exits, the main script won't wait for the grandchild to complete. By the way, that means system("command &"); will cause system to return "immediately". (The child, sh, launches command and exits without waiting for command to exit.) Just remove that & if you have it.
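A minimal sketch of that difference (my own illustration, not from the thread; assumes a Unix-like shell where sleep is available):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# system() normally waits for the child to finish...
system("sleep 2");      # blocks for ~2 seconds

# ...but with a trailing &, the shell backgrounds the command:
# sh launches sleep and exits at once, so system() returns immediately.
system("sleep 2 &");    # returns almost instantly; sleep still running
```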
Then what do you think is the cause of my symptom above?
Based on the limited information at hand, all I can say with reasonable certainty is that output.txt does not exist.
Yes, it should not happen. But maybe this stanza from perldoc -f system can shed some light on your problem:
Beginning with v5.6.0, Perl will attempt to flush all files opened for output before any operation that may do a fork, but this may not be supported on some platforms (see perlport). To be safe, you may need to set $| ($AUTOFLUSH in English) or call the "autoflush()" method of "IO::Handle" on any open handles.
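As a sketch of what that advice looks like in practice (the script names are the ones from this thread; the rest is my own illustration):

```perl
use strict;
use warnings;
use IO::Handle;

# Autoflush a specific output handle so data hits the file
# before any fork/system call:
open my $out, '>', 'output.txt' or die "can't write output.txt: $!";
$out->autoflush(1);           # flush after every print on this handle
print $out "some data\n";

$| = 1;                       # autoflush the currently selected handle (STDOUT)

system('perl script2.pl output.txt finalout.txt');
```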
Well, I've had this problem countless times in many languages. The problem is that while a new output file is being written, it may not (and probably does not) exist on the hard drive yet. What's more interesting is that it can take quite some time for the file to be written out to the drive. Waiting a set amount of time *may* work, but more often than not I haven't seen it make a difference, no matter how long you wait. What you can do is combine the two processes into one. That would rectify the situation, but I have a feeling there's a reason for not having just one process. One solution to your problem is writing the output of process 1 to a file (if you need it) and/or simultaneously using Expect.pm to grab runtime data from process 1 and use it.
Re: Synchronizing Multiple System Processes
by c4onastick (Friar) on Apr 13, 2007 at 04:59 UTC
You could try combining the two (assuming you're on a *nix system) into something like this:
#!/usr/bin/perl
system("perl script1.pl input.txt output.txt && perl script2.pl output.txt finalout.txt");
That way you let the shell know that there are two scripts coming, so you don't have to worry about it.
Re: Synchronizing Multiple System Processes
by BrowserUk (Patriarch) on Apr 13, 2007 at 11:26 UTC
There is an awful lot of ... misunderstanding going on in this thread, and it is mostly caused by you wrongly describing the problem. Until you post the scripts that I can run that demonstrate the problem you are describing, I do not believe what you are saying is happening is possible.
For example, save the following three scripts and run them, and barring some external influence, like disk full, disk failure, or a third party piece of software taking exclusive access to the interchange file, it will run forever without errors.
## 609823.pl
#! perl -slw
use strict;
my $count;
while( 1 ) {
    print "Iteration: ", ++$count;
    system 'perl 609823-1.pl';
    system 'perl 609823-2.pl';
}
## 609823-1.pl
#! perl -slw
use strict;
my $count = int rand 100;
open OUT, '>', 'output.txt' or die $!;
print OUT 'A line of junk' for 1 .. $count;
close OUT;
printf "Wrote $count lines \t";
## 609823-2.pl
#! perl -slw
use strict;
open IN, '<', 'output.txt' or die $!;
my @lines = <IN>;
printf "Read %d lines \t", scalar @lines;
close IN;
unlink 'output.txt';
## Produces
c:\test>609823
Iteration: 1
Wrote 72 lines Read 72 lines Iteration: 2
Wrote 19 lines Read 19 lines Iteration: 3
Wrote 97 lines Read 97 lines Iteration: 4
Wrote 58 lines Read 58 lines Iteration: 5
Wrote 94 lines Read 94 lines Iteration: 6
...
All of which is just a long-winded way of saying: there is something going on here that you are not telling us, and the quickest way of clarifying what that might be is for you to show us the contents of the scripts you are using, rather than hypothetical equivalents which obviously aren't.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
Simplified is good, but only so long as the simplification continues to exhibit the problem. I agree that posting too many scripts would deter people from trying to help, but there must be a point where one script (or a script it calls) writes the file, and the next script fails to access it. Try isolating the writer and the reader and examining what happens at that point.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
Re: Synchronizing Multiple System Processes
by Joost (Canon) on Apr 13, 2007 at 11:27 UTC
If script1.pl does not run in the background (i.e. does not return before it's actually done), and closes its filehandles when it's done*, you should never get this problem.
What operating system do you use?
* update: file-handles are always flushed and closed when your program exit()s, but they might remain open or unflushed if your program terminates with certain exceptions, or when you call POSIX::_exit() or exec().
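A minimal sketch of the well-behaved case (my own illustration of the point above, not code from the thread):

```perl
use strict;
use warnings;

# Explicitly closing the handle guarantees the buffered data reaches
# the OS before this process exits and the next script starts.
open my $fh, '>', 'output.txt' or die "can't write output.txt: $!";
print $fh "results\n";
close $fh or die "close failed: $!";

exit 0;   # a normal exit also flushes and closes any remaining handles
```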
Re: Synchronizing Multiple System Processes
by klekker (Pilgrim) on Apr 13, 2007 at 09:17 UTC
It's not a direct answer to your question, but since script1.pl and script2.pl are obviously Perl scripts (and I assume you can edit them), I would 'wrap' their code in modules and start with something like:
# untested
# ...
use lib1 qw(sub_for_perl1);
use lib2 qw(sub_for_perl2);
# ...
sub_for_perl1($file_in, $file_out);
if ( -e $file_out ) {
    sub_for_perl2($file_out, $final_out);
}
or similar. That way you benefit from better error handling, etc. (It could be that your result file isn't there because script1.pl fails already.)
I use this approach often since I try to avoid calling scripts from other scripts, if possible.
k
Re: Synchronizing Multiple System Processes
by cdarke (Prior) on Apr 13, 2007 at 11:23 UTC
Another alternative might be to use a pipe:
system("perl script1.pl input.txt - | perl script2.pl - finalout.txt");
This assumes that the scripts use an ordinary open on the output.txt file, which I assume is temporary and not really required.
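For the pipe to work, each script would also need to treat '-' as meaning STDIN or STDOUT. A hypothetical sketch of that convention inside script1.pl (the '-' handling is my own assumption, not shown in the thread):

```perl
use strict;
use warnings;

my ($in_name, $out_name) = @ARGV;

# Treat '-' as "write to standard output" so the shell pipe carries
# the data straight into script2.pl; otherwise open a real file.
my $out;
if ($out_name eq '-') {
    $out = \*STDOUT;
}
else {
    open $out, '>', $out_name or die "can't write $out_name: $!";
}

print {$out} "processed line\n";
```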