Re: Multiple commands with one system call
by kennethk (Abbot) on Sep 29, 2011 at 14:34 UTC
I know I could use the exec() call to start the program and then proceed to run the rest of my commands.
No. That's not how exec works - it replaces the running perl process with the specified system command. This means any code after an exec is dead code.
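To illustrate with your vp command (a minimal sketch; the die fires only if exec cannot start the program at all):
exec('vp', '-e', $ENV{VREL}) or die "exec failed: $!"; # the perl process is replaced by vp here
print "this line never runs\n";                        # dead code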
You may get your expected result with something closer to:
system ("vp -e $ENV{VREL} ; vscmd set-measure-mode fast; vscmd pg-address-check off");
Short of that, you'd probably need some sort of daemon, which is likely far more sledge than this hammer needs.
if (system(qw/vp -e/, $ENV{VREL}) != 0) {
    # last command failed
    system(qw/vscmd set-measure-mode fast/);
    system(qw/vscmd pg-address-check off/);
}
I'm not sure which program gets the ctrl-C, but if it's not the Perl script, that should work.
Thanks for the correction regarding exec(). It's always good to learn new things. :-)
The format you suggested doesn't work. It only runs the first command and ignores the other two.
Thanks for the input though!
Are you interrupting execution, perhaps with ^C? If so, yes, you are killing all the jobs simultaneously. You could execute all three by splitting them into multiple system calls:
system ("vp -e $ENV{VREL}");
system ("vscmd set-measure-mode fast; vscmd pg-address-check off");
Re: Multiple commands with one system call
by jethro (Monsignor) on Sep 29, 2011 at 15:09 UTC
One simple way to start more than one program simultaneously (on Linux) is to use the shell to execute them in the background, i.e.
system("program1 &");
system("program2 &");
would run both programs concurrently; the last one wouldn't even need to run in the background. Programs in the background should not do any screen IO because they are detached from the terminal. But you can redirect their output to files or /dev/null if that output is uninteresting.
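For example (a sketch; the program names are placeholders):
system("program1 > prog1.log 2>&1 &"); # background, output captured in a file
system("program2 > /dev/null 2>&1 &"); # background, output discarded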
Another possibility is to use open with a pipe, i.e. open($f,"|-","program1"); which would enable you to send further input to the program's STDIN (with "-|" instead, you would read the program's output).
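A sketch of both pipe directions (program1 is a placeholder):
# send input to the program's STDIN:
open(my $in, '|-', 'program1') or die "can't start program1: $!";
print $in "some input\n";
close $in; # close also waits for program1 to exit

# read the program's output instead:
open(my $out, '-|', 'program1') or die "can't start program1: $!";
while (my $line = <$out>) {
    print "got: $line";
}
close $out;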
A third possibility is to use fork to do the (implicit) forking done by the previous methods yourself. And probably the very best possibility is to let a module like Parallel::ForkManager handle most of that work.
Hey!
I had tried using the system call with the "&" sign, but the problem with that is that I also need to know when the program is closed by the user, so I can run clean-up commands. With a regular system call I can just run my clean-up commands afterwards, because they won't run until the program is closed.
Is there an easy way for me to catch when this program is being closed?
Thanks again for all your help.
It's worth considering whether you need this degree of control, or whether it would be enough to do something simpler...say, run a job once an hour automatically.
If you need this degree of control and interactivity, I would have a wrapper script which launches your program, storing the process id (PID) of the process. Then at regular intervals it can verify the process is still running, using ps. When the process stops, the wrapper can send a message.
A simple way to indicate the status is to create an indicator file and delete it at the end. You can 'touch' the file at intervals to demonstrate the process is running and hasn't crashed. Alternatively, using the open-with-a-pipe strategy, you can send text messages at intervals: 'still running', 'crashed' or 'completed'.
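A rough sketch of such a wrapper (the file path and interval are placeholders, and it polls with waitpid rather than parsing ps output):
use POSIX ':sys_wait_h';

# launch the program (here, the vp command from above)
my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    exec('vp', '-e', $ENV{VREL}) or die "exec failed: $!";
}

my $flag = "/tmp/vp_running.$pid";       # indicator file (made-up path)
until (waitpid($pid, WNOHANG) == $pid) { # waitpid returns 0 while the child runs
    system('touch', $flag);              # prove the wrapper is alive and watching
    sleep 5;
}
unlink $flag;
print "process $pid has finished; clean-up can run now\n";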
As Occam said: Entia non sunt multiplicanda praeter necessitatem.
#!/bin/sh
# wrapper script
program1
touch /tmp/finishedwith$1
You could call this from perl with a random number as parameter (the $1), so that each finish creates a different file. In the perl script you would then monitor for the existence of this file.
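The Perl side could look roughly like this (the wrapper's filename is an assumption):
my $token = int rand 1_000_000;             # random parameter, becomes $1
system("sh wrapper.sh $token &");           # run the wrapper in the background
sleep 1 until -e "/tmp/finishedwith$token"; # poll for the flag file
unlink "/tmp/finishedwith$token";           # tidy up the flag
# ... run the clean-up commands here ...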
This is easy but neither very clean nor safe. Another way is to use fork instead of system, because fork tells you the process id of the newly created child process. And you can wait() for the process to finish (which would block your script) or check whether it is still running (somehow ;-).
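For the "somehow": one common check is kill with signal 0, which tests whether the process exists without actually signalling it:
if (kill 0, $pid) {
    print "$pid is still running\n"; # note: an exited-but-unreaped child still counts
}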
But really, look at Parallel::ForkManager; all the functions you want are already there. With "run_on_finish" you can register a subroutine that is called when the process finishes.
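A sketch of that (the program name is a placeholder):
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(1); # at most one child at a time
$pm->run_on_finish(sub {
    my ($pid, $exit_code) = @_;
    print "program finished (pid $pid, exit code $exit_code)\n";
    # clean-up commands could go here
});

my $pid = $pm->start;
if ($pid == 0) {            # child process
    system('program1');     # placeholder program
    $pm->finish($? >> 8);   # report the program's exit code
}
$pm->wait_all_children;     # parent blocks here, then run_on_finish fires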
Re: Multiple commands with one system call
by hbm (Hermit) on Sep 29, 2011 at 15:05 UTC
How about wrapping those three commands into one shell script/batch file, and then calling that one wrapper?
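Roughly (the wrapper's name is made up; it would hold your three commands, one per line):
# run_all.sh (hypothetical) would contain:
#   vp -e "$VREL"
#   vscmd set-measure-mode fast
#   vscmd pg-address-check off
system('sh', 'run_all.sh') == 0
    or warn 'run_all.sh exited with status ', $? >> 8, "\n";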
Re: Multiple commands with one system call
by graff (Chancellor) on Sep 30, 2011 at 05:57 UTC
I'm not sure if I fully understand your statement of the task, but in case it helps, here's a technique I've used to good effect when I want a perl script to run a series of commands in sequential order, and the script has to wait until all the commands are done before moving on:
#!/usr/bin/perl
use strict;
use warnings;

my @cmd_list = ( 'echo 1', 'echo 2', 'echo 3', 'echo 4' );

$|++; # turn off buffering

my $shpid = open( my $sh, '|-', '/bin/bash' )
    or die "Can't open a shell process: $!\n";

for my $cmd ( @cmd_list ) {
    print $sh "$cmd\n";
}
print $sh "exit\n";

close $sh;
waitpid( $shpid, 0 );
print "Shell's all done. Moving right along...\n";
my $shpid = open( my $sh, '|-', '/bin/bash' )
    or die "Can't open a shell process: $!\n";
It says that the preceding line has errors. I don't know enough about bash to fix it. Any chance you could help me out?
Hey yourself!
Is there something that makes you unable to paste the actual text of the error message into your post? What does "It" refer to? What does "the preceding line" refer to? What exactly did you try, and what exactly was the full error message? Were you using the code as I posted it, or did you try something "based on" my post? What OS are you using?
If you could give more explicit information about the problem(s) you're having, you might get some useful help.