perlassassin27 has asked for the wisdom of the Perl Monks concerning the following question:

Hey guys, I need help.

I am running a Perl script on a Unix/Linux dev box. The script needs to send a command, and if the command is taking too long to complete, I want it to report an error and write to the error log.

$pull_cmd = "expect ssh.exp sid\@" . $ipaddress
          . " \"scp root\@comp2:/tmp/10mbfile /tmp/\" perlassasin27";
my $timer = 0;
while ((@returned_lines) = `$pull_cmd`) {
    $timer++;
    print "TIME:$timer\n";    # debug message
    my $size    = $bs * $count;
    my $timeout = ((0.000012968376 * $size) - 1.0862736) * 2;
    if ($timer > $timeout) {
        open(ERROR_LOG, ">ErrorLog-$Options{device}.log");
        print ERROR_LOG "The file is taking longer to transfer than anticipated. Check connection.";
        close(ERROR_LOG);
        print "ERROR: check log for more details.\n";
        exit(1);
    }
}
my $size    = $bs * $count;
my $timeout = ((0.000012968376 * $size) - 1.0862736) * 2;
print "this is file size: $size\nthis is timeout time in sec: $timeout\n";    # debug message
Well, I know I made a mistake in the while statement, as it keeps looping again and again and never exits. But I want it to run the pull command and then exit if it's past the timeout, and I can't think of any other way.

Replies are listed 'Best First'.
Re: killing command if it takes too much time
by Argel (Prior) on Feb 10, 2011 at 00:02 UTC
    If you are okay with killing (or want to kill) the command that's taking too long, then eval+alarm may be what you are looking for. Maybe this code will give you some ideas:
    sub cmd_wrapper {
        my $timeout = shift;
        my $cmd     = shift;
        my( $out, $rc );
        eval {
            local $SIG{ALRM} = sub { die "alarm\n" };
            alarm $timeout;
            $out = (`$cmd`)[0];    # Get first line of output
            $rc  = $?;
            alarm 0;
        };
        if( $@ ) {
            die "Eval failed for $cmd for unknown/unexpected reasons" if $@ ne "alarm\n";
            die "Eval failed for $cmd because alarm timed out";
        }
        die "Return code undefined for $cmd" unless defined $rc;
        return $rc, $out if wantarray;
        return $rc;
    }

    my $rc = cmd_wrapper(300, $pull_cmd);    # Will timeout in 5 minutes (300 seconds)
    # Add your logging here
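    As a rough illustration of wiring a wrapper like the one above into the OP's error-log handling, here is a self-contained sketch. The wrapper body is repeated so the snippet runs on its own, and the 2-second timeout, the `sleep 5; echo done` stand-in command, and the `ErrorLog.log` filename are all placeholders, not from the original post.

```perl
use strict;
use warnings;

# Condensed copy of the eval+alarm wrapper above, so this snippet is
# self-contained. Dies with a recognizable message on timeout.
sub cmd_wrapper {
    my ($timeout, $cmd) = @_;
    my ($out, $rc);
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm $timeout;
        $out = (`$cmd`)[0];    # first line of output
        $rc  = $?;
        alarm 0;
    };
    die "Eval failed for $cmd because alarm timed out" if $@ eq "alarm\n";
    die $@ if $@;
    return wantarray ? ($rc, $out) : $rc;
}

# Glue the timeout into the OP's error-log handling: the command here is a
# stand-in that deliberately outlives the 2-second alarm.
my ($rc, $out) = eval { cmd_wrapper(2, 'sleep 5; echo done') };
if ($@ =~ /alarm timed out/) {
    open my $log, '>', 'ErrorLog.log' or die "open log: $!";
    print $log "The file transfer took longer than anticipated. Check connection.\n";
    close $log;
    print "ERROR: check log for more details.\n";
}
```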

    Elda Taluta; Sarks Sark; Ark Arks

Re: killing command if it takes too much time
by pemungkah (Priest) on Feb 09, 2011 at 23:37 UTC
    As you have it right now, you're stopping execution of your program while the command executes because of the backticks. You want them both to happen at the same time, and to me that suggests either threads or fork; I'd personally use fork, but that's because I know the model better.

    Outlining this vaguely: I'd fork off the command, using a socketpair to communicate back to the main program. The main program could use select to sleep for a while and simultaneously check for any output from the forked command. If it didn't find any after an appropriate delay, it could kill the forked-off command and take whatever recovery actions were necessary. If it was getting output and decided to let the command continue, or the command completed, it would read input from the socket while select said there was input available, either going back to sleep to wait for more input or reading to EOF; then it would close the socket and waitpid on the PID returned by the fork.
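    A minimal sketch of that outline, assuming `/bin/sh` is available to run the command string; the commands and the one-second timeout below are placeholders. This simplified version waits only for the first output to arrive and then slurps to EOF, rather than looping on select as described.

```perl
use strict;
use warnings;
use Socket;
use IO::Select;

# Fork $cmd with its stdout tied to one end of a socketpair; return its
# output, or undef if nothing arrives within $timeout seconds (in which
# case the child is killed, as in the recovery path described above).
sub run_with_timeout {
    my ($cmd, $timeout) = @_;

    socketpair(my $reader, my $writer, AF_UNIX, SOCK_STREAM, PF_UNSPEC)
        or die "socketpair: $!";

    my $pid = fork();
    die "fork: $!" unless defined $pid;

    if ($pid == 0) {                    # child: run the command, stdout -> socket
        close $reader;
        open STDOUT, '>&', $writer or die "redirect: $!";
        exec '/bin/sh', '-c', $cmd;
        die "exec: $!";
    }

    close $writer;                      # parent keeps only the read end
    my $sel = IO::Select->new($reader);

    my $output;
    if ($sel->can_read($timeout)) {     # select-based wait for first output
        local $/;                       # then slurp the rest to EOF
        $output = <$reader>;
    }
    else {
        kill 'TERM', $pid;              # nothing in time: kill the child
    }
    close $reader;
    waitpid $pid, 0;                    # reap the child either way
    return $output;
}

print run_with_timeout('echo transferred', 1) // "timed out\n";
```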

Re: killing command if it takes too much time
by salva (Canon) on Feb 10, 2011 at 00:42 UTC
    Killing ssh client processes on timeouts is usually not a great idea as it will often leave orphan processes running on the remote host.

    Anyway, maybe this will do what you want...

    use Net::OpenSSH '0.51_01';

    my $ssh = Net::OpenSSH->new($ipaddress,
                                user                => 'pason',
                                passwd              => 'perlassasin27',
                                timeout             => $timeout,
                                kill_ssh_on_timeout => 1);

    my @lines = $ssh->capture({stderr_to_stdout => 1},
                              scp => 'root@comp2:/tmp/10mbfile', '/tmp');

    if (my $error = $ssh->error) {
        print $logfh "scp command failed: $error\n";
    }
Re: killing command if it takes too much time
by eyepopslikeamosquito (Archbishop) on Feb 10, 2011 at 04:32 UTC