wanna_code_perl has asked for the wisdom of the Perl Monks concerning the following question:
Hello monks,
I've mostly written a Perl program that runs several external backup and file transfer programs, such as duplicity, scp, rsync, etc. Everything's fine except I'm not sure of the best way to actually execute the commands. I'll show you the relevant subroutine, then I'll get into specifics:
    # Run external @system command, reporting name as $name. If there
    # is an upload limit, we run through trickle(1) to limit bandwidth.
    sub _ext_cmd {
        my ($name, @system) = @_;

        if ($o{general}{upload}) {
            @system = ($o{general}{trickle},
                       $o{general}{trickled} ? () : '-s',
                       -u => $o{general}{upload},
                       '|', @system);
        }

        say "Running name=$name, @system";
        # system { $system[0] } @system[1..$#system];   # Obviously nope.
        say "\$?=$?";
    }
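One way to make that commented-out call work with the literal '|' element is to hand the whole thing to the shell via a piped open and read the pipeline's stdout line by line. A minimal sketch (not the OP's final code; `run_pipeline` is a hypothetical helper, and the naive whitespace join assumes no arguments contain spaces or shell metacharacters):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';

# Sketch only: run @system through the shell so the literal '|' becomes
# a real pipe, streaming the pipeline's stdout as it arrives.
sub run_pipeline {
    my ($name, @system) = @_;
    my $cmd = join ' ', @system;   # naive join; real paths may need quoting
    open my $fh, '-|', $cmd or die "Can't start $name: $!";
    while (my $line = <$fh>) {
        print "[$name] $line";
    }
    close $fh;                     # waits for the child; $? holds its status
    return $? >> 8;
}

my $rc = run_pipeline(demo => qw(echo hello), '|', qw(tr a-z A-Z));
say "exit status: $rc";
```

Because `open` with a single-string command falls back to the shell when it sees metacharacters, the '|' is honored, and `close` blocks until the pipeline exits, so `$?` is meaningful afterwards.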
    # Excerpt of %o options hash:
    %o = (
        general => {
            upload   => '256',          # KBps
            trickle  => '/bin/trickle',
            trickled => undef,          # True if trickled is installed
        },
    );

    # Example usage:
    _ext_cmd(display_name => qw!/bin/duplicity /path/to/src /path/to/dest!);
    # ... but you could probably test it just fine with echo or /bin/cat.
A couple of important points:
Right, that was more than a couple points. I swear I've done this a hundred times before, but for whatever reason my distracted brain doesn't want to put this puzzle together on its own today.
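For capturing a long-running child's stdout and stderr separately without involving the shell, the core module IPC::Open3 is one common option. A sketch under the assumption that a single command (no '|' element) is being run; for genuinely interleaved reading you would add IO::Select, but stdout is simply drained first here for brevity:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';
use IPC::Open3;
use Symbol 'gensym';

# Spawn one command in list form (no shell) and read its two streams.
my $err = gensym;                  # open3 needs a pre-made glob for stderr
my $pid = open3(my $in, my $out, $err, 'echo', 'hello', 'monks');
close $in;                         # nothing to feed the child

while (my $line = <$out>) { print "OUT: $line"; }
while (my $line = <$err>) { print "ERR: $line"; }

waitpid $pid, 0;                   # reap the child; sets $?
my $status = $? >> 8;
say "exit status: $status";
```

List-form invocation avoids shell quoting issues entirely, at the cost of having to build the pipeline yourself (e.g. by chaining open3 calls or using IPC::Run) if bandwidth shaping through trickle is still wanted.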
Re: Capturing output from a long-running pipeline
by talexb (Chancellor) on May 23, 2017 at 17:56 UTC
by wanna_code_perl (Friar) on May 24, 2017 at 03:01 UTC
by talexb (Chancellor) on May 24, 2017 at 03:50 UTC