JGmonk has asked for the wisdom of the Perl Monks concerning the following question:

Hi All,
I am connecting to a remote host and executing commands on the remote shell. First it gives me back a list of data, which I split up into arrays. Now I want to send a new command for every item in the array and store the result.

I am not sure what to use for more than 1000 commands. Below is the code as I have prepared it so far. I tried the loop with 5 items and it works, but I am worried about running it with 1000 items and how the remote server will react.

#!/usr/bin/perl -w
use Net::OpenSSH;
use strict;
use warnings;

# connection:
my $ssh2convey99 = Net::OpenSSH->new('...');
$ssh2convey99->error and
    die "Couldn't establish SSH connection: " . $ssh2convey99->error;

# get some data from remote host and fill array
my @ls = $ssh2convey99->capture("disptest | grep -E ' 900 ' | grep -v 1900 | grep -E -v '0 RE' | grep -E -v '0 WE' | grep -E -v '0 WAV' | grep -E -v '0 WA' | awk '{print \$3\";\", \$6\";\" }'");
$ssh2convey99->error and
    die "remote ls command failed: " . $ssh2convey99->error;

# include variables and arrays for data preparation
my @split_array;
my @matrix_array;
my $mitzl = 0;

# prepare the data and store in the matrix
foreach my $el (@ls) {
    $mitzl++;
    @split_array = split(/;/, $el);
    $matrix_array[$mitzl] = [@split_array];
}

# including variables and arrays for getting the needed data for all list elements
my @bt;
my @bt_matrix_array;
my $length_matrix_array = @matrix_array;

# running thru the list and sending a command to the remote host for every element.
# store all output in new matrix
for (my $i = 2; $i <= $length_matrix_array; $i++) {
    @bt = $ssh2convey99->capture("bt $matrix_array[$i][0]");
    $bt_matrix_array[$i-1] = [@bt];
}

This is the output for one element of the loop, and the loop will send more than 1000 commands:

Welcome to lemans1 (LINUX)

Order: 967065/1  Box 282477

Zeit           | Stat  | Info           | Soll | Ist
---------------+-------+----------------+------+-----
20140328090153 | AV2   | TLJ3,gerade    |      |
20140328090155 | ER01A | TS01,gerade    |      |
20140328090204 | LJ7   | TS01,gerade    |      |
20140328090213 | S01   | TK1,gerade     |      |
20140328090305 | K1    | TS03,gerade    |      |
20140328090353 | S03   | TS04,gerade    |      |
20140328090415 | S04   | TS06,gerade    | 0    | 230
20140328090438 | S06   | TS08,gerade    |      |
20140328090459 | S08   | TS11,gerade    |      |
20140328090522 | S11   | TS12,gerade    | 0    | 220
20140328090624 | S12   | TS14,gerade    |      |
20140328090645 | S14   | TS16,gerade    |      |
20140328090707 | S16   | TS19,gerade    |      |
20140328090730 | S19   | TS19,gerade    |      |
20140328090911 | S19   | TK3,gerade     | 284  | 310
20140328090954 | K3    | TK5,gerade     |      |
20140328091001 | K5    | TK1,gerade     |      |
20140328091023 | K1    | TK2,gerade     |      |
20140328091033 | K2    | TS28,gerade    |      |
20140328091129 | S28   | TS28,gerade    | 0    | 320
20140328091514 | S28   | TS30,gerade    | 414  | 410
20140328091537 | S30   | TS32,gerade    |      |
20140328091557 | S32   | TS33,gerade    |      |
20140328093457 | S32   | TS35,gerade    |      |
20140328093520 | S35   | TER05,weight   | 618  | 550
20140328095704 | S35   | TS20,gerade    | 0    | 580
20140328095806 | S20   | TS22,gerade    |      |
20140328095830 | S22   | TS24,gerade    |      |
20140328095849 | S24   | TS27,gerade    |      |
20140328095912 | S27   | TS27,gerade    |      |
20140328100053 | S27   | TK3,gerade     | 703  | 680
20140328100100 | K3    | TWG07,gerade   |      |
20140328100201 | WG07  | TSK1,gerade    | 0    | 680
20140328100208 | SK1   | TSK1,gerade    |      |
20140328105059 | ESM1  | TERR,lost      |      |
20140328110542 | ESM1  | TERR,gerade    |      |

The next step will be to write the data to a file while it loops.

Thanks for your advice on how to do this correctly.

Replies are listed 'Best First'.
Re: sending hundreds of commandos on SSH connection
by zentara (Cardinal) on Mar 28, 2014 at 15:57 UTC
    I am not sure what to use for more than 1000 commands

    If I were you, I would run all 1000 commands as background jobs, just to avoid any network issues. Put the commands you want to run into a shell script, upload the shell script via ssh, then execute the 1000-command script as a nohup background job. Then come back at a later time and collect the results.

    With Net::SSH2 it is necessary to use the following shell syntax to execute background commands. Notice the closing of the standard filehandles, and the redirection of output to log files.

    If you did it correctly, it should avoid any hassle with the ssh connection.

    my $chan = $ssh2->channel();
    $chan->blocking(1);
    $chan->exec("nohup /home/user/myscript > myscript.out 2> myscript.err < /dev/null &");
    $chan->send_eof;
    exit;
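    The same generate-upload-detach workflow can be sketched with Net::OpenSSH, which the OP is already using. Everything here — host, item list, remote paths — is a placeholder, not taken from the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;

# Hypothetical host; substitute your own connection string.
my $ssh = Net::OpenSSH->new('user@remotehost');
$ssh->error and die "connect failed: " . $ssh->error;

my @items = qw(282477 282478 282479);   # placeholder box ids

# Build one shell script that runs every command, appending results to one file.
my $script = join "\n", '#!/bin/sh',
    map { "bt $_ >> /tmp/bt_results.txt 2>&1" } @items;
open my $fh, '>', 'runall.sh' or die $!;
print $fh "$script\n";
close $fh;

# Upload it, then start it detached so the ssh session can close.
$ssh->scp_put('runall.sh', '/tmp/runall.sh')
    or die "scp failed: " . $ssh->error;
$ssh->system('nohup sh /tmp/runall.sh > /dev/null 2>&1 &');
# Come back later and fetch /tmp/bt_results.txt.
```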

    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh
Re: sending hundreds of commandos on SSH connection
by frozenwithjoy (Priest) on Mar 28, 2014 at 14:49 UTC
    This sort of cyber warfare makes me nervous about connecting my computer to the internet! :P
Re: sending hundreds of commandos on SSH connection
by kennethk (Abbot) on Mar 28, 2014 at 15:54 UTC

    This is largely hardware dependent, but on standard hardware spawning 1000 processes will kill the performance of each one due to swapping. You are generally far better off spawning a number just below the count of available processors/cores, and issuing more commands as the old ones complete.
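    A minimal sketch of that pattern, assuming Parallel::ForkManager is available; Net::OpenSSH multiplexes remote commands over one master connection, so forked children can reuse the same object. The host, ids, and worker count are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
use Parallel::ForkManager;   # assumption: module is installed

my $ssh = Net::OpenSSH->new('user@remotehost');   # placeholder host
$ssh->error and die "connect failed: " . $ssh->error;

my @ids = qw(282477 282478 282479);          # placeholder ids
my $pm  = Parallel::ForkManager->new(4);     # ~ number of cores

for my $id (@ids) {
    $pm->start and next;                     # parent: spawn next child
    # Child: run one remote command over the shared master connection
    # and write its output to a per-id file.
    my @out = $ssh->capture("bt $id");
    open my $fh, '>', "bt_$id.out" or die $!;
    print $fh @out;
    close $fh;
    $pm->finish;
}
$pm->wait_all_children;
```

    With the limit set to 4, at most four remote `bt` commands are in flight at once; a new one starts only as an old one completes, which is the scheduling kennethk describes.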


    #11929 First ask yourself `How would I do this without a computer?' Then have the computer do it the same way.

Re: sending hundreds of commandos on SSH connection
by moritz (Cardinal) on Mar 28, 2014 at 15:16 UTC
    But I am worried about running it with 1000 items and how the remote server reacts.

    Probably the same way as your local computer would react if you ran those 1000 commands locally. Unless of course the machines are very different, in which case you should consult the admin of the remote machine, and ask him.

Re: sending hundreds of commandos on SSH connection
by hazylife (Monk) on Mar 28, 2014 at 16:30 UTC
    grep -E ' 900 ' | grep -v 1900 | grep -E -v '0 RE' | grep -E -v '0 WE' | grep -E -v '0 WAV' | grep -E -v '0 WA' | awk '{print \$3\";\", \$6\";\" }'");

    That's an awful lot of greps, and they're all of the same "fixed-string"/-F variety. You could probably reduce the above to something like:

    my @ls = $ssh2convey99->capture(q/disptest | fgrep ' 900 ' | fgrep -v 1900 | grep -vE '0 ([RW]E|WA)' | awk '{print $3";", $6";" }'/);

    ...or replace the whole grep-awk chain with a Perl script!

    perl -nle 'next if !/ 900 /||/1900/||/0 ([RW]E|WA)/; my @f = split; print "$f[2]; $f[5];"'

    update: "0 WA" already takes care of "0 WAV"
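    The one-liner's logic can equally live inside the OP's script itself, filtering the raw disptest output locally instead of in a remote grep/awk pipeline — a sketch with made-up sample lines, assuming the same field positions as the OP's awk '{print $3";", $6";"}':

```perl
use strict;
use warnings;

# Placeholder sample lines standing in for @raw = $ssh->capture("disptest");
my @raw = (
    "x y 282477 a b TYPE1  900 ok",
    "x y 282478 a b TYPE2 1900 ok",
);

my @ls;
for my $line (@raw) {
    next unless $line =~ / 900 /;                       # keep ' 900 ' lines
    next if $line =~ /1900/ or $line =~ /0 ([RW]E|WA)/; # drop unwanted ones
    my @f = split ' ', $line;
    push @ls, "$f[2]; $f[5];";                          # fields 3 and 6, 1-based
}
```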

Re: sending hundreds of commandos on SSH connection
by JGmonk (Initiate) on Mar 28, 2014 at 16:58 UTC

    Guys, you are really quick. I changed the grep statement as written by "hazylife" and added saving to a file. Regarding the answers from "kennethk" and "zentara", I will need to expand my horizon over the next days. For testing it now with looping up to 20 times, I just added a sleep() (temporarily). The script will run at night, after the server's normal working hours, so it doesn't need to be as quick as possible.

    #!/usr/bin/perl -w
    use Net::OpenSSH;
    use File::Copy;
    use Cwd;
    use strict;
    use warnings;

    # connection:
    my $ssh2convey99 = Net::OpenSSH->new('...');
    $ssh2convey99->error and
        die "Couldn't establish SSH connection: " . $ssh2convey99->error;

    # get some data from remote host and fill array
    my @ls = $ssh2convey99->capture(q/disptest | fgrep ' 900 ' | fgrep -v +1900 | grep -vE '0 ([RW]E|WAV?)' | awk '{print $3";", $6";" }'/);
    $ssh2convey99->error and
        die "remote ls command failed: " . $ssh2convey99->error;

    # include variables and arrays for data preparation
    my @split_array;
    my @matrix_array;
    my $mitzl = 0;

    # prepare the data and store in the matrix
    foreach my $el (@ls) {
        $mitzl++;
        @split_array = split(/;/, $el);
        $matrix_array[$mitzl] = [@split_array];
    }

    # including variables, arrays, date and file for getting the needed data for all list elements
    my @bt_matrix_array;
    my $length_matrix_array = @matrix_array;
    my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time);
    my $datum = sprintf "%04d%02d%02d", $year += 1900, $mon += 1, $mday;
    my $file = "$datum.csv";

    # running thru the list and sending a command to the remote host for every element.
    # capture every command's return in a new array
    for (my $i = 2; $i <= 10; $i++) {
        my @bt = $ssh2convey99->capture("bt $matrix_array[$i][0]");
        # kick out the first two lines because I don't want them
        splice(@bt, 0, 2);
        # write the actual array to a file. Create the file if it does not exist;
        # if it already exists, append the new data below the old. Add some meta info.
        open (DATEI, ">>$file") or die $!;
        print DATEI "TE: $matrix_array[$i][0]    Boxtype: $matrix_array[$i][1] \n @bt \n\n";
        # make a break
        sleep(1);
        close (DATEI);
    }
    undef $ssh2convey99;
    my $dir = cwd;
    my $newdir = '//mnt/SharedFolder/';
    move("$dir/$file", "$newdir/$file") or die "could not open directory: $!\n";
      fgrep -v +1900
      That should probably be fgrep -v 1900

      Appendix: I get this error on every executed statement from the remote host
      "tput: No value for $TERM and no -T specified"

      Does somebody know what to add to the ssh connection to get rid of it?

Re: sending hundreds of commandos on SSH connection
by JGmonk (Initiate) on Mar 28, 2014 at 15:38 UTC

    The script will be used as a daily backup procedure to collect a selected type of data for long-term storage. It runs within the same internal network.

    What I meant to ask is: is it the right way to use the capture() function again and again? Will this function always open a new instance of the connection? Thanks, JGMonk