baxy77bax has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

Is there a way to jump through Ganglia and plant jobs? The situation is this: I have a cluster which consists of 5 nodes (Ganglia), with the Rocks OS installed on it. What I would like to do is execute a job on each and every node of the cluster, and I would like to make the whole process totally automatic, so that I do not have to ssh into the nodes and start the jobs manually on every one. My idea was to retrieve the names of all the nodes and then, through the system function, jump to each node and start the job on it. But this is not working:

    chomp(my $host = qx(hostname -s));
    foreach (@$ganglia_name) {
        system("ssh $_") unless ($_ eq $host);   # run the job
    }
The result I get is logging into the node, and then the program exits:
    Last login: Sat Jul 25 15:59:41 2009 from fish.local
    Rocks Compute Node
    Rocks 5.1 (V.I)
    Profile built 13:38 22-Jul-2009
    Kickstarted 15:54 22-Jul-2009
    [baxy@compute-0-3 ~]$
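The banner above is what ssh prints when it is given no remote command: it opens an interactive login shell instead of running anything. A minimal sketch of the likely fix, where /path/to/job is a hypothetical placeholder for the actual job command on the nodes:

    chomp(my $host = qx(hostname -s));
    foreach (@$ganglia_name) {
        # passing the command as an argument makes ssh run it remotely
        # and return, instead of dropping into an interactive shell
        system("ssh $_ '/path/to/job'") unless ($_ eq $host);
    }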
Does anyone have any suggestions on how to do this?

thank you

Replies are listed 'Best First'.
Re: switch through ganglia with perl
by Corion (Patriarch) on Jul 25, 2009 at 17:00 UTC

    Maybe you want to issue the jobs in parallel, using, for example, Parallel::ForkManager?

    As an aside, // is not a comment in Perl.

      // was a typo :). Well, actually I am doing that, but on every node (Ganglia), so if I have 5 nodes x 8 children (forks), that is 40 processes. From what I can see, Parallel::ForkManager is a typical forking module. Does it automatically switch through nodes?

      Update: thank you ++

        I linked to the documentation of Parallel::ForkManager for a reason. It will not switch nodes for you; you need to put the hostnames into the strings of the commands you want to run in parallel.
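        A minimal sketch of that idea, assuming the node names are in @$ganglia_name as in the original post, and with /path/to/job again standing in as a hypothetical placeholder for the real job command:

            use strict;
            use warnings;
            use Parallel::ForkManager;

            # allow all node jobs to run concurrently
            my $pm = Parallel::ForkManager->new(5);

            chomp(my $host = qx(hostname -s));
            foreach my $node (@$ganglia_name) {
                next if $node eq $host;
                $pm->start and next;        # parent continues with the next node
                # the hostname goes into the command string itself;
                # ssh runs the job remotely and returns when it finishes
                system("ssh $node '/path/to/job'");
                $pm->finish;                # child exits
            }
            $pm->wait_all_children;

        The forking stays on the head node; each child process just holds one ssh connection open while the remote job runs.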