Hello wise Monks,
My current project involves some parallel computing on a cluster. I use 'system' within my Perl script to pass instructions to the command line, which works well enough. My problem, however, is that for commands that distribute the job to other nodes on the cluster, control returns to my script before those jobs have completed. I need the script to wait for completion before proceeding. As far as I can tell, 'system' waits only as long as it takes the head node to distribute the job, and then returns while the jobs are still being processed on the other nodes.
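For example (a minimal sketch of what I mean; 'myjob.sh' is just a placeholder for whatever the wrapper scripts submit):

# 'system' blocks only until the launched command itself exits.
# qsub exits as soon as the job is queued, so control comes back
# to the script while the job is still running on the nodes.
system "qsub myjob.sh";   # returns after submission, not after completion
print "reached here, but the job is probably still running\n";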
My script is something like this:
#!/usr/bin/perl
use strict;
use warnings;
...
# each wrapper script submits its job(s) to the cluster via qsub
print "File split: STARTED\n";
system "/share/apps/RAMMCAP-2009-1106/qsub-scripts/qsub_fasta_split.pl -i $file_in -o read-split_$file_in -n 46";
print "File split: COMPLETED\n";

my $readsplitf = "read-split_$file_in/output.1";

print "rRNA blast: STARTED\n";
system "/share/apps/RAMMCAP-2009-1106/qsub-scripts/qsub_rRNA_hmm_run.pl -i $readsplitf -o rRNA_hmm_$file_in";
print "rRNA blast: FINISHED\n";
...
Subsequent steps after the initial RAMMCAP command do not execute properly, because their input is not yet available when they run. I know there are ways to implement some kind of 'wait' on the cluster, but I do not know how. Any advice would be much appreciated.
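The best I have come up with so far is to poll 'qstat' between steps (a rough, untested sketch, assuming an SGE- or Torque-style scheduler where 'qstat -u <user>' prints nothing once I have no pending or running jobs; the 30-second interval is arbitrary). I gather SGE's 'qsub -sync y' would make qsub itself block until the job finishes, but I don't know whether the RAMMCAP wrapper scripts can be told to use it.

use strict;
use warnings;

# Untested sketch: block until `qstat -u <me>` reports no jobs,
# i.e. everything I submitted has left the queue.
sub wait_for_cluster_jobs {
    my $user = getpwuid($<);                      # current login name
    while (`qstat -u $user 2>/dev/null` =~ /\S/) {
        sleep 30;                                 # poll every 30 seconds
    }
}

# usage: call after each submission step, e.g.
#   system ".../qsub_fasta_split.pl -i $file_in -o read-split_$file_in -n 46";
#   wait_for_cluster_jobs();
#   print "File split: COMPLETED\n";

Does this look sane?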
Thank you very much.