in reply to Using threads to run multiple external processes at the same time
1) Dump your arrays into files (probably CSV) in a directory shared e.g. via NFS or sftp.
2) SSH / telnet to the compute boxes (via 'expect' or equivalent Perl modules) to start work packages:
"R commandfile <infile >outfile && rename outfile outfile.done"
3) If memory requirements are low, you could start all R processes at once. If it is just 'nice'd CPU load, I would not care.
4) If memory requirements would cause swapping, you could use the Unix 'batch' command to queue the work packages.
5) If the work packages include the LaTeX table typesetting, you could have the result 'mail'ed to you.
6) Alternatively, a second script can watch the NFS share and process finished R output.
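Steps 2-4 could be sketched in shell roughly as below. The host list, paths, and the exact R invocation are assumptions for illustration, and `mv` stands in for 'rename' since that command varies between systems:

```shell
#!/bin/sh
# Sketch of dispatching work packages over SSH (steps 2-4).
# HOSTS, SHARE, and the R invocation are illustrative assumptions.
HOSTS="${HOSTS:-}"            # e.g. HOSTS="box1 box2 box3"
SHARE="${SHARE:-/mnt/share}"  # directory shared via NFS

# Build the command a worker box runs: R reads the work package,
# writes its output, and marks completion by renaming the result.
remote_cmd() {
    infile=$1; outfile=$2
    printf 'nice R --no-save <%s >%s && mv %s %s.done' \
        "$infile" "$outfile" "$outfile" "$outfile"
}

i=1
for host in $HOSTS; do
    # One work package per host; '&' starts them all at once (step 3).
    ssh "$host" "$(remote_cmd "$SHARE/work$i.R" "$SHARE/out$i.txt")" &
    i=$((i + 1))
done
wait   # block until every remote R process has finished
```

With swapping a concern (step 4), the `ssh ... &` line would instead pipe the command to 'batch' on the remote box.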
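The watcher from step 6 could be a simple polling loop; the share path, the '.done' suffix convention, and the `process_result` hook are assumptions here, not part of the original recipe:

```shell
#!/bin/sh
# Sketch of the second script from step 6: poll the shared directory
# and handle each finished work package exactly once.
SHARE="${SHARE:-/mnt/share}"

process_result() {
    # Placeholder for the real post-processing (e.g. collecting the
    # LaTeX tables); here it just records which file was handled.
    echo "processed: $1"
}

watch_once() {
    # One polling pass over the share.
    for done_file in "$SHARE"/*.done; do
        [ -e "$done_file" ] || continue    # glob matched nothing
        process_result "$done_file"
        mv "$done_file" "$done_file.seen"  # avoid re-processing
    done
}
```

A driver would call `watch_once` from something like `while :; do watch_once; sleep 60; done`, or from a cron job.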
Don't spend too much effort reinventing infrastructure!