As others have said, the simple solution is to use fork() to create a subprocess to handle each file. The problem with that approach is that if you have thousands of files, it will create thousands of subprocesses, which will bring your computer to a crawl.
If your requirements are simple, and you are running under unix/linux, then you could use the -P argument to xargs.
e.g. your processing script takes one input file, and can calculate the output file for itself:

cd ~/directory/with/files/to/process
ls -1 | xargs -n1 -P 10 perl -w process_script.pl -args
Corion suggested the use of Dominus's runN script, which does broadly the same thing as xargs.
If I were in your situation, and my requirements were too complex for xargs, then I would write a script around Parallel::ForkManager.
This makes it possible to have more complex logic to decide what the output file should be for each input file, or to process some files in different ways. You also get a nice callback mechanism to handle errors and the like, all while the number of concurrent worker processes is limited to a number you specify.
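A minimal sketch of that approach (the input names, the concurrency limit of 10, and the "processing" step here are made up for illustration; in real use the child would run your actual processing code on each file):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use Parallel::ForkManager;   # from CPAN

my $dir   = tempdir(CLEANUP => 1);
my @files = map { "input$_" } 1 .. 5;   # stand-ins for your real files
my @failed;

my $pm = Parallel::ForkManager->new(10);   # at most 10 workers at once

# Callback: runs in the parent each time a child finishes,
# so you can collect failures and the like.
$pm->run_on_finish(sub {
    my ($pid, $exit_code, $ident) = @_;
    push @failed, $ident if $exit_code != 0;
});

for my $file (@files) {
    $pm->start($file) and next;   # parent: remember $file as ident, move on
    # --- child: decide the output name, do the work ---
    open my $fh, '>', "$dir/$file.out" or $pm->finish(1);
    print {$fh} "processed $file\n";
    close $fh;
    $pm->finish(0);               # child exits, reporting success
}
$pm->wait_all_children;
```

The start()/finish() pair brackets each child process, and new(10) is what caps the number running at once; set it to 1 to debug the loop serially.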
Of course, you could achieve the same by writing your own code using fork and signals, but why bother when there is already something available and debugged on CPAN?
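For comparison, the do-it-yourself version needs only core Perl: fork a child per file, and block in wait() once the limit is reached. This is a sketch (the file list and the touch-a-file "work" are invented for illustration), and it shows how much of ForkManager's bookkeeping you would be reimplementing:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

my $dir     = tempdir(CLEANUP => 1);
my @files   = map { "file$_" } 1 .. 8;   # stand-ins for your real files
my $max     = 4;                         # concurrency limit
my $running = 0;

for my $file (@files) {
    if ($running >= $max) { wait(); $running-- }   # throttle: reap one first
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # --- child: do the work for this file, then exit ---
        open my $fh, '>', "$dir/$file.done" or exit 1;
        close $fh;
        exit 0;
    }
    $running++;                                    # parent: count the child
}
1 while wait() != -1;                              # reap remaining children
```

Note there is no error reporting here; wiring up per-child exit-status handling is exactly the part ForkManager's run_on_finish callback gives you for free.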
In reply to Re: Running parallel processes without communication
by chrestomanci
in thread Running parallel processes without communication
by vit