KevinBr has asked for the wisdom of the Perl Monks concerning the following question:
The problem is, groups of files tend to arrive in this directory all at once. If 10 files arrive and I send them in an array to the afork routine that limits them to running 3 at a time, I must wait until all 10 files complete before I can scan the directory again for any new files. This is troubling when one file in that group of 10 takes 20 minutes to complete but the rest finish in a few seconds: I am forced to wait for the one large file before I can begin processing any new arrivals.

```perl
sub afork (@$&) {
    # First field  = array ref, second field = max number of processes
    # to run at the same time, third field = subroutine to run against
    # each array element.
    my ( $data, $max, $code ) = @_;
    my $c = 0;
    foreach my $item (@$data) {
        wait unless ++$c <= $max;            # block once $max children are running
        die "Fork failed: $!\n" unless defined( my $pid = fork );
        exit $code->($item) unless $pid;     # child: run the callback, then exit
    }
    1 until -1 == wait;                      # block until every child is reaped
}

while (1) {
    @FILES = `ls -1 $DIRECTORY`;
    chomp @FILES;                            # strip the newline ls leaves on each name
    afork( \@FILES, 3, \&process_file );
}
```
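None of the following appears in the original post; it is just a sketch of one way to remove the batch barrier described above. Instead of handing afork a fixed batch and blocking until the whole batch is reaped, the parent can keep at most three workers alive, reap finished children with a non-blocking `waitpid`, and rescan the directory on every pass. The `$DIRECTORY` path, the `$MAX` limit, the `%seen` bookkeeping, and the `process_file()` stub are all illustrative assumptions, not the OP's real values; a real version would also move or delete files after processing so `%seen` does not grow without bound.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX ':sys_wait_h';

my $DIRECTORY = '/path/to/incoming';   # placeholder for the OP's directory
my $MAX       = 3;                     # at most three workers at once

my %running;   # pid  => file currently being processed
my %seen;      # file => 1 once it has been dispatched

while (1) {
    # Reap any children that have finished, without blocking on the
    # slow ones -- this is what lets the loop keep moving.
    while ( ( my $pid = waitpid( -1, WNOHANG ) ) > 0 ) {
        delete $running{$pid};
    }

    # Rescan the directory on every pass, so files that arrive while a
    # long job is still running are noticed immediately.
    opendir my $dh, $DIRECTORY or die "Can't open $DIRECTORY: $!";
    my @files = grep { -f "$DIRECTORY/$_" && !$seen{$_} } readdir $dh;
    closedir $dh;

    for my $file (@files) {
        last if keys %running >= $MAX;    # pool is full; retry next pass
        $seen{$file} = 1;
        die "Fork failed: $!\n" unless defined( my $pid = fork );
        if ( $pid == 0 ) {                # child: handle one file, then exit
            process_file($file);
            exit 0;
        }
        $running{$pid} = $file;           # parent: track the worker
    }

    sleep 1;                              # don't spin when the directory is idle
}

sub process_file {                        # placeholder for the OP's real handler
    my ($file) = @_;
    print "processing $file\n";
}
```

The key difference from the batch version is `waitpid(-1, WNOHANG)`, which returns immediately when no child has exited, so a 20-minute job ties up one worker slot while the other two keep draining new arrivals.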
Replies are listed 'Best First'.
Re: read directory, fork processes
by ikegami (Patriarch) on Feb 24, 2010 at 01:24 UTC

Re: read directory, fork processes
by jwkrahn (Abbot) on Feb 24, 2010 at 02:59 UTC

Re: read directory, fork processes
by BrowserUk (Patriarch) on Feb 24, 2010 at 09:10 UTC
    by roboticus (Chancellor) on Feb 24, 2010 at 12:13 UTC
    by KevinBr (Acolyte) on Feb 24, 2010 at 22:34 UTC
    by roboticus (Chancellor) on Feb 25, 2010 at 12:30 UTC

Re: read directory, fork processes
by cdarke (Prior) on Feb 24, 2010 at 11:50 UTC

Re: read directory, fork processes
by zentara (Cardinal) on Feb 24, 2010 at 13:16 UTC
    by KevinBr (Acolyte) on Feb 24, 2010 at 15:24 UTC