I wrote this simple script after some reading online.
To achieve fastest execution time, I want to have the number of jobs == arraysize. (any array in script for looping repeating process) I guess the other words, is instead of running stuff serially (one after the other), i have them in parallel to minimize time taken.
So in this case I wanna delete a directory, delete the directory1 in (/user/home/directory1/), but i am clueless why it does not work, all the files and subdirectories still there. Does the script automatically assign each files/subdirectories to each children?
Let me know if I have overlooked stuff. I am ready to learn . Thank you . :)
#! /usr/bin/perl use Parallel::ForkManager; $directory1 = "/user/home/directory1/"; my $pm = Parallel::ForkManager->new(10); my @files = glob("$directory1/*"); my $number = scalar(@files); for (my $i = 0; $i < $number; $i++) { $pm->start and next; system ("rm", "-fr", $files[i]); exit(0); $pm->finish; } $pm-> wait_all_children;
In reply to Parallel::ForkManager for any array by MissPerl
| For: | Use: | ||
| & | & | ||
| < | < | ||
| > | > | ||
| [ | [ | ||
| ] | ] |