Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hi Monks,
I'm forking off shell wget processes in a loop, wanting them to run in parallel. I've had no success getting the few available parallel Perl modules working on this system, so I decided to just throw the work out to the shell and let wget do the heavy lifting:
```perl
unless ( fork() ) {    # fork and execute, not waiting for a return
    exec("wget -T 10 -O out.txt $somesite ; if [ $? -ne 0 ]; then cp bad.txt out.txt ; fi");
}
```
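For reference, here is a trimmed, self-contained sketch of the whole loop (the @sites array and its sample URLs are placeholders for however the real script supplies them, and I've escaped the $? so the shell, rather than Perl's string interpolation, expands it):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @sites = ('http://example.com/a', 'http://example.com/b');   # placeholder URLs

for my $somesite (@sites) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    unless ($pid) {    # child: hand the command to the shell and never return
        # \$? is escaped so the shell tests wget's exit status, instead of
        # Perl interpolating its own $? into the command string
        exec("wget -T 10 -O out.txt $somesite ; "
           . "if [ \$? -ne 0 ]; then cp bad.txt out.txt ; fi");
        die "exec failed: $!";    # reached only if exec itself fails
    }
    # parent just moves on to the next URL; nothing here ever wait()s
}
```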
The code works, but it leaves behind (sh) processes that are never removed, and eventually I run out of processes under my ulimit.
When I use a system call instead, it works, but naturally it runs serially, which is too slow.
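That serial version is essentially the same command run through system(), which blocks until the shell finishes (again a sketch, reusing the placeholder @sites from above):

```perl
# Serial: system() forks, runs the command, and wait()s for it, so each
# download finishes (and its child is reaped) before the next one starts.
for my $somesite (@sites) {
    system("wget -T 10 -O out.txt $somesite ; "
         . "if [ \$? -ne 0 ]; then cp bad.txt out.txt ; fi");
}
```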
What am I missing that keeps the (sh) processes from closing?
Replies are listed 'Best First'.
Re: forking and exec
by zentara (Cardinal) on Oct 22, 2013 at 13:34 UTC
by Anonymous Monk on Oct 22, 2013 at 13:59 UTC
by kennethk (Abbot) on Oct 22, 2013 at 16:31 UTC
by Anonymous Monk on Oct 22, 2013 at 19:51 UTC
Re: forking and exec
by jellisii2 (Hermit) on Oct 22, 2013 at 13:35 UTC
by Anonymous Monk on Oct 22, 2013 at 13:51 UTC
Re: forking and exec
by Anonymous Monk on Oct 22, 2013 at 13:34 UTC