sbank has asked for the wisdom of the Perl Monks concerning the following question:
I have some code along the lines of:
$path = "/var/tmp/decode1";
opendir(DIR, $path);
@files = grep { /\.dat$/ } readdir(DIR);
closedir(DIR);
foreach $file (@files) {
    open(INPUT, "$path/$file") or die "can't open file $file: $!";
    # do some stuff
    close INPUT;
}

If I have ~20 files in this directory, no problem. But when I work on the order of a couple thousand files, I reach my system limit.
Shouldn't close() release the file descriptor, so that when the loop moves on to the next file, open() gets a fresh file handle? I'm not explicitly forking, so I would think that I'm only working on one file at a time (not all 1000 of them). I guess this is not the case.
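(For what it's worth, here is a small diagnostic sketch, not from the original post, assuming the "do some stuff" part doesn't open handles of its own: watching the number that fileno() reports and checking that close() succeeds shows whether descriptors really are being reused from one iteration to the next.)

#!/usr/bin/perl
use strict;
use warnings;

my $path = "/var/tmp/decode1";
opendir my $dh, $path or die "can't opendir $path: $!";
my @files = grep { /\.dat$/ } readdir $dh;
closedir $dh;

for my $file (@files) {
    open my $in, '<', "$path/$file" or die "can't open $file: $!";
    # If this number stays small and constant, descriptors are being
    # reused; if it keeps climbing, something in the loop is holding
    # handles open.
    print "descriptor for $file: ", fileno($in), "\n";
    # do some stuff
    close $in or warn "close failed for $file: $!";
}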
So instead of trying to use threading (which I couldn't get to work properly), should I just fork a child myself and then wait for it to come back? All I want to do is work on one file at a time in a directory of my choice.
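(A minimal sketch of the fork-and-wait idea described above, assuming one child per file is acceptable: each child opens and processes a single file and then exits, releasing any descriptors it held, while the parent waits before starting the next file.)

#!/usr/bin/perl
use strict;
use warnings;

my $path = "/var/tmp/decode1";
opendir my $dh, $path or die "can't opendir $path: $!";
my @files = grep { /\.dat$/ } readdir $dh;
closedir $dh;

for my $file (@files) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # child: open, process, and exit; its descriptors go away
        # when the child terminates
        open my $in, '<', "$path/$file" or die "can't open $file: $!";
        # do some stuff
        close $in;
        exit 0;
    }
    waitpid $pid, 0;    # parent waits for this child before the next file
}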
TIA
Replies are listed 'Best First'.
Re: Question on fork and close.
by kschwab (Vicar) on May 29, 2001 at 23:57 UTC
by sbank (Initiate) on May 30, 2001 at 00:25 UTC
by Anonymous Monk on May 30, 2001 at 00:37 UTC
Re: Question on fork and close.
by Anonymous Monk on May 29, 2001 at 23:41 UTC
by sbank (Initiate) on May 29, 2001 at 23:58 UTC
by kschwab (Vicar) on May 30, 2001 at 00:19 UTC