I have some code along the lines of:
    $path = "/var/tmp/decode1";
    opendir(DIR, $path) or die "can't open directory $path: $!";
    @files = grep { /\.dat$/ } readdir(DIR);
    closedir(DIR);
    foreach $file (@files) {
        open(INPUT, "$path/$file") or die "can't open file $file: $!";
        # do some stuff
        close INPUT;
    }

If I have ~20 files in this directory, no problem. But when I work on the order of a couple thousand files, I hit my system's limit on open file descriptors.
Shouldn't close() close the file descriptor, so that when the loop moves on to the next file, open() gets a fresh handle? I'm not explicitly forking, so I'd think I'm only working on one file at a time, not all 1000 of them. I guess this is not the case.
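For reference, here is a sketch of the same loop using lexical filehandles, which Perl closes automatically when they go out of scope at the end of each iteration (untested; the per-file work is elided):

    use strict;
    use warnings;

    my $path = "/var/tmp/decode1";
    opendir(my $dh, $path) or die "can't open directory $path: $!";
    my @files = grep { /\.dat$/ } readdir($dh);
    closedir($dh);

    foreach my $file (@files) {
        # lexical filehandle: released automatically when $fh
        # goes out of scope at the end of the iteration
        open(my $fh, '<', "$path/$file") or die "can't open file $file: $!";
        # do some stuff
        close($fh) or warn "close failed on $file: $!";
    }

If descriptors still run out with this version, the leak would have to be somewhere in the "do some stuff" part.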
So instead of trying to use threading (which I couldn't get to work properly), should I just fork children myself and then wait for each child to come back? All I want to do is work on one file at a time in a directory of my choice.
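What I have in mind is something like this (a rough, untested sketch; the per-file work is again elided):

    use strict;
    use warnings;

    my $path = "/var/tmp/decode1";
    opendir(my $dh, $path) or die "can't open directory $path: $!";
    my @files = grep { /\.dat$/ } readdir($dh);
    closedir($dh);

    foreach my $file (@files) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;

        if ($pid == 0) {
            # child: open and process exactly one file, then exit;
            # the OS reclaims every descriptor the child held
            open(my $fh, '<', "$path/$file")
                or die "can't open file $file: $!";
            # do some stuff
            close($fh);
            exit 0;
        }

        # parent: wait for this child before starting the next one,
        # so only one file is ever being worked on at a time
        waitpid($pid, 0);
    }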
TIA
In reply to Question on fork and close. by sbank