Operating systems limit how many files are opened *at a time*, but they don't limit how many files a process can open over its lifetime.
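As a quick illustration, here is a minimal sketch (the temp-file path and the 100_000 iteration count are arbitrary, not from the original post) showing that the per-process descriptor limit ("ulimit -n") only caps how many handles are open simultaneously, not how many a process opens over its lifetime:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Arbitrary file for the demo.
    my $path = '/tmp/limit_demo.txt';
    open my $out, '>', $path or die "Cannot create $path: $!";
    print {$out} "hello\n";
    close $out;

    # Open and close far more handles than the descriptor limit allows
    # to be open at once; because each one is closed before the next
    # open, the limit is never hit.
    for my $i (1 .. 100_000) {
        open my $in, '<', $path or die "open #$i failed: $!";
        my $line = <$in>;
        close $in;
    }
    print "Opened and closed 100000 handles without hitting a limit\n";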
I don't see anything that would cause your problem, but I do have some issues with your code.
Use lexical file handles. They get cleaned up and closed automatically, even if the subroutine exits abnormally.
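To illustrate the difference, a small sketch (the file name is made up) contrasting a bareword handle with a lexical one:

    use strict;
    use warnings;

    # Bareword handle: package-global, stays open after an early
    # return until something explicitly closes or reuses it.
    sub read_global {
        open(FH, '<', 'data.txt') or return;
        my @lines = <FH>;
        return if !@lines;   # FH is still open here
        close FH;
        return @lines;
    }

    # Lexical handle: closed automatically when $fh goes out of
    # scope, including on an early return or a die.
    sub read_lexical {
        open my $fh, '<', 'data.txt' or return;
        my @lines = <$fh>;
        return @lines;       # $fh is closed as the sub exits
    }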
What is open(STDIN, "-"); supposed to do? "-" means STDIN, so you're opening STDIN to be the same as STDIN?!
Also, calling dealError from within dealError is problematic.
Finally, what is dealError trying to do? Short of a missing floppy in the drive, I don't know of an error that can be "checked" and fixed.
I'd recommend:
    sub proc {
        my ($file) = @_;
        # (...)
        my $pathC = $path . "/$file";

        open my $input, "zcat $pathC|"
            or do {
                print STDERR "\n!!ATTENTION:\n\tUnable to run zcat: $!\n";
                return;
            };

        while (my $line = <$input>) {
            # (...) data processing irrelevant to the problem
        }

        # Not needed. Will happen automatically
        # when $input goes out of scope.
        #close($input);
    }
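For completeness, a hypothetical driver loop ($path and the directory contents are assumptions, not from the original post) showing that each call opens one zcat pipe and the lexical $input inside proc() releases it before the next file is handled, so only one handle is open at a time:

    # Process every .gz file in $path, one at a time.
    opendir my $dh, $path or die "Cannot read $path: $!";
    my @files = grep { /\.gz\z/ } readdir $dh;
    closedir $dh;

    proc($_) for @files;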