in reply to Question on fork and close.

As mentioned in the other comment(s), perhaps @files is too large? You don't mention what actually happens, though.

Try this instead:

$path = "/var/tmp/decode1";
opendir(DIR, $path) or die "can't open directory $path: $!";
while (defined($file = readdir(DIR))) {
    next unless ($file =~ /\.dat$/);
    open(INPUT, "$path/$file") or die "can't open file $file: $!";
    # do some stuff
    close INPUT;
}
closedir(DIR);
This should help, since readdir() in scalar context returns just the next directory entry rather than the whole listing at once. If this doesn't help, supplying the actual error message you are running into would help us give you a better answer.
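For comparison, here is a minimal sketch of the list-context version that can eat memory on a big directory (it reuses the hypothetical $path from the snippet above):

# List context: readdir() returns every remaining entry at once,
# so @files holds the entire directory listing in memory.
opendir(DIR, $path) or die "can't open directory $path: $!";
@files = grep { /\.dat$/ } readdir(DIR);
closedir(DIR);
foreach $file (@files) {
    # process $file as before
}

On a directory with a huge number of entries, that @files array is the kind of thing that can balloon; the scalar while-loop above never holds more than one name at a time.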

Re: Re: Question on fork and close.
by sbank (Initiate) on May 30, 2001 at 00:25 UTC
    Well long live the error message! It definitely is pointing to my open2 call.
    open2: fork failed: Resource temporarily unavailable at /home/sbank/gzip.pl line 74
    Here's more meat from my script. (You can call it a sloppy joe, because of my poor coding.)
    {
        open(OUTPUT, "$outfile$$") or die "can't open file $outfile$$: $!";
        open2(\*GZIP_IN, \*GZIP_OUT, "$gzip -dc -q $outfile$$") or die "cannot open2 $gzip: $!";
        until ( eof(OUTPUT) ) {
            # read in chunks of 1024.
            read(OUTPUT, $buffer, 1024);
            print GZIP_OUT $buffer;
        }
        close GZIP_OUT;
        select STDOUT; $| = 1; # make unbuffered
        while (<GZIP_IN>) {
            # some other stuff
            print STDOUT "$_";
        }
        close GZIP_IN;
        close INPUT;
        unlink "$outfile";
        unlink "$outfile$$";
    }
    Line 74 is the open2 line.
      Yeah! If you have a compressed file of roughly more than 8K (the exact size depends on your system's pipe buffer), gzip stops, your script stops, and your process table loses 2 slots: the script writes all of the input down GZIP_OUT before it ever reads from GZIP_IN, so once the pipe buffers fill up, gzip blocks on its output. When no slots are left, you get that error. Why only gzip stops and not your script as well beats me.
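      One way out, as a minimal sketch: since (going by the code above) the compressed data is already sitting in the file $outfile$$, you can drop open2 entirely and let gzip read the file itself, so the script only ever reads and nothing can wedge:

      # Read-only pipe: gzip opens the file itself; we never write to
      # its stdin, so the pipe buffer can't deadlock both processes.
      # Assumes $gzip and $outfile are set as in the snippet above.
      open(GZIP_IN, "$gzip -dc -q $outfile$$ |")
          or die "cannot start $gzip: $!";
      while (<GZIP_IN>) {
          # some other stuff
          print STDOUT $_;
      }
      close GZIP_IN;
      unlink "$outfile$$";

      If you really do need to feed gzip from the script (data that never hits disk), you have to interleave the reads and writes, or fork a separate writer process; a write-everything-then-read-everything pattern over open2 will always hang once the data outgrows the pipe buffer.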