in reply to Question on fork and close.

Could it be that your memory (the @files array) is exhausted? Or does your "some stuff" with the system (what is it, exactly?) allocate resources and never free them? Or is it a CGI script which needs longer for 10,000 files than for 20, so a timeout occurs? Did you account for 20,000 files possibly needing something like 500,000 times as long to open as 20 files do?

Re: Re: Question on fork and close.
by sbank (Initiate) on May 29, 2001 at 23:58 UTC

    I doubt that the @files array is exhausted, so most likely it is my "some stuff" that is causing the problem. (This is not in a CGI environment; it's just a straight script.)

    I'm guessing that this is the offending line.

    open2(\*GZIP_IN, \*GZIP_OUT, "$gzip -dc -q $outfile$$") or die "cannot open2 $gzip: $!";

    Again, this should only be one fork (for the one file that gzip is acting upon). At least I think it should be just one fork. (Obviously it isn't. :) )
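    If that open2 sits inside a loop over @files, each call does fork a new gzip child, and any handles or children that are never cleaned up will pile up as the file count grows. Here is a minimal sketch of the per-iteration cleanup; the loop, the $gzip path, and the @files initialization are my assumptions for illustration, not your actual code:

    use strict;
    use warnings;
    use IPC::Open2;

    my $gzip  = '/usr/bin/gzip';   # path assumed for illustration
    my @files = @ARGV;             # stand-in for the real @files list

    for my $outfile (@files) {
        # Note: open2() raises an exception on failure rather than
        # returning false, so the trailing "or die" never fires.
        my $pid = open2( \*GZIP_IN, \*GZIP_OUT, "$gzip -dc -q $outfile$$" );

        # ... do the reading/writing with GZIP_IN and GZIP_OUT here ...

        close GZIP_OUT or warn "close GZIP_OUT failed: $!";
        close GZIP_IN  or warn "close GZIP_IN failed: $!";
        waitpid $pid, 0;   # reap the child so zombies and descriptors don't accumulate
    }

    Closing both handles and calling waitpid after every file keeps the number of live children (and open file descriptors) at one at a time, which is usually what "fork and close" trouble at thousands of files comes down to.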

      Opening a pipe both from and to "gzip -dc"? Perhaps if you are doing line-mode input from the compressed source, that's the problem:
      while (<WHEREEVERTHECOMPRESSEDDATAIS>) {
          # $_ could be potentially huge, if it doesn't
          # contain a newline somewhere soon
          print GZIP_IN;
      }
      See read() or sysread() if that's it. But then, that's just a guess...good luck.
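      In case it helps, here is a rough sketch of that suggestion using sysread() with a fixed-size buffer, so no single read can balloon the way a newline-less "line" can (the handle names are the placeholders from above, and the 64 KB buffer size is an arbitrary choice):
      my $buf;
      while ( sysread( WHEREEVERTHECOMPRESSEDDATAIS, $buf, 64 * 1024 ) ) {
          # at most 64 KB per iteration, newlines or not
          print GZIP_IN $buf;
      }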