in reply to Using fork for reading and processing a large file in ActiveState perl

How big is "huge"?

What are you doing between reading the data in and writing it out?

(I assume you must be doing something complex, because on my very ordinary system with a so-so disk, Perl can read & write 3 GB/minute, which would make your file at least 18 terabytes.)



Re^2: Using fork for reading and processing a large file in ActiveState perl
by vit (Friar) on Feb 24, 2010 at 19:02 UTC
    OK, let's put it this way.
    I want to read a small file into an array once. Then, in a loop, I do HTTP requests, process the results, and write a portion to the file. Each portion is small.
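A minimal sketch of the pattern described above: read the small file into an array once, open the output file once in append mode, then loop. The file names are hypothetical, and `process()` is a stand-in for the real HTTP request (e.g. via LWP::UserAgent) plus whatever computation follows it; the demo-setup lines at the top exist only to make the sketch self-contained.

```perl
use strict;
use warnings;

# Demo setup only: create a small input file (hypothetical names).
my $in_file  = 'input_demo.txt';
my $out_file = 'results_demo.txt';
open my $setup, '>', $in_file or die "Cannot create $in_file: $!";
print {$setup} "alpha\nbeta\n";
close $setup;
unlink $out_file;    # start with a fresh output file for the demo

# 1. Read the small file into an array once.
open my $in, '<', $in_file or die "Cannot open $in_file: $!";
chomp( my @lines = <$in> );
close $in;

# 2. Open the output file once, in append mode, rather than
#    reopening it on every iteration.
open my $out, '>>', $out_file or die "Cannot open $out_file: $!";

# 3. Loop: do the per-item work and append each small result.
for my $line (@lines) {
    my $result = process($line);    # stand-in for HTTP request + computation
    print {$out} "$result\n";
}
close $out;

# Stand-in for the real work; replace with the actual request/processing.
sub process {
    my ($item) = @_;
    return uc $item;
}
```

Appending each small result as it is produced keeps memory flat no matter how long the loop runs, which is usually all that is needed here; forking buys nothing unless the per-item work itself is the bottleneck.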

      So why call it a huge file? And what is stopping you?

        In the loop I do a lot of computations and write (append) results to a file.