in reply to Using fork for reading and processing a large file in ActiveState perl

See character-by-character in a huge file for how to split your file. Unless you have a real multicore computer, using threads won't speed you up; if you do have one, break your file into chunks and hand them off to your threads.
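
To make the chunks-plus-threads idea concrete, here is a minimal boss/worker sketch using threads and Thread::Queue. The file names, batch size, worker count, and process_line() routine are placeholder assumptions, not anything from the OP's code.

#!/usr/bin/perl
# Minimal sketch only -- file names, batch size, and process_line()
# are placeholders standing in for the OP's real per-line work.
use strict;
use warnings;
use threads;
use Thread::Queue;

my $WORKERS    = 4;      # roughly one per core
my $BATCH_SIZE = 1000;   # lines handed to a worker at a time

my $work = Thread::Queue->new();   # batches of input lines
my $done = Thread::Queue->new();   # processed results

sub process_line {                 # stand-in for the real processing
    my ($line) = @_;
    return uc $line;
}

my @workers = map {
    threads->create(sub {
        while (defined(my $chunk = $work->dequeue())) {
            $done->enqueue(process_line($_)) for split /^/, $chunk;
        }
    });
} 1 .. $WORKERS;

# A writer thread drains results so the workers never block on output.
# Results arrive in arbitrary order, not input order.
my $writer = threads->create(sub {
    open my $out, '>', 'results.txt' or die "results.txt: $!";
    while (defined(my $res = $done->dequeue())) {
        print {$out} $res;
    }
    close $out;
});

open my $in, '<', 'huge_input.txt' or die "huge_input.txt: $!";
my @batch;
while (my $line = <$in>) {
    push @batch, $line;
    if (@batch >= $BATCH_SIZE) {
        $work->enqueue(join '', @batch);   # one queue item per batch
        @batch = ();
    }
}
$work->enqueue(join '', @batch) if @batch;
close $in;

$work->enqueue(undef) for @workers;   # one terminator per worker
$_->join() for @workers;
$done->enqueue(undef);                # then stop the writer
$writer->join();

Batching many lines per queue item is deliberate: enqueue one line at a time and the queue locking overhead swamps whatever the extra cores buy you.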

I'm not really a human, but I play one on earth.
Old Perl Programmer Haiku

Re^2: Using fork for reading and processing a large file in ActiveState perl
by Anonymous Monk on Feb 25, 2010 at 14:31 UTC
    Dude, read the other posts before replying; your answer is completely wrong for the OP's problem.

      Same is true for 6 of his last 10 posts.

      Dude, read the other posts before replying,

      I did; no one mentioned to him how to bring in his huge file and split it effectively so the pieces can be handed off to his threads for parallel processing. The OP asked: "I am reading and processing a huge file and recording results to another file which takes hundreds of hours. I want to run this task in multithreads."

      How is it wrong to show how to get his input file split into bite-sized chunks for his threads? I question whether you understand what needs to be done in an actual program. Maybe you didn't actually look at the link I provided? I showed him the various ways to achieve the first step needed for his code. See How to break up a long running process for some parallel processing usage.
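
      For that first step, one way (a sketch only; the file name, the 8 MB block size, and handle_chunk() are assumptions) is to read fixed-size blocks and trim each block back to the last complete line, carrying the partial line into the next pass:

      #!/usr/bin/perl
      # Sketch only -- the file name, block size, and handle_chunk()
      # are assumptions standing in for the OP's real code.
      use strict;
      use warnings;

      my $BLOCK = 8 * 1024 * 1024;   # bytes read per pass

      open my $in, '<', 'huge_input.txt' or die "huge_input.txt: $!";
      my ($buf, $leftover) = ('', '');
      while (read($in, $buf, $BLOCK)) {
          $buf = $leftover . $buf;
          my $cut = rindex($buf, "\n");           # end of last complete line
          if ($cut >= 0) {
              $leftover = substr($buf, $cut + 1); # partial line carried forward
              substr($buf, $cut + 1) = '';
              handle_chunk($buf);                 # e.g. enqueue for a worker thread
          }
          else {
              $leftover = $buf;                   # no newline yet; keep accumulating
          }
      }
      handle_chunk($leftover) if length $leftover;
      close $in;

      sub handle_chunk {                          # stand-in for real per-chunk work
          my ($chunk) = @_;
          my $lines = () = $chunk =~ /\n/g;       # count of complete lines
          print 'chunk of ', length($chunk), " bytes, $lines lines\n";
      }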


      I'm not really a human, but I play one on earth.
      Old Perl Programmer Haiku
        I did

        Your reading skills are lacking. Here the OP says he needs to read a small file and write to a large file. The link you gave, as well as the nonsense you posted ("threads don't help on a single-core processor, but multi-core machines magically reduce IO time"), does not help with that problem.