in reply to Parsing very big files GB

There's something else causing your problem, but you haven't shown us what it is. while (<$fh>) will read the file line by line, not all at once.

Are you putting your read lines into an array or a hash?
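
For illustration, a minimal sketch of the difference (the file name is hypothetical):

    use strict;
    use warnings;

    # 'huge.log' is a hypothetical file name for illustration.
    open my $fh, '<', 'huge.log' or die "Cannot open huge.log: $!";

    # Reads one line at a time; memory use stays flat no matter how
    # large the file is.
    while (my $line = <$fh>) {
        # process $line here ...
    }

    # By contrast, this slurps every line into memory at once and can
    # exhaust RAM on a multi-gigabyte file:
    # my @lines = <$fh>;

    close $fh;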

Re^2: Parsing very big files GB
by rootcho (Pilgrim) on Aug 30, 2007 at 23:46 UTC
    That was what I thought too! To be sure I'm not leaking memory, I do "undef $var" on all hashes, arrays and objects after they are no longer needed.

      Show us more of the code. It sounds like some restructuring is in order. In particular, anything that you "undef" ought to be declared inside the while loop, so that it is cleaned up automatically when it goes out of scope at the end of each iteration. Only variables that need to retain content after the loop should be declared outside it.

      If you are not using strictures already, I strongly recommend that you add "use strict; use warnings;" to your code.
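
      A minimal sketch of the kind of restructuring I mean (the file name and field names are hypothetical):

          use strict;
          use warnings;

          open my $fh, '<', 'huge.log' or die "Cannot open huge.log: $!";

          my %seen;    # declared outside the loop: retains content afterwards

          while (my $line = <$fh>) {
              chomp $line;

              # Declared inside the loop: goes out of scope and is freed at
              # the end of every iteration, so no explicit undef is required.
              my %fields;
              @fields{qw(host time status)} = split /\s+/, $line, 3;

              $seen{ $fields{host} }++;
          }

          close $fh;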


      DWIM is Perl's answer to Gödel
        It was my error. I don't know exactly what, because I changed a lot of things..
        But now the memory usage is normal with open().
        Definitely my error.
        Still puzzled!
        Thanks for the help.