in reply to Re: Loading Large files eats away Memory
in thread Loading Large files eats away Memory

Just a quick comment: there may be a problem with Tie::File if the file is somewhere far, far away. For all we know, E: might be a mapped network drive that takes several seconds to respond.

While we can solve half of the problem by deferring the writes to the tied file, we can't do much about the reads - and we certainly don't want to rely on having local disk space nearby for a temporary copy. In that case, loading the entire file into memory might be the only reasonable solution...
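
For what it's worth, the slurping half is just a single sequential read, which is about as gentle on a slow network drive as you can get. A minimal sketch (the mapped-drive path here is hypothetical):

    open my $in, '<:raw', 'E:/bigfile.txt' or die $!;   # hypothetical path on the mapped drive
    my $data = do { local $/; <$in> };                   # one sequential read; whole file ends up in $data
    close $in;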


Re^3: Loading Large files eats away Memory
by BrowserUk (Patriarch) on May 26, 2005 at 09:29 UTC

    In that case I would load the file into memory as a string (25MB + a little bit) and then open that string as a file using perl's "memory file" facility (open on a reference to the scalar).

    I'd then pass the memory filehandle to Tie::File and have it take care of performing the indexing and seeking required to treat the string as an array of lines.

    If the file needs to be modified, all that's required when processing is finished is to rewind the real file and perform a single write to flush the updated string back to it.
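
    That write-back step isn't shown in the snippet below; a minimal sketch (untested, and assuming the tied @lines, $fh, $data and $ARGV[ 0 ] from the example are still in scope) might look like this:

        (tied @lines)->flush;                         # push any deferred writes into $data
        untie @lines;
        close $fh;
        open my $out, '>:raw', $ARGV[ 0 ] or die $!;
        print { $out } $data;                         # one sequential write back to the real file
        close $out or die $!;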

    Using Tie::File's 'memory' option, you can decide how much memory you are willing to trade for speed:

        #! perl -slw
        use strict;
        use Tie::File;

        open IN, '<:raw', $ARGV[ 0 ] or die $!;
        my $data = do{ local $/ = -s( $ARGV[ 0 ] ); <IN> };
        close IN;

        open my $fh, '+<', \$data or die $!;

        tie my @lines, 'Tie::File', $fh, memory => 20_000_000;

        print for @lines[ 100_000, 200_000, 300_000, 400_000 ];

        @lines[ 100_000, 200_000, 300_000, 400_000 ] = ( 'modified' ) x 4;

        print for @lines[ 100_000, 200_000, 300_000, 400_000 ];

        <STDIN>; ## Approx 60 MB here. 25MB file + 20 MB I configured for Tie::File + overhead.

        __END__
        P:\test>460532 bigfile.txt
        xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
        modified
        modified
        modified
        modified

    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
    "Science is about questioning the status quo. Questioning authority".
    The "good enough" may be good enough for the now, and perfection may be unobtainable, but that should not preclude us from striving for perfection, when time, circumstance or desire allow.
      This doesn't work for me... it takes up 56MB when we tie it, but then the sequential access to 100_000 etc. eats memory along the way. By the <STDIN> line my perl is at 95MB. Reads and writes seem to eat memory separately - if I'm evil and ask for $lines[-1] and then write to it, it gets up to 200MB - worse than splitting the slurp...

        How long are the lines in your file?


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
        "Science is about questioning the status quo. Questioning authority".
        The "good enough" may be good enough for the now, and perfection may be unobtainable, but that should not preclude us from striving for perfection, when time, circumstance or desire allow.