in reply to Re^2: Loading Large files eats away Memory
in thread Loading Large files eats away Memory
In that case I would load the file into memory as a string (25MB plus a little bit) and then open that string as a file using Perl's "memory file" facility.
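(Aside, not in the original node: the "memory file" facility is just perl 5.8's ability to open a filehandle on a reference to a scalar, so no module is required for that part. A minimal sketch:

my $string = "first line\nsecond line\n";
open my $fh, '<', \$string or die $!;  ## read the scalar as though it were a file
print scalar <$fh>;                    ## prints "first line"

)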
I'd then pass the memory filehandle to Tie::File and have it take care of performing the indexing and seeking required to treat the string as an array of lines.
If the file needs to be modified, updating it just requires rewinding the file and performing a single write once processing is finished (a sketch of this write-back step follows the code below).
Using Tie::File's 'memory' option, you can decide how much memory you are prepared to trade for speed:
#! perl -slw
use strict;
use Tie::File;

## Slurp the whole file into a scalar in a single record-sized read.
open IN, '<:raw', $ARGV[ 0 ] or die $!;
my $data = do{ local $/ = \( -s $ARGV[ 0 ] ); <IN> };
close IN;

## Open the scalar as an in-memory file and tie it as an array of lines.
open my $fh, '+<', \$data or die $!;
tie my @lines, 'Tie::File', $fh, memory => 20_000_000;

print for @lines[ 100_000, 200_000, 300_000, 400_000 ];

@lines[ 100_000, 200_000, 300_000, 400_000 ] = ( 'modified' ) x 4;

print for @lines[ 100_000, 200_000, 300_000, 400_000 ];

<STDIN>; ## Approx 60 MB here. 25MB file + 20MB I configured for Tie::File + overhead.

__END__
P:\test>460532 bigfile.txt
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
modified
modified
modified
modified
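As for the write-back step mentioned above, here is a minimal sketch (my addition, assuming $data, $fh and @lines are as in the code above). Tie::File writes through the in-memory filehandle into $data, so once you untie, a single write puts the modified contents back on disk; reopening the file for output stands in for the rewind:

untie @lines;                           ## flush any writes Tie::File may have deferred
close $fh;
open my $out, '>:raw', $ARGV[ 0 ] or die $!;
print {$out} $data;                     ## one write updates the whole file
close $out or die $!;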
Replies are listed 'Best First'.

Re^4: Loading Large files eats away Memory by ivancho (Hermit) on May 26, 2005 at 10:37 UTC
    by BrowserUk (Patriarch) on May 27, 2005 at 07:03 UTC
        by ivancho (Hermit) on May 27, 2005 at 21:10 UTC