firmament has asked for the wisdom of the Perl Monks concerning the following question:
Hello,
I'm using the following primitive script to look at a particular line in a very large (3 GB) file. It takes forever (more than 30-40 minutes), though, so I'm wondering if anyone has suggestions on how to make it faster.
#!/usr/bin/perl
use Tie::File;

$infile = '/path/to/myfile.xml';
tie @array, 'Tie::File', $infile or die $!;
print $array[64366480];
Doing the equivalent in AWK took 3 minutes, but since I need to expand on this, I'd prefer to stay with Perl.
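(For reference, the AWK approach of reading sequentially and exiting at the target line translates directly into plain Perl; a minimal sketch, reusing the hypothetical path and 0-based index from the script above:)

#!/usr/bin/perl
use strict;
use warnings;

my $infile = '/path/to/myfile.xml';   # hypothetical path from above
my $target = 64366480;                # 0-based index, as in the Tie::File example

open my $fh, '<', $infile or die "Can't open $infile: $!";
while (my $line = <$fh>) {
    # $. holds the 1-based number of the line just read
    if ($. == $target + 1) {
        print $line;
        last;    # stop reading; the rest of the file is never touched
    }
}
close $fh;

Unlike Tie::File, which has to read every one of the tens of millions of preceding records through the tie interface while building its offset index, this stops the moment the target line is printed.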
On a side note, I ran out of memory when I used a simple while loop, which surprised me, since a while loop shouldn't slurp the file the way a foreach would?
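(A plain while loop over a filehandle reads one record at a time, so it should not slurp; a foreach does, because it evaluates the readline in list context before the loop starts. A short sketch of the difference, with the same hypothetical path as above:)

#!/usr/bin/perl
use strict;
use warnings;

my $infile = '/path/to/myfile.xml';   # hypothetical path from above
open my $fh, '<', $infile or die "Can't open $infile: $!";

# Scalar context: <$fh> returns one line per iteration, memory stays flat.
while (my $line = <$fh>) {
    # process $line here
}

# List context: <$fh> reads EVERY line into a list before the first
# iteration, so a 3 GB file is slurped into memory all at once.
# for my $line (<$fh>) { ... }

close $fh;

If a while loop still runs out of memory, the usual suspects are an accidental list-context read (for example, assigning <$fh> to an array first) or a changed record separator ($/) that turns the whole file into one giant record.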
Thanks in advance.
Replies are listed 'Best First'.
Re: Speed and memory issue with large files
by BrowserUk (Patriarch) on Mar 19, 2010 at 16:44 UTC
  by ikegami (Patriarch) on Mar 19, 2010 at 17:21 UTC
  by BrowserUk (Patriarch) on Mar 19, 2010 at 19:53 UTC
  by firmament (Novice) on Mar 19, 2010 at 16:53 UTC
Re: Speed and memory issue with large files
by toolic (Bishop) on Mar 19, 2010 at 17:05 UTC
Re: Speed and memory issue with large files
by eff_i_g (Curate) on Mar 19, 2010 at 17:18 UTC