in reply to Reading from large files

What is a better method for this?

Hi zer, see the comparison below using the Array::FileReader, Slurp, and File::Content modules and the do function:

use Array::FileReader;
use Slurp;
use Benchmark 'cmpthese';
use File::Content;

cmpthese(-1, {
    Array   => 'tie @foo, Array::FileReader, "test.txt"',
    Slurp   => 'my @array = Slurp::to_array("test.txt")',
    do      => 'do {local $/, "test.txt"}||die ($!)',
    Content => 'my $o_fil = File::Content->new("test.txt")',
});

__END__
             Rate     Slurp Content  Array     do
Slurp      3.94/s        --   -100%  -100%  -100%
Content     997/s    25229%      --   -82%  -100%
Array      5467/s   138774%    448%     --   -99%
do       466126/s 11839502%  46643%  8425%     --
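Note that the do entry, as written, only localizes $/ and returns the literal string "test.txt"; it never opens or reads the file, which is likely why its rate is so many orders of magnitude higher than the others. For reference, here is a minimal sketch of the usual local-$/ slurp idiom; it reuses the test.txt name from the benchmark but is not part of the timings above:

use strict;
use warnings;

# Slurp the whole file in one read by undefining the input record
# separator for the duration of the do block.
my $content = do {
    local $/;                                            # slurp mode
    open my $fh, '<', 'test.txt' or die "Cannot open test.txt: $!";
    <$fh>;
};

print length($content), " bytes read\n";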

The size of the test.txt file is nearly 800 KB.

Regards,
Velusamy R.


eval"print uc\"\\c$_\""for split'','j)@,/6%@0%2,`e@3!-9v2)/@|6%,53!-9@2~j';