in reply to Performance oddity when splitting a huge file into an AoA
You're running this under mod_perl or FastCGI? Cos I can't reproduce your findings using straight perl.
#! perl -sw
use 5.010;
use strict;
use Time::HiRes qw[ time ];

sub x {
    open my $fh, '<', shift or die $!;
    my @AoA;
    push @AoA, [ split ',' ] while <$fh>;
    close $fh;
    return scalar @AoA;
}

for ( 1 .. 5 ) {
    my $start = time;
    printf "Records: %d in %.3f seconds\n",
        x( sprintf 'junk%d.dat', 1 + ( $_ & 1 ) ),
        time() - $start;
}
__END__
c:\test>junk
Records: 400000 in 5.884 seconds
Records: 300000 in 4.752 seconds
Records: 400000 in 4.599 seconds
Records: 300000 in 3.473 seconds
Records: 400000 in 4.569 seconds

c:\test>junk
Records: 400000 in 4.826 seconds
Records: 300000 in 3.408 seconds
Records: 400000 in 4.613 seconds
Records: 300000 in 3.481 seconds
Records: 400000 in 4.557 seconds
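The benchmark above reads two pre-existing comma-separated files, junk1.dat (400,000 records) and junk2.dat (300,000 records), alternating between them via `1 + ( $_ & 1 )`. The post doesn't show how those files were built, so here is a minimal sketch for generating comparable test data; the field count and field contents are my assumptions, not from the original post:

```perl
#! perl -sw
use strict;
use warnings;

# Write $count lines of 10 comma-separated integer fields each.
# File names and record counts match the benchmark; the field
# layout (10 random integers per line) is an assumption.
sub make_data {
    my ( $file, $count ) = @_;
    open my $fh, '>', $file or die "open '$file': $!";
    for ( 1 .. $count ) {
        print {$fh} join( ',', map { int rand 100_000 } 1 .. 10 ), "\n";
    }
    close $fh or die "close '$file': $!";
}

make_data( 'junk1.dat', 400_000 );
make_data( 'junk2.dat', 300_000 );
```

With files like these in place, the benchmark script above can be run as-is.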
Replies are listed 'Best First'.
Re^2: Performance oddity when splitting a huge file into an AoA
  by Xenofur (Monk) on May 05, 2009 at 21:05 UTC
  by BrowserUk (Patriarch) on May 05, 2009 at 21:29 UTC
  by Xenofur (Monk) on May 05, 2009 at 22:32 UTC
  by BrowserUk (Patriarch) on May 06, 2009 at 08:27 UTC
  by Xenofur (Monk) on May 07, 2009 at 09:40 UTC