in reply to Re: Performance oddity when splitting a huge file into an AoA
in thread Performance oddity when splitting a huge file into an AoA

Yeah, I'm running under "use autodie;" and it behaves the same no matter which file is the first one.

Re^3: Performance oddity when splitting a huge file into an AoA
by roubi (Hermit) on May 03, 2009 at 14:39 UTC
    Okay. Since you are not getting much help so far, here is a second half-brained idea: maybe what you are seeing is related to the deallocation of the @orders array created during the previous run. You could test that theory by keeping those arrays around and seeing whether your timings change at all.
      Huh, you're right. When I make process_dump_file return a reference to the array and push that onto an array outside, the benchmark suddenly looks like this:
      Extracted 308272 orders.
      CSV time:  3 wallclock secs ( 3.11 usr + 0.48 sys =  3.59 CPU)
      Extracted 301468 orders.
      CSV time:  7 wallclock secs ( 6.41 usr + 0.23 sys =  6.64 CPU)
      Extracted 316912 orders.
      CSV time:  7 wallclock secs ( 6.02 usr + 0.22 sys =  6.23 CPU)
      Extracted 426854 orders.
      CSV time:  8 wallclock secs ( 8.08 usr + 0.31 sys =  8.39 CPU)
      Duration: 37 wallclock secs (32.28 usr + 2.36 sys = 34.64 CPU)
      This is a bit weird, but already a lot better. Thanks for the suggestion. :)
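      For anyone landing here later, a minimal sketch of the fix being described. The function name process_dump_file comes from the thread; the file names, stand-in parsing, and @all_orders are hypothetical, since the original code isn't shown. The point is that returning a reference and keeping it alive moves the cost of freeing the big AoA out of the next timed run:

      #!/usr/bin/perl
      use strict;
      use warnings;

      # Build a large array-of-arrays and return a REFERENCE to it,
      # so the caller controls when it is deallocated.
      sub process_dump_file {
          my ($file) = @_;
          my @orders;
          # ... in the real code: read $file line by line, split each
          # line, and push the resulting array ref onto @orders ...
          push @orders, [ split /,/ ] for ( "a,b,c", "d,e,f" );  # stand-in data
          return \@orders;
      }

      my @all_orders;
      for my $file (qw(dump1.csv dump2.csv)) {
          my $ref = process_dump_file($file);
          printf "Extracted %d orders from %s\n", scalar @$ref, $file;
          # Keeping the reference alive defers freeing the AoA until
          # program exit, instead of charging it to the next iteration.
          push @all_orders, $ref;
      }

      If @orders had instead gone out of scope at the end of each call, Perl would walk and free every inner array ref before the next run starts, which is plausibly the hidden cost in the earlier timings.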