in reply to Re: Efficiency and Large Arrays
in thread Efficiency and Large Arrays

Yes, it would be "expensive", but if you don't have the memory to keep all the records in, then sorting is not only the faster way to go, it's the only sane way to go. Sorting doesn't have to require large amounts of memory (a naive in-memory sort would, but an external merge sort works in bounded memory), and perl could do the presorting itself. Granted, for small files presorting would be a waste of both wall-clock time and CPU time. As the input files grow, however, simply sorting your input beforehand can do wonders.
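To illustrate the presorting idea (the file name and key,value layout here are hypothetical, and this sketch uses the Unix sort utility rather than Perl): sort does an external merge sort, spilling runs to temporary files on disk, so its memory use stays bounded even when the input is far larger than RAM. Once records sharing a key are adjacent, a single streaming pass can handle each group with constant state:

```shell
# Hypothetical input: key,value records, one per line.
printf 'b,2\na,1\nb,3\na,4\n' > records.csv

# External merge sort on the first comma-separated field.
# -S caps the in-memory buffer; sort spills runs to temp files
# and merges them, so multi-gigabyte inputs still sort fine.
sort -t, -k1,1 -S 64M records.csv > sorted.csv

# One streaming pass: records with the same key are now adjacent,
# so each group needs only O(1) state (a running sum here).
awk -F, '
  $1 != prev { if (NR > 1) print prev, sum; prev = $1; sum = 0 }
  { sum += $2 }
  END { print prev, sum }
' sorted.csv
# prints:
# a 5
# b 5
```

The same adjacency trick works for deduplication, joins, and grouping: the streaming pass never holds more than one group's state in memory, which is exactly why presorting avoids both thrashing and repeated work.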

Ciao,
Gryn

RE: Presort's Cost
by turnstep (Parson) on Jul 26, 2000 at 17:59 UTC

    Yes, simple sorting requires a lot of memory, but even a more complex sort requires that, at the very least, you read each record once. Why not just grab the information while you are there, as in my example program above? I agree that a sorted file is far better, and you'd probably want to get it sorted at some point, but I would not call it the only "sane" way! :)
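For comparison, the grab-it-in-one-pass approach might look like this sketch (hypothetical key,value file layout, using awk's in-memory hash as a stand-in for a Perl hash): no presort is needed, but memory now grows with the number of distinct keys, which is precisely what breaks down once the data no longer fits.

```shell
# Hypothetical input: key,value records, one per line.
printf 'b,2\na,1\nb,3\na,4\n' > records.csv

# Single unsorted pass: accumulate per-key totals in an in-memory
# hash. Memory grows with the number of DISTINCT keys, not with
# the record count, so this is fine while the key set fits in RAM.
# (The trailing sort only makes the output order deterministic;
# awk's for-in iteration order is unspecified.)
awk -F, '{ sum[$1] += $2 } END { for (k in sum) print k, sum[k] }' records.csv | sort
# prints:
# a 5
# b 5
```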

      Oh yeah, of course sorting is an expensive operation (but not too expensive). However, what I said was that if you don't have enough memory to hold all of the records, then sorting is the only sane way to do it, because otherwise you will either start thrashing or repeating work. I never bother sorting until I start processing files over a megabyte, and even then it depends on whether I care :) .

      Cheers,
      Gryn