in reply to Re: sorting large data
in thread sorting large data

But how would you do this kind of sort if you don't have access to gnu sort? I have pondered this for a while and have not come up with any efficient solutions. Certainly loading up a 45MB text file into RAM is not the answer.

"Falling in love with map, one block at a time." - simeon2000

Re: sorting large data
by Abigail-II (Bishop) on Jul 23, 2002 at 16:32 UTC
    Well, if you don't have access to GNU sort, you can always try one of the many other implementations of Unix sort.... ;-).

    Anyway, you would do what Unix sort does. Split the data into chunks small enough to swallow (how big those can be varies from system to system). Sort each chunk in memory and store it in a temporary file. Now you have a bunch of sorted files, which you merge into the final output. You might even have to do this recursively if there are too many files to merge at once.
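    The two passes described above can be sketched in Perl. This is a minimal illustration, not Unix sort's actual implementation: the chunk size, the subroutine name, and the simple linear scan over run heads (a real implementation would use a heap, and would recurse when there are too many runs for one merge) are all my own choices.

    ```perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    my $CHUNK_LINES = 100_000;   # tune to what fits in RAM (illustrative value)

    sub external_sort {
        my ($in_fh, $out_fh) = @_;

        # Pass 1: read a manageable chunk, sort it in memory,
        # and spill it to its own temporary file ("run").
        my @runs;
        while (1) {
            my @chunk;
            while (@chunk < $CHUNK_LINES and defined(my $line = <$in_fh>)) {
                push @chunk, $line;
            }
            last unless @chunk;
            my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);
            print $tmp_fh sort @chunk;
            seek $tmp_fh, 0, 0;          # rewind so the merge can read it back
            push @runs, $tmp_fh;
        }

        # Pass 2: merge the sorted runs by repeatedly emitting the
        # smallest of the current head lines.
        my @heads = map { scalar <$_> } @runs;
        while (grep { defined } @heads) {
            my $min;
            for my $i (0 .. $#heads) {
                next unless defined $heads[$i];
                $min = $i if !defined($min) or $heads[$i] lt $heads[$min];
            }
            print $out_fh $heads[$min];
            $heads[$min] = scalar readline($runs[$min]);
        }
    }
    ```

    Only one chunk's worth of lines is ever held in memory at a time, so the 45MB file from the original question would be processed in, say, half-megabyte slices regardless of total size.
    
    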

    Read Knuth if you want to know everything about merge sort.

    Abigail

      Short of going out and buying books (with a limited cash supply), is there anywhere on the web with this sort of information?
Re: Re: Re: sorting large data
by Fletch (Bishop) on Jul 23, 2002 at 16:29 UTC

    Any good algorithms book should cover sorting. See Knuth volume 3 (Sorting and Searching), or Orwant et al.'s Mastering Algorithms with Perl.