in reply to Re^5: Sorting a (very) large file (better*2)
in thread Sorting a (very) large file

Damn, that didn't work at all. After creating a suitably sized file with:

   perl -le 'for (0 .. shift()) { $num = int rand(100_000); print qq{12-25-2005\t12:30 PM\t$num\tC:\\someplace\\somefile.txt}; }' 8500000 > file.txt

I found that just loading the file into memory took up 1.7 GB on my system! I tried a few variations without getting anything reasonable. I haven't really done much Perl development on this box - it's a Fedora 8 machine with the Fedora-built Perl v5.8.8 - so it could be that there's something wrong with it.
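(The node doesn't quote the loading code that was tried. For reference, the slurp-plus-Schwartzian-Transform shape this thread is about is the kind of thing that balloons - a sketch, not samtregar's actual script:)

   # Not the script from this node (which isn't quoted) -- just the
   # slurp + Schwartzian Transform shape discussed in this thread,
   # which builds a temporary [line, key] array for every line:
   my @sorted = map  { $_->[0] }
                sort { $a->[1] <=> $b->[1] }
                map  { [ $_, (split /\t/)[2] ] }
                <>;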

-sam

Re^7: Sorting a (very) large file (better*2)
by salva (Canon) on Dec 01, 2007 at 16:43 UTC
    this script...
        $| = 1;                        # unbuffer output so progress shows immediately

        my @data;
        $#data = 8500000;              # pre-extend to the final size (8_500_001 slots)...
        $#data = -1;                   # ...then empty it, keeping the allocation

        while (<>) { push @data, $_ }

        print "sorting...\n";

        use Sort::Key qw(ukeysort_inplace);
        ukeysort_inplace { (split(/\t/))[2] } @data;   # numeric sort on the third field

        print "sorted\n";
        sleep 100;                     # keep the process alive to inspect memory use
    ... uses 800 MB for loading the data and then 50 MB more for sorting!
      Good call - I should have thought to pre-extend the array!

      -sam

      This made me laugh out loud. Without pre-extending, the main difference is two extra chunks of allocated memory: a (likely) hole in the heap of size 2**(n-1), left behind when the array was last doubled to become big enough to hold the entire file contents, and the slop at the end of the array between the number of lines and 2**n (the array's final size). In other words, two "dead" spans of virtual memory that will (at least mostly) remain untouched during the sorting, and so will cause no page faulting and won't slow the sorting down at all (and will only moderately slow the loading of the file contents into memory).
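      For the 8.5-million-line test file, here is a back-of-the-envelope sketch of those two dead spans. The simple doubling model described above and 4 bytes per array slot (as on a 32-bit perl) are assumptions, not measurements:

          # Rough sizes of the two dead spans described above, under
          # the doubling model and a 4-byte AV slot (both assumed).
          my $lines = 8_500_001;                # lines in the test file
          my $n     = 24;                       # smallest n with 2**n >= $lines
          my $ptr   = 4;                        # bytes per array slot

          my $hole = 2 ** ($n - 1) * $ptr;      # freed prior allocation
          my $slop = (2 ** $n - $lines) * $ptr; # unused slots past the last line

          printf "hole: %.1f MB, slop: %.1f MB\n",
              $hole / 2**20, $slop / 2**20;     # hole: 32.0 MB, slop: 31.6 MB

      The line scalars themselves, not this pointer bookkeeping, account for the bulk of the 800 MB.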

      So pre-extending shows that the lines being sorted fit within 800 MB of memory and so can be sorted directly without much, if any, page faulting. It cracks me up that samtregar threw up his hands in defeat, thinking there was no point in trying to sort what won't fit in memory, when the only extra page faulting caused by the process's virtual size exceeding physical memory had already taken place while filling the array. The rest of the sorting would only touch pages totaling about 800 MB.

      I also found it amusing that Sort::Key is doing pretty much exactly what I did except replacing my 3 lines of Perl code with a big pile of complex XS code.
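      Those earlier lines aren't quoted in this node, but a pure-Perl equivalent of the ukeysort_inplace call above looks roughly like this sketch, which extracts the numeric keys once and sorts indices (@data is the array from the script above):

          # A pure-Perl stand-in for ukeysort_inplace (a sketch, not
          # tye's actual code, which appears earlier in the thread):
          my @key = map { (split /\t/)[2] } @data;  # extract each key once
          @data = @data[ sort { $key[$a] <=> $key[$b] } 0 .. $#data ];

      Unlike the XS version, this builds an index list and a second copy of the array's pointers, so it isn't truly in-place.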

      And then there is the problem that pre-extending requires reading the file twice (once just to count the lines), a cost no one could discount unless they were overlooking how much slower disk is than memory.
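      That two-pass shape would look something like this sketch ('file.txt' is the test file generated above):

          # Pass 1 pays a full disk read just to learn the line count;
          # pass 2 does the real load into a pre-extended array.
          open my $fh, '<', 'file.txt' or die "open: $!";

          my $count = 0;
          $count++ while <$fh>;        # pass 1: count lines
          seek $fh, 0, 0;              # rewind

          my @data;
          $#data = $count - 1;         # pre-extend to the exact size...
          $#data = -1;                 # ...then empty, keeping the allocation
          push @data, $_ while <$fh>;  # pass 2: the real load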

      Of course, condar may need to allocate a bit more paging space. I can't tell how much of his running out of virtual memory was due to the memory wasted by the ST (Schwartzian Transform) and how much was due to having way too little paging space configured.

      - tye