in reply to Re: Large file, multi dimensional hash - out of memory
in thread Large file, multi dimensional hash - out of memory

Thanks, it is a 64-bit Perl, but not long after passing the 4GB mark it runs out of memory anyway. I guess we need a bigger boat.

Re^3: Large file, multi dimensional hash - out of memory
by BrowserUk (Patriarch) on May 15, 2013 at 15:10 UTC

    As the input file is sorted, uniq infile > outfile ought to do the job very quickly.

    If for some reason you don't have the uniq command available, try

    #! perl -w
    use strict;

    my $last = <>;
    while( <> ) {
        print $last if $_ ne $last;   # print the previous line only when the current one differs
        $last = $_;
    }
    print $last;                      # the loop never prints the final value of $last
    __END__
    thisscript infile > outfile
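
    For what it's worth, the same idea fits in a one-liner (a sketch assuming the file really is sorted so duplicates are adjacent, with infile/outfile as above; quoting shown for a Unix-ish shell):

    perl -ne 'print if !defined($last) || $_ ne $last; $last = $_;' infile > outfile

    Either way the data is streamed a line at a time, so memory use stays flat no matter how big the file gets.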
