in reply to Huge files manipulation

I find it very surprising that sort ends with an out-of-memory error, as sort dates from a time when memory was tiny compared to today, and even back then it was able to sort huge files: sort uses temporary files to avoid needing more memory than the OS can give it.

Your problem sounds like a task 'sort -u' was made for.
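For example, assuming a placeholder input file huge.txt, something like:

    sort -u huge.txt > unique.txt

sorts the file and drops duplicate lines in one pass, spilling to temporary files on disk as needed.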

Re^2: Huge files manipulation
by graff (Chancellor) on Nov 10, 2008 at 21:10 UTC
    Unix and GNU sort rely on having an adequate amount of available disk space on /tmp; if the OP is using a machine with insufficient free space on /tmp, sort -u will fail.
      Only if you haven't told sort it can use another directory. (GNU) sort will use the directory given with the -T option, then the directory in the TMPDIR environment variable, then /tmp, in that order. The POSIX standard notes that -T is undocumented but present in some implementations, and it encourages implementors to support the TMPDIR environment variable.
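      For example, to point sort's temporary files at a filesystem with plenty of free space (/bigdisk/tmp is a placeholder path):

          sort -T /bigdisk/tmp -u input.txt > output.txt

      or, equivalently, via the environment variable:

          TMPDIR=/bigdisk/tmp sort -u input.txt > output.txt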
Re^2: Huge files manipulation
by rowdog (Curate) on Nov 10, 2008 at 22:49 UTC

    I'm not surprised at all. HP-UX is not GNU/Linux. The GNU tools are much more flexible and were built to overcome the limitations found in many standard UNIX tools.

    As an example, I remember using ls in a directory with many files (>64k, oops), and HP's ls just barfed. GNU's ls, of course, has no such problem. If a feature is optional in the standard, GNU has it, but HP-UX probably doesn't.

    Disclaimer: I haven't run any "big iron" in over 5 years, so things may have changed.