in reply to Re: finding top 10 largest files
in thread finding top 10 largest files

I suppose you could use this too (if you had enough ports)....
find / -type f -print0 | xargs -0 ls -l | sort -rnk5 | head -10
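(Footnote: if your find is GNU find, its -printf lets you skip ls and its parsing problems entirely -- though that flag is not POSIX, so this is only a sketch. The demo directory below is just there so the output is predictable; in real use you'd point it at / as above.)

```shell
#!/bin/sh
# GNU find can print size and path itself, so no ls parsing is needed.
# Demo directory with files of known sizes (hypothetical names):
dir=$(mktemp -d)
printf '%100s' '' > "$dir/larger"
printf '%10s'  '' > "$dir/smaller"

# same shape as the one-liner, pointed at the demo directory
largest=$(find "$dir" -type f -printf '%s %p\n' | sort -rn | head -10)
echo "$largest"

rm -r "$dir"
```

This still mis-handles filenames containing newlines, but it survives spaces, which the ls-based pipeline does not.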

cheers

tachyon

Re: Re: Re: finding top 10 largest files
by BrowserUk (Patriarch) on Feb 03, 2004 at 02:00 UTC
      Good job, but in that environment, I would just ask Clippy what my largest files were! :)

        Yeah, but then Clippy will insist on connecting (as a server) to the internet, and that will force a 22 MB WindowsUpdate, 3 installs, and 4 re-boots :)


        Examine what is said, not who speaks.
        "Efficiency is intelligent laziness." -David Dunham
        "Think for yourself!" - Abigail
        Timing (and a little luck) are everything!

Re: finding top 10 largest files
by Abigail-II (Bishop) on Feb 03, 2004 at 01:55 UTC
    Yes, you could. I realized that, but didn't present it, because it's neither memory-efficient (you hold all the file names at once), nor run-time efficient: you are sorting all the file names, and that's Ω(N log N). The solution where you keep a sorted buffer takes time linear in the number of files.

    If you want to factor in the number of files to be reported (say k), the solution you present takes O(N log N + k), while my solution takes O(kN). If you replace the array with a heap, you can reduce that to O(k + N log k). The memory used is O(N) in your solution and O(k) in mine.
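    For the curious, the sorted-buffer idea can be sketched in shell and awk (assuming GNU find's -printf; the demo directory and file names are hypothetical, only there to make the output predictable). Each incoming size is insertion-sorted into a buffer that never grows past k entries, which gives the O(kN) bound above.

```shell
#!/bin/sh
# Demo directory with files of known sizes, so the top-k result is predictable.
dir=$(mktemp -d)
printf '%1000s' '' > "$dir/big"
printf '%100s'  '' > "$dir/medium name with spaces"
printf '%10s'   '' > "$dir/small"

top=$(find "$dir" -type f -printf '%s %p\n' |
awk -v k=2 '{
    size = $1 + 0                          # force numeric comparison
    # walk the buffer from the bottom, shifting smaller entries down one slot
    for (i = n; i >= 1 && top_sz[i] < size; i--) {
        top_sz[i+1] = top_sz[i]; top_ln[i+1] = top_ln[i]
    }
    top_sz[i+1] = size; top_ln[i+1] = $0   # slot k+1, if written, is ignored
    if (n < k) n++                         # buffer never grows past k entries
}
END { for (i = 1; i <= n; i++) print top_ln[i] }')

echo "$top"
rm -r "$dir"
```

    Each of the N input lines does at most k shifts, hence O(kN); swapping the buffer for a min-heap would bring the per-line cost down to O(log k).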

    All the upper bounds mentioned are tight.

    Abigail