in reply to Re: Re: finding top 10 largest files
in thread finding top 10 largest files

Oh, come on. You must know by now that the standard Unix tools have been ported to Windows, multiple times?

There's no reason to feel left out if you're on a Windows platform, and someone uses 'find'.

Abigail

Replies are listed 'Best First'.
Re: Re: finding top 10 largest files
by tachyon (Chancellor) on Feb 03, 2004 at 01:45 UTC
    I suppose you could use this too (if you had enough ports)....
    find / -type f -print0 | xargs -0 ls -l | sort -rnk5 | head -10

    cheers

    tachyon

        Good job, but in that environment, I would just ask Clippy what my largest files were! :)
      Yes, you could. I realized that, but didn't present it, because it's neither memory-efficient (you're holding all the file names at once), nor run-time efficient, as you're sorting all the file names, and that's Ω(N log N). The solution where you keep a small sorted buffer takes linear time.

      If you want to factor in the number of files to be reported (let's say k), the solution you present takes O(N log N + k), while my solution takes O(kN). If you replace the array with a heap, you can reduce that to O(k + N log k). The memory used by your solution is O(N), and O(k) in mine.

      All of the mentioned upper bounds are tight.
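      For the curious, here's a rough sketch of the heap variant in Python (not Abigail's actual code, just an illustration of the O(k + N log k) bound): a min-heap of at most k entries means each of the N files costs at most one O(log k) heap operation, and memory stays at O(k).

      ```python
      import heapq
      import os

      def top_k_largest(root, k=10):
          """Return the k largest files under root as (size, path) pairs,
          largest first.

          Keeps a min-heap of at most k entries: memory is O(k), and each
          file costs at most one O(log k) heap operation, so the run time
          is O(N log k) for N files.
          """
          heap = []  # min-heap; the smallest of the current top k sits on top
          for dirpath, _dirnames, filenames in os.walk(root):
              for name in filenames:
                  path = os.path.join(dirpath, name)
                  try:
                      size = os.path.getsize(path)
                  except OSError:
                      continue  # broken symlink, permission error, etc.
                  if len(heap) < k:
                      heapq.heappush(heap, (size, path))
                  elif size > heap[0][0]:
                      # New file beats the smallest of the top k: swap it in.
                      heapq.heappushpop(heap, (size, path))
          return sorted(heap, reverse=True)
      ```
      
      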

      Abigail