Re: Re: finding top 10 largest files
by tachyon (Chancellor) on Feb 03, 2004 at 00:03 UTC
Purists may want to use File::Find instead of find.
Purists, or Win32 users like the OP, for example....
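For what it's worth, a File::Find version of the same top-ten search might look like this. It's a minimal sketch; largest_files is my own name for the helper, not anything standard:

```perl
use strict;
use warnings;
use File::Find;

# Return the $n largest plain files under $start as [ size, name ] pairs.
sub largest_files {
    my ( $start, $n ) = @_;
    my %size;
    # -f stats $_ and fills the _ handle, so -s _ reuses that stat
    find( sub { $size{$File::Find::name} = -s _ if -f }, $start );
    my @sorted = sort { $size{$b} <=> $size{$a} } keys %size;
    splice @sorted, $n if @sorted > $n;    # keep only the top $n
    return map { [ $size{$_}, $_ ] } @sorted;
}

printf "%12d  %s\n", @$_ for largest_files( shift(@ARGV) || '.', 10 );
```

Being pure Perl, it runs unchanged on Win32, which is the whole point for the OP.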
I suppose you could use this too (if you had enough ports)....
find / -type f -print|xargs ls -l|sort -rnk5|head -10
Please excuse me for being 95% off topic, but perhaps a search should reveal the following link. For me, Perl is mainly command-line work, and I love less, which, find, etc. and their combinations.
Thank You.
With the exception of which, which has a better Perl equivalent called pwhich, you should try http://unxutils.sourceforge.net/ as a standalone alternative to Cygwin.
Quote from description:
Here are some ports of common GNU utilities to native Win32. In this context, native means the executables do only depend on the Microsoft C-runtime (msvcrt.dll) and not an emulation layer like that provided by Cygwin tools.
Nevertheless, you will run into problems with find and echo, as DOS has commands with the same names but limited functionality. If you have Novell, you may also hit ls, depending on your path.
And it came to pass that in time the Great God Om spake unto Brutha, the Chosen One: "Psst!"
(Terry Pratchett, Small Gods)
You might have luck with this pipeline
du -a | sort -nr | head -10
which would give the ten largest entries found recursively below the directory the command is issued from (note that du -a reports directories as well as plain files, so the top ten may include directories). The cross-platform GNU ports of these utilities would work here as well.
Re: Re: finding top 10 largest files
by etcshadow (Priest) on Feb 03, 2004 at 02:09 UTC
Purists may want to use File::Find instead of find.
Also, people who don't like to write horrible crash-prone and security-holed scripts might like to use the -print0 action when backticking (or "open-to-pipe"ing) find commands, coupled, of course, with local $/ = "\0";.
Oh, likewise, if you're finding into an xargs call... it's always a good idea to find ... -print0 | xargs -0 .... Anybody ever think that there should be a shellmonks?
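A sketch of what that looks like on the Perl side of the pipe (the find arguments here are placeholders for whatever you are actually searching):

```perl
use strict;
use warnings;

# List-form piped open avoids the shell entirely, and -print0 emits
# NUL-terminated records, so spaces or newlines embedded in file names
# cannot split or inject entries.
open my $fh, '-|', 'find', '.', '-type', 'f', '-print0'
    or die "cannot run find: $!";

my @files;
{
    local $/ = "\0";                  # read NUL-terminated records
    while ( my $name = <$fh> ) {
        chomp $name;                  # chomp strips $/, i.e. the NUL
        push @files, $name;
    }
}
close $fh;

print scalar(@files), " files found\n";
```

The list form of open '-|' is the belt to -print0's braces: even a file name containing shell metacharacters never passes through a shell.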
------------
:Wq
Not an editor command: Wq
Though I doubt anyone wants to be known as a shellmonk, it would be very useful to have a few pages of "now that you know it in Perl, how do you do it the hard way", just as reference :)
I don't know how many times I skip past a find or a grep and jump straight into Perl due to my general slackness and intolerance for the various idiosyncrasies.
I also find that slackness causes me to use 'slocate' instead of 'find', but that's impatience and thus is a virtue :) Let's face it though. Perl is just easier.
Re: Re: finding top 10 largest files
by graff (Chancellor) on Feb 03, 2004 at 02:47 UTC
Purists may want to use File::Find instead of find.
That is, purists with lots of time on their hands... Whenever I have tried to compare "find" vs. "File::Find", the Perl module seems to take about 5 times longer than the compiled utility, in terms of wall-clock time to execute.
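A rough way to measure this on your own box, sketched with the core Benchmark module; the iteration count is arbitrary, and the size of the tree and the state of the filesystem cache will dominate the numbers:

```perl
use strict;
use warnings;
use Benchmark qw(timethese);
use File::Find;

# Compare File::Find against the external find(1), each counting the
# plain files under the current directory.
timethese( 3, {
    'File::Find' => sub {
        my $n = 0;
        find( sub { $n++ if -f }, '.' );
    },
    'find(1)' => sub {
        # backticks in list context return lines; () = counts them
        my $n = () = `find . -type f`;
    },
} );
```

Both closures do the same work, so the timings isolate the traversal cost rather than what you do with each file.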
Not only is find usually faster to execute than File::Find, it takes less programmer time as well.
Abigail