If there are no backups (ooooh, living on the edge, are we?), or if you don't have easy access to backup logs or a database, look into the unix "find" utility -- it will do everything you want, once you learn the command-line usage. (There is a Windows port of the tool, if that's what you need.) You could write the equivalent tool in Perl, using the File::Find module, but it'll be more work to set it up, it'll run slower, and it'll consume more system resources while it's running.
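For the task in this thread (listing every file under the current directory from largest to smallest), a minimal sketch with "find" might look like this -- assuming GNU find, since `-printf` is a GNU extension (on BSD/macOS you'd substitute something like `stat -f '%z %N'` via `-exec`):

```shell
# Print "<size-in-bytes> <path>" for every regular file under the
# current directory, then sort numerically, largest first.
find . -type f -printf '%s %p\n' | sort -rn
```

The numeric sort on the leading byte count is what gives the largest-to-smallest ordering; drop the `-r` to reverse it.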
(Update: tachyon has just disproved the part about it being more work to set up -- or at least, the point is moot, since he's done the work; but it's still true that File::Find requires more run-time and memory than doing the equivalent job with "find".)
For a relevant discussion of using "find" with Perl (which can be easy, fast and effective), check out this snippet (shameless plug): An alternative to File::Find
In reply to Re: Parsing current and sub-directories and prints out all files found from largest size to smallest
by graff
in thread Parsing current and sub-directories and prints out all files found from largest size to smallest
by TASdvlper