in reply to system "gzip $full_name" is taking more time
Someone has already mentioned checking that the file is not already a gzip file. I see you have a test for that, but your regular expression does not correctly test whether the file name ends in '.gz'; it only tests whether it contains '.gz'.
I suggest:
next if /\.gz$/;
Your test of the file age might be more readable if you use '-M' (see perldoc perlfunc).
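Putting those two points together, the filter might look something like this. This is only a sketch: `$full_name`, the `*.log` glob, and the 7-day threshold are assumptions standing in for whatever your script actually uses.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $max_age_days = 7;    # assumed threshold; substitute your own

for my $full_name (glob "*.log") {
    next if $full_name =~ /\.gz$/;          # anchored: skip only names ending in .gz
    next if -M $full_name < $max_age_days;  # -M: file age in days at script start
    system "gzip", $full_name;              # list form: no shell quoting surprises
}
```

The list form of `system` is also worth considering: it bypasses the shell, so file names containing spaces or shell metacharacters cannot break the command.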
As others suggest, do some timing measurements. The fact that the directory has 90,000 files might slow down directory-related operations, especially if it is network mounted. unlink and gzip could both be affected by this, as they perform directory updates.
If you comment out the gzip portion, how long does the unlinking take on these large directories?
Measure how long gzip takes with a typical file. Also, when running your script check the processes. Is there a single long running gzip? You should be able to come up with some rough order of magnitude figures (I assume you have examined typical directories and have an idea of the number and sizes of files).
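A rough way to collect those figures from within the script is Time::HiRes. A minimal sketch, assuming "sample.dat" stands in for a typical file from one of the large directories:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my $file = "sample.dat";    # placeholder: pick a representative file

# Time a single gzip of one file.
my $t0 = [gettimeofday];
system "gzip", $file;
printf "gzip:   %.3f s\n", tv_interval($t0);

# Time a single unlink in the same directory.
$t0 = [gettimeofday];
unlink "$file.gz" or warn "unlink failed: $!";
printf "unlink: %.3f s\n", tv_interval($t0);
```

Multiply the per-file numbers by the file count in a typical directory and you have your order-of-magnitude estimate; if the total is far below what you observe, the bottleneck is elsewhere.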
Also consider whether these directories are overdue for cleanup. If so, the first run of your script may take a lot longer than future runs.
Re^2: system "gzip $full_name" is taking more time
by bulk88 (Priest) on Dec 08, 2013 at 04:11 UTC
by parv (Parson) on Dec 09, 2013 at 08:20 UTC