I tried using IO::Compress::Gzip, but it creates a separate compressed file instead of replacing the original, so I still have to make another call to remove the original file. That extra step means no performance gain.

I can't create one compressed file for everything; I have to follow the rules and standards of my production environment.

I am thinking of running 5-6 copies of the same script, dividing the 120 directories between them.
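The split across 5-6 workers doesn't necessarily need multiple copies of the script: assuming GNU findutils are available, a single pipeline can feed several concurrent gzip processes. A minimal sketch with hypothetical directory and file names:

```shell
# Hypothetical stand-ins for the 120 production directories.
dir=$(mktemp -d)
mkdir -p "$dir/logs1" "$dir/logs2"
echo "sample data one" > "$dir/logs1/app.log"
echo "sample data two" > "$dir/logs2/app.log"

# -P 6: run up to 6 gzip workers in parallel; -n 1: one file per call.
# gzip itself replaces each input file with file.gz, so no separate
# delete step is needed.
find "$dir" -type f ! -name '*.gz' -print0 | xargs -0 -P 6 -n 1 gzip

ls "$dir/logs1" "$dir/logs2"
```

Each file still gets its own compressed output, so the one-file-per-input rule is preserved; only the scheduling is parallelized.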
In reply to Re^2: system "gzip $full_name" is taking more time
by dushyant
in thread system "gzip $full_name" is taking more time
by dushyant