Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hey,
I need to read a large number (~2300) of fairly large gzipped text files (2 MB on average), do something with the content of each, and save the result (500 KB-1.5 MB) as gzip again (keeping the originals in case they are needed later).
There are quite a few modules for working with gzipped data, and there is also the option of calling command-line tools via backticks. Is there any reason to prefer a module over the command line? Is any particular module better than the others? There aren't really enough ratings to make an informed choice...
I've been using bzip2 until now, but both the software that creates the original files and Compress::Bzip2 were producing a lot of broken files, so I'd like to find a more reliable solution.
Opinions would be appreciated :-)
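For concreteness, the module approach could look something like this minimal sketch using the core IO::Uncompress::Gunzip / IO::Compress::Gzip pair; the file names are hypothetical, and the loop body is just a placeholder for whatever per-line processing is actually needed:

    use strict;
    use warnings;
    use IO::Uncompress::Gunzip qw($GunzipError);
    use IO::Compress::Gzip     qw($GzipError);

    # Hypothetical input/output paths
    my ($infile, $outfile) = ('input.txt.gz', 'output.txt.gz');

    # Open the gzipped input for reading and a new gzipped output for writing
    my $in  = IO::Uncompress::Gunzip->new($infile)
        or die "gunzip failed for $infile: $GunzipError\n";
    my $out = IO::Compress::Gzip->new($outfile)
        or die "gzip failed for $outfile: $GzipError\n";

    while (my $line = $in->getline) {
        # ... do something with $line here (placeholder: pass it through unchanged) ...
        $out->print($line);
    }

    $in->close;
    $out->close;

If shelling out turns out to be preferable, a list-form pipe open such as open(my $fh, '-|', 'gzip', '-dc', $infile) streams the decompressed data line by line while avoiding backticks and shell-quoting issues.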
Replies are listed 'Best First'.
Re: Recommendation to zip/unzip gzip files
by cavac (Prior) on Jul 03, 2012 at 11:13 UTC

Re: Recommendation to zip/unzip gzip files
by zentara (Cardinal) on Jul 03, 2012 at 13:40 UTC

Re: Recommendation to zip/unzip gzip files
by Anonymous Monk on Jul 03, 2012 at 10:46 UTC
    by Anonymous Monk on Jul 03, 2012 at 11:30 UTC

Re: Recommendation to zip/unzip gzip files
by mrguy123 (Hermit) on Jul 03, 2012 at 13:21 UTC

Re: Recommendation to zip/unzip gzip files
by Anonymous Monk on Jul 03, 2012 at 11:02 UTC
    by Anonymous Monk on Jul 03, 2012 at 11:08 UTC