in reply to Re: Problems reading large gz files with IO::Uncompress::Gunzip
in thread Problems reading large gz files with IO::Uncompress::Gunzip

The original question might be resolved: it appears that the *.gz file I was using for testing is somehow corrupt (gunzip fails on it, reporting "unexpected end of file").

I still wonder whether there are any benefits to using IO::Uncompress::Gunzip as opposed to a pipe. Generally, are there any caveats I should be aware of when dealing with large *.gz files in Perl? Feel free to contribute bits of wisdom, but the original question is, as of now, moot.
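For concreteness, the two approaches I am weighing look roughly like this (the file name is just a placeholder, and the processing inside the loops is omitted):

    use strict;
    use warnings;
    use IO::Uncompress::Gunzip qw($GunzipError);

    my $file = 'big.log.gz';    # placeholder name for a large *.gz file

    # Option 1: pure-Perl streaming via IO::Uncompress::Gunzip
    my $z = IO::Uncompress::Gunzip->new($file)
        or die "IO::Uncompress::Gunzip failed: $GunzipError\n";
    while (my $line = $z->getline()) {
        # ... process $line one record at a time ...
    }
    $z->close();

    # Option 2: pipe the output of the external gunzip into Perl
    open my $fh, '-|', 'gunzip', '-c', $file
        or die "Cannot start gunzip: $!\n";
    while (my $line = <$fh>) {
        # ... process $line ...
    }
    close $fh or warn "gunzip exited abnormally: $?\n";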

Re^3: Problems reading large gz files with IO::Uncompress::Gunzip
by NetWallah (Canon) on Dec 30, 2008 at 05:10 UTC
    Sorry - I have not had the need to explore IO::Uncompress::Gunzip, so I cannot offer direct experience or advice.
    It just seemed like a worthwhile option to explore when you were up against limited memory and large files.
    Hopefully our brethren and sisteren here have profounder experiences to offer.

         ..to maintain is to slowly feel your soul, sanity and sentience ebb away as you become one with the Evil.

Re^3: Problems reading large gz files with IO::Uncompress::Gunzip
by DStaal (Chaplain) on Dec 30, 2008 at 13:48 UTC

    I think the biggest benefit is when you pair it with IO::Uncompress::AnyUncompress: At that point you can treat compressed and uncompressed files the same. A single open statement will open either, transparently.
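
    A minimal sketch of what I mean, assuming the default Transparent behaviour and a made-up file name:

        use strict;
        use warnings;
        use IO::Uncompress::AnyUncompress qw($AnyUncompressError);

        my $file = shift @ARGV;   # e.g. data.txt or data.txt.gz -- made-up names

        # Transparent => 1 (the default) passes uncompressed input through
        # as-is, so the same open works for plain and compressed files alike.
        my $in = IO::Uncompress::AnyUncompress->new($file, Transparent => 1)
            or die "Cannot open '$file': $AnyUncompressError\n";

        while (my $line = $in->getline()) {
            print $line;
        }
        $in->close();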