in reply to File > 2G under linux

I'm not sure about the filesystem issues, but you could try uncompressing the file to STDOUT, reading that line by line to do your parsing, and writing to a pipe that compresses the output. I have done this with gzip successfully (not with large files, just in general). Something like this (UNTESTED):

open(INPUT, "/usr/bin/gzip -d -c '$filename' |");
open(OUTPUT, "| /usr/bin/gzip > '$filename'");
while (<INPUT>) {
    # munge
    print OUTPUT;
}
close INPUT;
close OUTPUT;

You could also look at Compress::Zlib, which might work for you...
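
A rough sketch of the Compress::Zlib route (also untested; the in.gz/out.gz filenames are just placeholders for whatever you are actually processing):

use Compress::Zlib;

# Open the compressed input and a separate compressed output file.
my $in  = gzopen("in.gz",  "rb") or die "cannot open in.gz: $gzerrno";
my $out = gzopen("out.gz", "wb") or die "cannot open out.gz: $gzerrno";

my $line;
while ($in->gzreadline($line) > 0) {
    # munge $line here, then write it back out compressed
    $out->gzwrite($line);
}

$in->gzclose;
$out->gzclose;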

--
3dan

Re: Re: File > 2G under linux
by MidLifeXis (Monsignor) on Sep 11, 2003 at 17:12 UTC

    It was written:

    open(INPUT, "/usr/bin/gzip -d -c '$filename' |");
    open(OUTPUT, "| /usr/bin/gzip > '$filename'");

    Make sure that you use a different value for $filename on each of these calls, or you (may|will) clobber the contents of the file you are trying to read, that is, unless you are using an OS with versioned files.
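
    One safe pattern (a sketch only; the ".new" suffix is an arbitrary choice) is to write the compressed output to a second file and rename it over the original only after everything has succeeded:

    open(INPUT,  "/usr/bin/gzip -d -c '$filename' |") or die "gzip read failed: $!";
    open(OUTPUT, "| /usr/bin/gzip > '$filename.new'") or die "gzip write failed: $!";
    while (<INPUT>) {
        # munge
        print OUTPUT;
    }
    close INPUT;
    close OUTPUT or die "output gzip failed: $!";
    rename "$filename.new", $filename or die "rename failed: $!";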

    --MidLifeXis

      Quite right. Good thing I included the 'UNTESTED' disclaimer! :-) I cut and pasted some code from different places, one of which did the reading and the other the writing; I was solving a different problem than the one posed here. Good eye!

      --
      3dan