TeraMarv has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,

Hopefully you can help me with a problem.

I need to perform an archive from a database to disk (on Windows) using the dbs archive utility. Normally this wouldn't be a problem, only I don't have enough disk space to hold the uncompressed archive.

The target file is referenced in the archive utility script like this:

FILE=filename

I can, however, map this filename to another using one of the tool's environment variables, if that makes any difference.

I usually call the archive tool via a filehandle and here-doc in Perl. So do any of you holy men know of a way I can stream the data from my application straight into gzip (the backup utility writes to a file by default) or any other compression tool? I know a named pipe on Unix would do it, but I don't think it works quite the same with Win32::Pipe.

Many Thanks, TeraMarv.

Replies are listed 'Best First'.
Re: Stream data from application to Gzip
by Zaxo (Archbishop) on Mar 23, 2005 at 00:49 UTC

    With PerlIO (default in Perl 5.8+) you can use the PerlIO::gzip layer.

    use PerlIO::gzip;
    open local(*STDOUT), '>:gzip', $filename or die $!;
    Now everything you print will be gzipped and shoved into the file.
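
    Expanding on that, here is a minimal sketch of the :gzip layer approach, assuming the PerlIO::gzip CPAN module is installed; the filename and the data printed are hypothetical stand-ins:

```perl
use strict;
use warnings;
use PerlIO::gzip;    # CPAN module, assumed installed

# Hypothetical target file; the archive utility's FILE= mapping
# would point at something like this instead.
my $filename = 'archive.gz';

# The :gzip layer deflates on the fly, so the uncompressed
# data never touches the disk.
open my $out, '>:gzip', $filename or die "open: $!";
print $out "line of archive data\n" for 1 .. 3;
close $out or die "close: $!";
```

    Anything printed to the handle is compressed as it is written, which is the point of the layer: there is never a full uncompressed copy on disk.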

    Your "FILE=filename" snippet looks fishy; there are no variable sigils.

    After Compline,
    Zaxo

      Thanks for the reply, Zaxo.

      "FILE=filename" is not Perl. This is the command required by my archive utility to specify the target file.

      I can map this filename to any other via a built-in utility function. What I need to know is how I can read from the file 'filename' while it is being written to by the archive utility... any ideas? I will not be printing anything; the archive utility produces a binary data stream.

      Thanks again,

      TeraMarv.

        I don't think it's even possible. You're asking to take a file that is currently open with uncompressed data being actively written to it and compress it while this is taking place. Even if there were some curious way of masking or proxying the data being written to that filehandle, it seems like an awfully long and painful way to go for your results.

        One way you could probably retool it for Windows would be to use the Windows version of zip.

        # Substitute whatever tool you use to dump your db
        pg_dump -Udbadmin dbname | zip dbname-backup -

        And then you could extract it with:

        unzip -c dbname-backup -
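
        The same pipe idea can be sketched from the Perl side with the core IO::Compress::Gzip module, which needs no external gzip binary on Windows. This is only a sketch: the inline perl one-liner below is a hypothetical stand-in for whatever command actually produces the dump stream, and 'backup.gz' is a made-up output name.

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw($GzipError);    # core since Perl 5.9.4

# Stand-in for the real archive/dump command; a multi-arg pipe
# open avoids the shell, so quoting is not an issue.
open my $dump, '-|', $^X, '-e', 'print "row of dump data\n" for 1..100'
    or die "pipe: $!";
binmode $dump;    # the utility produces a binary stream

# Compress the stream chunk by chunk as it arrives, so the
# uncompressed dump is never stored on disk.
my $gz = IO::Compress::Gzip->new('backup.gz')
    or die "gzip: $GzipError";
while (read $dump, my $buf, 64 * 1024) {    # 64 KB chunks
    $gz->print($buf);
}
$gz->close;
close $dump;
```

        Swapping the one-liner for the real dump command (if the archive utility can be coaxed into writing to stdout) gives the streaming compression the original question asked for, without any named pipes.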

        --
        "This alcoholism thing, I think it's just clever propaganda produced by people who want you to buy more bottled water." -- pedestrianwolf