My program needs to read multiple ZIP archives and construct a single ZIP output stream, written to STDOUT, that is eventually written to a network socket. Since the inputs are already-compressed ZIP files, I am looking for a way to take the already-compressed files/members of the input ZIPs and add them to a streaming output (to avoid decompressing and re-compressing needlessly). As for the streaming output, I need to write the output as the input is processed, to 1) avoid holding all the compressed data in memory and 2) keep the data flowing to avoid downstream socket timeouts.
The Archive::Zip module has a mechanism to copy members from a ZIP archive to another archive via addMember(), presumably without decompressing. Archive::Zip can also write to a pipe/socket. But there is no incremental write that I can find; the entire archive would need to be constructed in memory (or a file) before writing, which can be way too big for the data I am processing.
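A minimal sketch of that Archive::Zip approach, assuming the module is installed (the input archive here is a hypothetical `input1.zip`, built on the fly so the example is self-contained):

```perl
use strict;
use warnings;
use Archive::Zip qw(:ERROR_CODES);

# Build a small sample input archive so the sketch is self-contained;
# in the real program this would be one of the existing input ZIPs.
my $sample = Archive::Zip->new();
$sample->addString("some data", "member.txt");
$sample->writeToFileNamed("input1.zip") == AZ_OK or die "write failed";

my $in  = Archive::Zip->new("input1.zip") or die "read failed";
my $out = Archive::Zip->new();

# addMember() transfers the member object -- still compressed -- into
# $out, so no decompress/re-compress cycle happens here.
$out->addMember($_) for $in->members();

# The catch described above: writeToFileHandle() emits the whole
# archive in one go, so nothing reaches STDOUT incrementally.
binmode STDOUT;
$out->writeToFileHandle(\*STDOUT) == AZ_OK or die "write failed";
```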
The IO::Compress::Zip module is focused on generating the kind of streaming output I need. But there is no obvious way to add the already-compressed files in my input ZIP archives to the output stream.
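For contrast, here is a sketch of the streaming side with IO::Compress::Zip, using its OO interface and newStream() to emit multiple members incrementally. The member names and contents are made up; the example writes into an in-memory buffer so it is self-contained, but passing `\*STDOUT` instead gives the incremental, socket-friendly output described above:

```perl
use strict;
use warnings;
use IO::Compress::Zip qw($ZipError);

# An in-memory buffer for demonstration; \*STDOUT works the same way.
my $buffer = '';

my $z = IO::Compress::Zip->new(\$buffer, Name => 'first.txt')
    or die "zip failed: $ZipError";
$z->print("contents of the first member\n");

# newStream() finishes the current member and opens the next one,
# flushing compressed data as it goes -- true streaming output.
$z->newStream(Name => 'second.txt')
    or die "newStream failed: $ZipError";
$z->print("contents of the second member\n");
$z->close();
```

The limitation remains as stated: the data fed to print() here is uncompressed; there is no documented way to hand IO::Compress::Zip an already-deflated member.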
I'm sure it would work to read each ZIP with Archive::Zip, decompress each file/member, then re-compress it and write it to STDOUT, but that would waste cycles needlessly, and at the data sizes I need to process (gigabytes) it could be a significant load.
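For completeness, this is the brute-force route just described, combining the two modules (again with a sample `input1.zip` built in place of the real inputs):

```perl
use strict;
use warnings;
use Archive::Zip qw(:ERROR_CODES);
use IO::Compress::Zip qw($ZipError);

# Build a sample input archive standing in for the real input ZIPs.
my $sample = Archive::Zip->new();
$sample->addString("some data", "member.txt");
$sample->writeToFileNamed("input1.zip") == AZ_OK or die "write failed";

my $buffer = '';    # would be \*STDOUT in the real program
my $z;

for my $file ('input1.zip') {           # hypothetical input list
    my $in = Archive::Zip->new($file) or die "read failed";
    for my $member ($in->members()) {
        my $data = $member->contents(); # inflates the member...
        if ($z) {
            $z->newStream(Name => $member->fileName())
                or die "newStream failed: $ZipError";
        }
        else {
            $z = IO::Compress::Zip->new(\$buffer,
                                        Name => $member->fileName())
                or die "zip failed: $ZipError";
        }
        $z->print($data);               # ...and deflates it again:
                                        # the wasted cycles in question
    }
}
$z->close() if $z;
```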
Is there any elegant way to accomplish this? Are there features of the two mentioned modules that I've missed?
In reply to merge file/members in multiple ZIP archives into streaming, single ZIP archive output by terse