Personally, I would go about this differently: Instead of writing a file to disk, zipping it, and then re-reading it for download, you can do it all on the fly*. (If you did want to do it via temporary files like in your current script, please read my nodes on File::Temp examples and running external programs, as there are potential security and concurrency issues with your current script.)
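If you do go the temporary-file route, a minimal sketch of the safer pattern (assuming you just need a scratch file, not a specific name) is to let File::Temp create the file atomically instead of building a filename by hand:

```perl
use warnings;
use strict;
use File::Temp qw/tempfile/;

# tempfile() creates the file atomically with a unique, unpredictable
# name, avoiding the race conditions and symlink attacks that
# hand-rolled temp filenames are prone to
my ($fh, $filename) = tempfile(UNLINK => 1, SUFFIX => '.txt');
print $fh "some data\n";
close $fh or die "close failed: $!";
# $filename is removed automatically at program exit (UNLINK => 1)
```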
IO::Compress::Zip is a core module, and you can use it to generate and output a ZIP file on the fly (see its docs for details):
    use warnings;
    use strict;
    use IO::Compress::Zip qw/$ZipError/;
    
    my @lines = (qw/ Hello World Foo Bar /);
    my $eol = "\r\n";
    binmode STDOUT;  # just to play it safe
    my $z = IO::Compress::Zip->new('-',  # STDOUT
            Name => "Filename.txt" )
        or die "zip failed: $ZipError\n";
    for my $line (@lines) {
        $z->print($line, $eol);
    }
    $z->close();
It's also possible to write the ZIP file to a scalar, e.g. if you need to know its length before writing it out, although that of course increases the memory usage. At the very least, you don't need to buffer the output lines like you're doing in your current script with @resp.
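For example, a sketch of the write-to-scalar variant (the filename and content here are just placeholders): IO::Compress::Zip accepts a scalar reference as its output target, so the finished ZIP ends up in memory and its length is available before anything is sent.

```perl
use warnings;
use strict;
use IO::Compress::Zip qw/$ZipError/;

my $zipped = '';  # in-memory buffer for the ZIP data
my $z = IO::Compress::Zip->new(\$zipped, Name => "Filename.txt")
    or die "zip failed: $ZipError\n";
$z->print("Hello World\r\n");
$z->close();

# The complete ZIP is now in $zipped, so e.g. a Content-Length
# header can be generated before writing the body out
printf "ZIP is %d bytes\n", length($zipped);
```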
* OTOH, I agree with cavac that if these files are going to be unchanged across multiple downloads, it'd certainly be more efficient to not re-generate them on every request and use appropriate HTTP caching methods instead.
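One possible caching approach, sketched for a CGI-style script (the header values and the use of an MD5 ETag are illustrative assumptions, not the only way to do it): derive an ETag from the content so clients can revalidate with If-None-Match instead of re-downloading.

```perl
use warnings;
use strict;
use Digest::MD5 qw/md5_hex/;

my $body = "zip bytes would go here";          # placeholder content
my $etag = '"' . md5_hex($body) . '"';         # strong validator from the bytes

# CGI-style response headers; a matching If-None-Match request
# could instead be answered with "Status: 304 Not Modified"
print "Content-Type: application/zip\r\n";
print "ETag: $etag\r\n";
print "Cache-Control: max-age=3600\r\n";       # example freshness lifetime
print "Content-Length: ", length($body), "\r\n\r\n";
```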
Update: Minor edits.
In reply to Re: Correct Perl settings for sending zipfile to browser
by haukex
in thread Correct Perl settings for sending zipfile to browser
by Anonymous Monk