With all due respect, I think you should avoid ysth's guess about the matter -- his advice would obliterate the input file, and you don't want that (unless you have a backup copy somewhere).
The problem is that you need to do binmode on STDOUT, as well as on FILE (the input file handle):
#!/usr/local/bin/perl
chdir ("d:\\www\\mypage.net\\www\\test\\");
$zipfile = "zipfile.zip";
print "Content-type:application/x-zip-compressed\n\n";
open FILE, "< $zipfile" or die "Can't open $zipfile: $!";
binmode FILE;
binmode STDOUT; ## add this
$/ = undef; ## use slurp mode for input, too
print <FILE>;
# closing both FILE and STDOUT is automatic at end-of-script
update: If you're going to be dealing with really big zip files (I'm guessing 2 MB might be a practical limit), you might opt for fixed-length reads and writes instead of slurp mode (pulling the whole file into memory and printing it all at once). In any case, don't rely on the default value of $/ (the input record separator), since there's no telling what read/write sizes you'll get that way. Set $/ to a reference to a sensible integer to control the number of bytes read from FILE on each pass:
$/ = \16384; # 16 KB/read
while (<FILE>) {
    print;
}
# It's not a problem if the last "record" read from FILE is <16 KB.
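If you'd rather not fiddle with $/ at all, here's a minimal sketch of the same fixed-length copy using read() with an explicit buffer. The filename and the 16 KB buffer size are just placeholders carried over from the example above; adjust to taste.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $zipfile = 'zipfile.zip';    # placeholder filename (assumption)

open my $fh, '<', $zipfile or die "Can't open $zipfile: $!";
binmode $fh;        # binary-safe input
binmode STDOUT;     # binary-safe output, same as the fix above

# Copy the file to STDOUT in fixed-size chunks; read() returns the
# number of bytes actually read, and 0 at end-of-file, so a short
# final chunk is handled automatically.
my $buf;
while (read($fh, $buf, 16384)) {
    print $buf;
}
close $fh;
```

Same behavior as the $/ = \16384 loop, just with the chunk size visible at the call site instead of tucked into a global variable.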
(another update: I agree with Jaap about the chdir thing.)