OK, I removed the utf8 layer on the writes/reads (that was just a guess while hunting the bug anyway...)
But the problem is getting esoteric now - sometimes it works, sometimes it doesn't!
I wrote a test script that always fetches the same page, writes it to disk, reads it back and compares the two. The data is the same length before and after the store/read cycle (I don't know how to compare them byte by byte).
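For the byte-by-byte comparison, a minimal sketch (the file names and sub name are made up, not from my script): slurp both files in binmode and let Perl's `eq` compare the raw octets.

```perl
use strict;
use warnings;

# Compare two files byte-for-byte, not just by length.
sub files_identical {
    my ($path_a, $path_b) = @_;
    local $/;                        # slurp mode: read whole file at once
    open my $fa, '<', $path_a or die "open $path_a: $!";
    open my $fb, '<', $path_b or die "open $path_b: $!";
    binmode $fa;                     # no line-ending or encoding translation
    binmode $fb;
    my $da = <$fa>;
    my $db = <$fb>;
    return $da eq $db;               # eq compares the raw octets
}
```

(Alternatively, compare `Digest::MD5` digests of the two files.)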
Then I thought it might have to do with writing to disk and reading back too quickly. First I tried a sleep() in between; then I threw File::Sync into the write routine to be sure:
use File::Sync qw(fsync sync);
use IO::Handle;                       # for flush()
open(FILE, '>', $file) or die "err: $!";
binmode FILE;
print FILE $stored_as_file;
FILE->flush;                          # flush perl's own buffer before fsync
fsync(\*FILE) or die "fsync: $!";
close(FILE) or die "close: $!";
sync();
Still, the script randomly gets it right, then doesn't.
Another guess is that it has to do with Compress::Zlib. I use it in my read/write code, and HTTP::Response uses it in the decoded_content() method to decode the gzipped HTTP content. Is it possible that gzip does not flush between uses of the memGzip/memGunzip subs?
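To rule that out, a minimal sketch (the test data is made up, this assumes nothing about my real read/write code): memGzip/memGunzip are one-shot functions, so there is no stream state to flush between calls, and a repeated roundtrip should always restore the input.

```perl
use strict;
use warnings;
use Compress::Zlib qw(memGzip memGunzip);

my $data = "some page content\n" x 100;
my $ok   = 1;
for (1 .. 3) {                        # repeated calls: nothing to flush between them
    my $back = memGunzip(memGzip($data));
    $ok = 0 unless defined $back && $back eq $data;
}
print $ok ? "roundtrip stable\n" : "roundtrip broke\n";
```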
My last guess is really an observation: the partly gzipped content that ends up in the header is always cut at a certain series of characters. Maybe the gzipped content changes from get() to get(), and my script breaks when a certain character shows up in it!? (That would explain the irregular behaviour.) Maybe HTTP::Response::parse() wrongly splits the binary compressed octets of the gzipped part of the HTTP message, so a part of it ends up in the header. Possible?
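That guess can at least be illustrated with a toy sketch (this is not HTTP::Response's actual parser, just a demonstration): gzipped octets can legitimately contain the byte sequence "\r\n\r\n", so any text-style split on "first blank line" that doesn't stop after one match will cut the body.

```perl
use strict;
use warnings;

# Fake compressed body that happens to contain the bytes "\r\n\r\n".
my $binary_body = "\x1f\x8b\x08\x00zzz\r\n\r\nrest-of-compressed-octets";
my $message     = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n" . $binary_body;

# Naive split with no limit: the body is cut at the accidental blank line.
my @parts = split /\r\n\r\n/, $message;
printf "naive split yields %d parts\n", scalar @parts;   # 3, not 2

# Splitting once (limit 2) keeps the binary body intact.
my ($head, $body) = split /\r\n\r\n/, $message, 2;
print "body intact\n" if $body eq $binary_body;
```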