in reply to Re: RFC: Mod Perl compressed content
in thread RFC: Mod Perl compressed content

Disk space is a diminishing concern, but bandwidth is a growing one - why shovel more data across the network than necessary? For a heavily content-oriented site, as opposed to one where the bulk of the bandwidth goes to binary downloads, the gains from compressed delivery are impressive. Wouldn't you like to be able to serve five times as many monthly visitors on the same bandwidth bill?

Makeshifts last the longest.

Re^3: RFC: Mod Perl compressed content
by diotalevi (Canon) on Nov 30, 2002 at 21:15 UTC

    Eh. Different problem. My initial reaction was that there are better ways to solve this; my second reaction is that it might actually be rather nice. I've used mod_gzip for just the sort of thing you mention, but now that I think of it, it would be nicer to store the data pre-compressed: instead of spending CPU time compressing output for the clients that support gzip, you spend it decompressing for the clients that don't. So perhaps what's wanted is a CGI::Gzip which would do the same thing as CGI except handle compressed content nicely. I'm not going to spend the time on it, but if someone else did, maybe PerlIO::Gzip could just swap in the right filtering as needed. A rough sketch of the serving logic follows.
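    A minimal sketch of that idea, assuming a hypothetical pre-compressed file page.html.gz (a placeholder name) and using plain CGI with Compress::Zlib in place of the imagined CGI::Gzip, which doesn't exist:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Compress::Zlib ();

        # Slurp the pre-compressed document (placeholder filename).
        my $file = 'page.html.gz';
        open my $fh, '<', $file or die "Can't open $file: $!";
        binmode $fh;
        my $gz = do { local $/; <$fh> };
        close $fh;

        if (($ENV{HTTP_ACCEPT_ENCODING} || '') =~ /\bgzip\b/) {
            # Client groks gzip: ship the stored bytes untouched.
            print "Content-Type: text/html\n";
            print "Content-Encoding: gzip\n";
            print "Content-Length: ", length($gz), "\n\n";
            binmode STDOUT;
            print $gz;
        }
        else {
            # Older client: spend the CPU on decompression instead.
            my $html = Compress::Zlib::memGunzip($gz)
                or die "memGunzip failed on $file\n";
            print "Content-Type: text/html\n\n";
            print $html;
        }

    Storing only the compressed copy inverts the usual cost: the common case (gzip-capable browsers) becomes nearly free, and only the dwindling minority of old clients costs you CPU.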

    Update: I searched with the wrong keywords. Searching for Apache::Gzip turns up some other finds: Apache::Compress, Apache::Dynagzip, IO::Filter::gzip.

    __SIG__ use B; printf "You are here %08x\n", unpack "L!", unpack "P4", pack "L!", B::svref_2object(sub{})->OUTSIDE;