in reply to RFC: Mod Perl compressed content

This sounds like a job for a larger hard disc. Seriously - while I can imagine some potential solutions, it all sounds like a lot of work that doesn't actually have to happen. It isn't as if disc space is expensive these days. If I were handed this task I'd much prefer to just solve it the easy way, save some time and go for a walk instead. Priorities.

As for actual ideas ... you could write a mod_perl or CGI handler that does all your page serving for you. You'd then do the obvious thing - return compressed content for clients that support it and uncompressed content for those that don't. It doesn't sound like you need rocket science or anything. Or a POE application that can keep some of the processed data cached.
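
A rough sketch of the negotiation, assuming a plain CGI script and using Compress::Zlib to do the deflating - render_page() is a made-up stand-in for whatever actually builds your page:

    #!/usr/bin/perl
    use strict;
    use CGI;
    use Compress::Zlib ();    # memGzip() does the deflating

    # Stand-in for whatever actually builds the page.
    sub render_page { return "<html><body>Hello</body></html>" }

    my $q    = CGI->new;
    my $html = render_page();

    binmode STDOUT;
    if ( ($ENV{HTTP_ACCEPT_ENCODING} || '') =~ /\bgzip\b/ ) {
        # Client groks gzip - compress on the way out.
        print $q->header( -type             => 'text/html',
                          -content_encoding => 'gzip' );
        print Compress::Zlib::memGzip($html);
    }
    else {
        # Client doesn't - send it plain.
        print $q->header( -type => 'text/html' );
        print $html;
    }

The same script runs unchanged under mod_perl via Apache::Registry if you want to skip the fork-per-request overhead.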

__SIG__ use B; printf "You are here %08x\n", unpack "L!", unpack "P4", pack "L!", B::svref_2object(sub{})->OUTSIDE;

Re^2: RFC: Mod Perl compressed content
by Aristotle (Chancellor) on Nov 30, 2002 at 21:01 UTC
    Disk space is a diminishing issue, but bandwidth is a growing one - why shovel more data across the network than necessary? For a heavily content-oriented site, as opposed to one where the bulk of bandwidth is consumed by binary downloads, the gains to be achieved by compressed delivery are impressive. Wouldn't you like to be able to serve five times the number of monthly visitors on the same bandwidth bill?

    Makeshifts last the longest.

      Eh. Different problem. My initial reaction was to think that there are better ways to solve it. My second reaction is that actually this might be rather nice. I've used mod_gzip for just the sort of thing you mention, but now that I think about it, it'd be nice to have the data be pre-compressed: instead of spending CPU time compressing data for the clients that support gzip, spend CPU time decompressing data for the clients that don't. So then perhaps it'd be really nice if there were a CGI::Gzip which did the same thing as CGI except handle compressed content nicely. I'm not going to spend the time on it, but if someone else were to, then maybe PerlIO::Gzip could just swap in the right filtering as needed.
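
      Something like this, as a minimal sketch - the path is made up, and Compress::Zlib stands in for whatever ends up doing the inflating:

          #!/usr/bin/perl
          use strict;
          use CGI ();
          use Compress::Zlib ();    # memGunzip() inflates for old clients

          # Assumption: pages are stored on disk pre-compressed.
          my $file = '/web/pages/index.html.gz';    # made-up path

          open my $fh, '<', $file or die "can't open $file: $!";
          binmode $fh;
          my $gz = do { local $/; <$fh> };    # slurp the gzipped bytes
          close $fh;

          my $q = CGI->new;
          binmode STDOUT;
          if ( ($ENV{HTTP_ACCEPT_ENCODING} || '') =~ /\bgzip\b/ ) {
              # Most clients: ship the stored bytes untouched, zero CPU.
              print $q->header( -type             => 'text/html',
                                -content_encoding => 'gzip' );
              print $gz;
          }
          else {
              # The rare client that can't inflate: do it for them.
              print $q->header( -type => 'text/html' );
              print Compress::Zlib::memGunzip($gz);
          }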

      Update: I searched using the wrong keywords. Searching on Apache::Gzip turns up some other finds, like Apache::Compress, Apache::Dynagzip, and IO::Filter::gzip.

      __SIG__ use B; printf "You are here %08x\n", unpack "L!", unpack "P4", pack "L!", B::svref_2object(sub{})->OUTSIDE;
Re: Re: RFC: Mod Perl compressed content
by simon.proctor (Vicar) on Nov 30, 2002 at 20:56 UTC
    The point here is that the host costs me a certain amount per year. To upgrade the account would cost an additional £450 a year, which is money I don't have. I have shopped around and these guys are very good, and seeing as I have been with them for around 4 years I don't really want to move.

    So my priority is disk space and bandwidth.

    Thanks anyway.
Re: Re: RFC: Mod Perl compressed content
by belg4mit (Prior) on Dec 01, 2002 at 05:29 UTC
    It's actually a very reasonable task. Inflation is much cheaper than deflation, so this should be less CPU-intensive than mod_gzip. And while YMMV, based upon the stats given by the maintainers of mod_gzip (and on my own experience), the vast majority of user-agents can handle gzip content-encoding just fine.

    UPDATE: This should be rather easy as a handler: assuming the content is stored gzipped, check whether the request's Accept-Encoding header includes gzip. If it doesn't, use Compress::Zlib or some such to inflate over the pipe on the fly.
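
    A back-of-the-envelope sketch of that handler under mod_perl 1 - My::Gunzip is a made-up name, and I've reached for Compress::Zlib (rather than Archive::Zip, which deals in .zip archives) since it inflates gzip streams directly:

        package My::Gunzip;    # hypothetical handler name
        use strict;
        use Apache::Constants qw(OK DECLINED);
        use Compress::Zlib ();

        sub handler {
            my $r = shift;

            # Only touch requests that map to pre-gzipped files.
            my $file = $r->filename;
            return DECLINED unless $file =~ /\.gz$/ && -r $file;

            open my $fh, '<', $file or return DECLINED;
            binmode $fh;
            my $gz = do { local $/; <$fh> };
            close $fh;

            $r->content_type('text/html');
            if ( ($r->header_in('Accept-Encoding') || '') =~ /\bgzip\b/ ) {
                # Client handles gzip: pass the stored bytes through.
                $r->header_out('Content-Encoding' => 'gzip');
                $r->send_http_header;
                $r->print($gz);
            }
            else {
                # Inflate on the fly for the few clients that can't.
                $r->send_http_header;
                $r->print(Compress::Zlib::memGunzip($gz));
            }
            return OK;
        }
        1;

    You'd wire it up with SetHandler perl-script and PerlHandler My::Gunzip in the appropriate <Location> or <Files> block.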

    --
    I'm not belgian but I play one on TV.