
I run the Squid proxy server at home, and it has a configuration option to specify, based on a URL regexp, which things to cache and which not to. I suspect most decent proxy programs have a way to do this. I'd suggest having your sysadmin look into it, because odds are PM isn't the only site that your proxy over-aggressively caches.
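For illustration, this is roughly what the stock squid.conf rules look like (Squid 2.x directive names; treat the exact syntax as an assumption for your version):

    # Anything with a query string or under cgi-bin is assumed dynamic,
    # so Squid won't cache it.
    acl QUERY urlpath_regex cgi-bin \?
    no_cache deny QUERY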

/\/\averick
perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"

Re: Re: Re: Page Expiration
by tomhukins (Curate) on Nov 10, 2001 at 00:39 UTC

    Perl Monks isn't the only site that doesn't deal with Web caches properly, but that's no reason for it to violate HTTP standards.

    If an HTTP response makes no statement about what a cache should do with it (no Last-Modified, Expires, or Cache-Control headers), HTTP/1.1 allows caches to store it and assign their own heuristic expiration time, so in practice the document is treated as cacheable.
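
    A dynamic page can opt out of caching explicitly. Here's a minimal sketch using CGI.pm (the header values are standard HTTP; the script itself is illustrative, not PerlMonks' actual code):

        #!/usr/bin/perl -w
        use strict;
        use CGI;

        my $q = CGI->new;

        # Expire the page immediately and forbid caching for both
        # HTTP/1.1 (Cache-Control) and HTTP/1.0 (Pragma) caches.
        print $q->header(
            -type          => 'text/html',
            -expires       => 'now',          # Expires: <request time>
            -Cache_Control => 'no-cache',
            -Pragma        => 'no-cache',
        );

        print $q->start_html('Dynamic page'),
              'Generated at ', scalar localtime,
              $q->end_html;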

    I disagree that cache administrators should be held responsible for creating workarounds for non-compliant sites. Web site/application developers should be responsible for ensuring their code works properly.

      I agree that PM should send the proper headers. That will fix this proxy's problem with PM. Now, what about the other >50% of the sites on the net that don't? Should their sysadmin (or the end user) be burdened with the task of contacting the admins of every site and asking them to fix or add the correct cache headers?

      In a perfect world everyone's headers would be correct; then again, in a perfect world nobody would use MS Word to generate HTML. :) Yes, the correct solution is that every site on the net should send the correct headers, HOWEVER the practical/realistic solution is to configure your cache. I would wager that you can tell >90% of the time whether a URL is dynamic based upon its construction, and for that matter Squid comes out of the box with a set of rules that I've never needed to tweak.
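
      As a rough illustration of classifying a URL by its construction (the patterns below are my own guesses at common conventions, not Squid's actual rules):

          # Heuristic: query strings and CGI-ish paths usually mean dynamic content
          sub looks_dynamic {
              my $url = shift;
              return $url =~ m{\?|/cgi-bin/|\.(?:cgi|pl|php|asp)\b} ? 1 : 0;
          }

          print looks_dynamic('http://example.com/index.pl?node=foo'), "\n";  # 1
          print looks_dynamic('http://example.com/images/logo.gif'),   "\n";  # 0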

      /\/\averick
      perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"

        I agree with everything you're saying here. A workaround is as good as a fix any day -- when it's possible. Just bear in mind that not everyone is the sysadmin of their proxy server.

        In my case, I'd have to contact my site's IT manager and convince him that it's necessary. Then he must convince the corporate IT department manager. Then it gets passed down to the sysadmin, who ignores the request because he wants to know why PerlMonks isn't just sending the correct headers to begin with.

        Maybe I'm cynical, but I've had a lot more experience with my corporate IT department than you have.

        buckaduck