in reply to Re: Re: Re: Page Expiration
in thread Page Expiration

I agree that PM should send the proper headers; that would fix the problem with PM itself. Now, what about the other >50% of the sites on the net that don't? Should their sysadmin (or the end user) be burdened with the task of contacting the admins of every such site and asking them to fix or add the correct cache headers?
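For reference, a dynamic page that should never be served stale out of a shared cache would typically send headers along these lines. This is a hedged sketch of common practice circa HTTP/1.1, not PM's actual output; the `Pragma` line is only there for older HTTP/1.0 caches:

```
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
```

With `Cache-Control: no-cache` a compliant cache must revalidate with the origin server before reusing the response, which is exactly the behavior people are asking PM for.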

In a perfect world everyone's headers would be correct; then again, in a perfect world nobody would use MS Word to generate HTML. :) Yes, the correct solution is for every site on the net to send the correct headers. HOWEVER, the practical/realistic solution is to configure your cache. I would wager that it is possible to tell, >90% of the time, whether a URL is dynamic based on its construction, and for that matter squid comes out of the box with a set of rules that have never needed to be tweaked.
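The "out of the box" rules alluded to here: the stock squid.conf of the squid 2.x era treated any URL that looked dynamic (containing `cgi-bin` or a query string) as uncacheable. A sketch of those default lines, from the squid 2.x distribution of that period, reproduced as an illustration rather than as the exact file contents:

```
# Mark URLs that look dynamic: anything containing "cgi-bin" or a "?"
acl QUERY urlpath_regex cgi-bin \?
# Refuse to cache responses for those URLs
no_cache deny QUERY
```

This is exactly the "tell from the URL's construction" heuristic: it misses dynamic pages with clean URLs (like PM's), which is why sending proper headers is still the right fix on the server side.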

/\/\averick
perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"

Re: (maverick) Re 4: Page Expiration
by buckaduck (Chaplain) on Nov 10, 2001 at 02:40 UTC
    I agree with everything you're saying here. A workaround is as good as a fix any day -- when it's possible. Just bear in mind that not everyone is the sysadmin of their proxy server.

    In my case, I'd have to contact my site's IT manager and convince him that it's necessary. Then he'd have to convince the corporate IT department manager. Then it would get passed down to the sysadmin, who would ignore the request because he wants to know why PerlMonks isn't just sending the correct headers to begin with.

    Maybe I'm cynical, but I've had a lot more experience with my corporate IT department than you have.

    buckaduck