PerlMonks  

Re: Caching Web Pages

by Dog and Pony (Priest)
on Aug 08, 2002 at 09:06 UTC ( [id://188534] )


in reply to Caching Web Pages

Well, of course it is possible to do, but there are a few things you have to think about:
  • If you are using a CGI program that takes any kind of parameters, you will need to cache a copy for each distinct set of parameters.
  • How are you going to determine when a change should occur? By an arbitrarily chosen timespan, or by detecting changes somewhere else?
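To make the two points concrete, here is a minimal sketch of a file-based cache keyed on the query string, expired by a fixed timespan. The cache directory, TTL, and the `build_page()` helper (standing in for whatever your CGI already does) are assumptions for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my $cache_dir = '/tmp/page_cache';    # assumed location
my $ttl       = 600;                  # seconds before a cached copy goes stale

# One cache file per distinct parameter set.
sub cache_key {
    my ($query) = @_;
    return md5_hex($query) . '.html';
}

sub fetch_page {
    my ($query) = @_;
    my $file = "$cache_dir/" . cache_key($query);
    if (-f $file && time - (stat $file)[9] < $ttl) {
        # Fresh enough: serve the cached copy.
        open my $fh, '<', $file or die "open $file: $!";
        local $/;    # slurp mode
        return <$fh>;
    }
    # Stale or missing: rebuild, save, then serve.
    # build_page() is hypothetical - plug in your existing generation code.
    my $html = build_page($query);
    mkdir $cache_dir unless -d $cache_dir;
    open my $fh, '>', $file or die "write $file: $!";
    print $fh $html;
    close $fh;
    return $html;
}
```

Hashing the query string keeps the filename filesystem-safe no matter what the parameters contain; the mtime check is the cheapest possible "arbitrarily chosen timespan" invalidation.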

If you have pages that are generated from, say, a database or something similar, and that don't change all that frequently - by manual or timed updates - I'd suggest that you instead produce a new set of HTML pages from your DB (or whatever it is) upon every change. That is a much simpler approach to reducing load. This assumes that your CGIs don't take any parameters etc., in which case a caching approach is so-so anyway.
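The regeneration approach might look like this sketch - the DSN, the `pages` table with `slug`/`title`/`body` columns, and the output directory are all assumptions, not a real schema. Run it from cron or from whatever code updates the database:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $doc_root = '/var/www/html/pages';    # assumed output directory

# Wrap one row's data in a complete HTML page.
sub render_page {
    my ($title, $body) = @_;
    return "<html><head><title>$title</title></head>\n"
         . "<body>$body</body></html>\n";
}

# Pull every page from the DB and write it out as static HTML.
sub regenerate {
    my ($dbh, $out_dir) = @_;
    my $sth = $dbh->prepare('SELECT slug, title, body FROM pages');
    $sth->execute;
    while (my $row = $sth->fetchrow_hashref) {
        open my $fh, '>', "$out_dir/$row->{slug}.html"
            or die "write $row->{slug}: $!";
        print $fh render_page($row->{title}, $row->{body});
        close $fh;
    }
}

# Connect and regenerate only when given a DSN on the command line,
# e.g.: perl regen.pl dbi:mysql:mysite
if (@ARGV) {
    require DBI;
    my $dbh = DBI->connect($ARGV[0], 'user', 'pass', { RaiseError => 1 });
    regenerate($dbh, $doc_root);
}
```

After a run, the web server serves plain files with no CGI involved at all, which is where the real load reduction comes from.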

