Re: Caching Web Pages

by tadman (Prior)
on Aug 08, 2002 at 10:17 UTC


in reply to Caching Web Pages

One trick I've used is to hide the innards of the CGI system using something like mod_rewrite under Apache, which can blend parameters into the URL path transparently. For example:
http://www.monkstore.com/product/412340/featureXYZ
This could be expanded internally by mod_rewrite into:
http://www.monkstore.com/product.cgi?part=412340&mode=featureXYZ
These re-mapped URLs don't look like CGI output, so they will be cached better. By "better" I merely mean that they look more like regular content and less like CGI output: the end-user can't tell from the URL alone that a CGI produced the page.
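A minimal sketch of such a rule, assuming the hypothetical product.cgi script and parameter names from the URLs above, placed in httpd.conf:

    # Rewrite /product/<part>/<mode> internally onto the real CGI script
    RewriteEngine On
    RewriteRule ^/product/([0-9]+)/(\w+)$ /product.cgi?part=$1&mode=$2 [L]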

To make this fully effective, you also have to tweak the response headers so that the page can be cached. The one you want is the Expires header, such as:
    use CGI;
    my $q = CGI->new();
    print $q->header(-expires => '+9d');   # '+9d' is CGI.pm shorthand for nine days from now
Or whatever you feel is an appropriate expiry date. This will probably require a bit of futzing to get right, especially in the URL department.

The final step would be to layer something like a Squid cache on top of your web server to actually do the caching. There are plenty of examples of how to do that, though, so when you get that far, it should be pretty straightforward.
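Roughly, running Squid as an accelerator in front of Apache comes down to a couple of lines of squid.conf (a sketch only; the hostname and ports here are assumptions, and the exact directives vary by Squid version):

    # Listen on port 80 and fetch cache misses from Apache on port 8080
    http_port 80 accel defaultsite=www.monkstore.com
    cache_peer 127.0.0.1 parent 8080 0 no-query originserver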

The best part of this approach is that you get to decide what's cached, and for how long. Every page is generated through the same interface, as well.
