"be consistent" | |
PerlMonks |
comment on |
( [id://3333]=superdoc: print w/replies, xml ) | Need Help?? |
One trick I've used is to hide the actual innards of the CGI system using something like mod_rewrite under Apache. This can blend parameters into the actual URL transparently, such as:
    http://www.monkstore.com/product/412340/featureXYZ

This might actually be expanded using mod_rewrite into:

    http://www.monkstore.com/product.cgi?part=412340&mode=featureXYZ

These re-mapped URLs don't look like CGI output, so they will be cached better. By "better" I merely mean that they look more like regular content and less like CGI output; in other words, the end-user can't tell it's a CGI from the URL alone.

To fully effect this, you have to tweak some headers so that the page can be cached. I think the relevant one is the "Expires" header, such as:

    Expires: Thu, 01 Jan 2004 00:00:00 GMT

Or whatever you feel is an appropriate expiry date. This will probably require a bit of futzing to get right, especially in the URL department. (Sketches of both a rewrite rule and a CGI that emits such a header are at the end of this note.)

The final step would be to layer something like Squid Cache on top of your web server to actually do the caching. There are plenty of examples of how to do that, though, so when you get that far, it should be pretty straightforward.

The best part of this approach is that you get to decide what's cached, and for how long. Every page is generated using the same interface, as well.
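For what it's worth, here is a minimal sketch of the kind of rewrite rule I mean, for the main server config or a virtual host. The script name (product.cgi) and the part/mode parameters are just lifted from the example URLs above; the pattern and flags are assumptions you'd adjust for your own setup:

    # Hypothetical mapping of pretty URLs onto the real CGI.
    RewriteEngine On

    # /product/412340/featureXYZ  ->  /product.cgi?part=412340&mode=featureXYZ
    # [PT] hands the rewritten URL back to normal URL processing,
    # [L]  stops any further rewriting for this request.
    RewriteRule ^/product/([0-9]+)/([A-Za-z0-9]+)$ /product.cgi?part=$1&mode=$2 [PT,L]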
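And on the CGI side, a minimal sketch of emitting an Expires header with CGI.pm. The one-day lifetime is an arbitrary choice for illustration, not something the approach requires:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;

    # Send an Expires header so browsers and an upstream cache such as
    # Squid are allowed to keep the generated page around.  '+1d' means
    # one day from now; pick whatever expiry suits the content.
    print $q->header( -type => 'text/html', -expires => '+1d' );

    print $q->start_html('Product page'),
          $q->p('Generated page body goes here.'),
          $q->end_html;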
In reply to Re: Caching Web Pages by tadman