Re: CGI Caching question

by Masem (Monsignor)
on Apr 22, 2001 at 06:13 UTC ( [id://74507] )


in reply to CGI Caching question

According to the standard, your HTTP headers should include an Expires line that points to a time in the past to discourage any caching. Fortunately, if you are using CGI.pm, adding -expires => '-1m' to the header() function will do this without problems. Be warned, of course, that the header must be respected by the client for this to work; a homegrown browser, or someone using something like LWP, doesn't have to pay any attention to Expires. You may want to use session ids to track the user and keep them from revisiting any part of your script whose output you are trying to keep out of caches.


Dr. Michael K. Neylon - mneylon-pm@masemware.com || "You've left the lens cap of your mind on again, Pinky" - The Brain

Replies are listed 'Best First'.
Re: Re: CGI Caching question
by asiufy (Monk) on Apr 22, 2001 at 22:58 UTC
    To provide redundancy, use the <META> tag in your HTML. Check here for info:
    http://www.htmlhelp.com/reference/wilbur/head/meta.html
Re: Re: CGI Caching question
by chorg (Monk) on Apr 22, 2001 at 06:18 UTC
    Will that apply to proxy servers as well? I want to prevent caching on all levels...
    _______________________________________________
    "Intelligence is a tool used achieve goals, however goals are not always chosen wisely..."
      Unfortunately, you can't guarantee that at all. The Expires header is defined in the standard, so any proxy should follow it, but as with users and homegrown browsers, they don't have to. I know from recent discussions on my ISP's newsgroups about the possibility of installing a proxy that most proxies used at large-scale sites are homegrown, and they have typically had much trouble with the 80% of web sites out there that *don't* follow the standard.

      Again, falling back on a session id or other tracer should help prevent problems from cached page use.


      Dr. Michael K. Neylon - mneylon-pm@masemware.com || "You've left the lens cap of your mind on again, Pinky" - The Brain

        You wrote that those homegrown proxies had trouble with websites because the websites don't follow the standard?

        Are you sure you don't mean it the other way around, that the proxies have problems because they don't follow the standard?

        Please elaborate!

      I want to prevent caching on all levels...

      Remember, you can't prevent anything. This is the #1 thing that web creators seem unable to understand: once that page leaves your server, it's outta your hands. You don't know who's going to render it, or cache it, or index it, or whatever.

      Basically, my point is that cache headers are suggestions at best. Nobody enforces them, and Lord only knows what mutant and/or broken browsers might be out there.

      xoxo,
      Andy

      # Andy Lester  http://www.petdance.com  AIM:petdance
      %_=split';','.; Perl ;@;st a;m;ker;p;not;o;hac;t;her;y;ju';
      print map $_{$_}, split //,
      'andy@petdance.com'
      
