"be consistent" | |
PerlMonks
Re: Speeding up/parallelizing hundreds of HEAD requests
by eric256 (Parson) on Sep 17, 2007 at 20:17 UTC [id://639480]
Instead of caching on a cron basis, cache on a request basis. The first person's visit will be slow, but for the next X hours every visit will be fast; then one user is slow again. If the links aren't always the same, you might even get to distribute that load over multiple users.

So the flow is: check the database for a link. If it isn't there, request it now and record its status. If it is there, check its expiration; if it has expired, fetch it now. Then use the data in the database to render your page. (The database can be anything persistent between different connections to the web server.)

___________
Eric Hodges
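The flow described above could be sketched in Perl roughly like this, using DBI with SQLite for the persistent store and LWP::UserAgent for the HEAD requests. The table name, database filename, and six-hour TTL are illustrative assumptions, not part of the original suggestion:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use LWP::UserAgent;

my $TTL = 6 * 3600;   # assumed expiry: refresh an entry after six hours

# Any persistent store works; SQLite is just a convenient example.
my $dbh = DBI->connect("dbi:SQLite:dbname=links.db", "", "",
                       { RaiseError => 1, AutoCommit => 1 });
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS link_cache (
        url     TEXT PRIMARY KEY,
        status  INTEGER,
        checked INTEGER          -- epoch seconds of the last HEAD request
    )
});

my $ua = LWP::UserAgent->new(timeout => 10);

# Return the cached HTTP status for $url, issuing a fresh HEAD request
# only when the entry is missing or older than $TTL.
sub link_status {
    my ($url) = @_;
    my ($status, $checked) = $dbh->selectrow_array(
        "SELECT status, checked FROM link_cache WHERE url = ?",
        undef, $url);

    if (!defined $checked || time() - $checked > $TTL) {
        $status = $ua->head($url)->code;   # the slow part, paid once per TTL
        $dbh->do("INSERT OR REPLACE INTO link_cache (url, status, checked)
                  VALUES (?, ?, ?)", undef, $url, $status, time());
    }
    return $status;
}

# First caller pays for the HEAD request; later callers read the cache.
print "$_: ", link_status($_), "\n" for @ARGV;
```

The key point is that expiry is checked lazily at render time, so no cron job is needed: whichever visitor happens to hit a stale entry pays the refresh cost for everyone else.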
In Section: Seekers of Perl Wisdom