There's more than one way to do things
PerlMonks
Re: Predictive HTTP caching in Perl
by kvale (Monsignor) on May 03, 2006 at 04:26 UTC ( [id://547054] )
For RSS feeds, there will be little or no content to cache, so I'd see this approach as a lot of work for uncertain benefit.
Something that will work is parallelizing the retrieval of the pages/feeds. Create an application, say with Parallel::ForkManager, that spawns multiple processes, each one fetching one site and processing it. Then assemble the results from all the children into your composite feed. The total time taken will be only a little longer than that of the slowest website/feed.

-Mark
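A minimal sketch of that idea, assuming LWP::Simple for the fetch and placeholder feed URLs (substitute your own). Each child fetches one URL; the parent collects the bodies through Parallel::ForkManager's run_on_finish callback, which receives the data reference passed to finish():

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;
use LWP::Simple qw(get);

# Hypothetical feed URLs -- replace with your real list.
my @urls = (
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
);

my $pm = Parallel::ForkManager->new(10);   # up to 10 concurrent children

# The parent collects each child's result here; the module serializes
# the reference handed to finish() and delivers it to this callback.
my %content;
$pm->run_on_finish(sub {
    my ($pid, $exit, $url, $signal, $core, $data) = @_;
    $content{$url} = $$data if defined $data;
});

for my $url (@urls) {
    $pm->start($url) and next;   # parent: record ident, move to next URL
    my $body = get($url);        # child: fetch one feed
    $pm->finish(0, \$body);      # child: pass the body back to the parent
}
$pm->wait_all_children;

# %content now maps each URL to its fetched body; build the composite
# feed from it here.
```

The per-child data passing requires a reasonably recent Parallel::ForkManager (0.7.6 or later); on older versions you would write each result to a temporary file instead and have the parent read them back after wait_all_children.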
In Section: Seekers of Perl Wisdom