PerlMonks
Re: Efficient processing of large directory, by jdtoronto (Prior)
on Oct 02, 2003 at 17:28 UTC ( [id://295979] )
I understand your problem; I routinely have directories of around 45,000 text files.
The timeouts can also come from Apache itself; I am not sure of the exact mechanism. When I process large directories I send something to the browser regularly to 'keep it awake' and keep the output flowing, and thus keep Apache from timing the CGI process out. That way I have single CGI scripts that sometimes run for 12 or 13 hours and work nicely.

The other alternative is to have the CGI script launch a separate script that does all the work. The timeout is then no longer an issue, and the worker can do its job quietly in the background. Hope that helps...
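A minimal sketch of the 'keep it awake' approach: turn off output buffering so each print reaches Apache immediately, then emit a short progress line every so often while walking the directory. The directory path and the per-file work here are placeholders, not part of the original post.

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;   # unbuffer STDOUT so each print is flushed to Apache at once

print "Content-type: text/plain\n\n";
print "Processing...\n";

my $dir = '.';    # placeholder; point this at the big directory
opendir my $dh, $dir or die "Cannot open $dir: $!";

my $count = 0;
while ( defined( my $file = readdir $dh ) ) {
    next if $file eq '.' || $file eq '..';
    # ... process "$dir/$file" here ...
    $count++;
    # keep-alive output: a line every 1000 files keeps Apache
    # (and the browser) from giving up on the long request
    print "processed $count files so far\n" if $count % 1000 == 0;
}
closedir $dh;

print "Done: $count files.\n";
```

For the background alternative, the usual Perl idiom is to fork, have the child close its standard handles and detach (e.g. with POSIX::setsid), and let the parent return to Apache immediately; see perlipc for the details.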
In Section: Seekers of Perl Wisdom