PerlMonks |
I understand your problem; I routinely work with directories of around 45,000 text files.
The timeouts can also come from Apache itself; I am not sure of the exact mechanism. But when I process large directories, I send something to the browser regularly to 'keep it awake' and keep the script outputting, so Apache does not time the CGI process out. That way I have single CGI scripts that sometimes run for 12 or 13 hours and work nicely.

The other alternative is to have the CGI script launch a separate script that does all the work. The timeout is then no longer an issue, and that script can do its work quietly in the background.

Hope that helps...

In reply to Re: Efficient processing of large directory
by jdtoronto
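The keep-alive approach might look something like this minimal sketch: unbuffer STDOUT and emit a little output at intervals while scanning the directory. The directory path and the per-1000-files progress dot are assumptions for illustration, not anything from the original post.

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;    # disable output buffering so Apache sees data as it is produced

print "Content-type: text/html\n\n";
print "<html><body><pre>\n";

my $dir = '/path/to/files';    # hypothetical directory of many files
opendir my $dh, $dir or die "Cannot open $dir: $!";

my $count = 0;
while ( my $file = readdir $dh ) {
    next if $file =~ /^\.\.?$/;    # skip . and ..

    # ... process $file here ...

    # emit a progress dot periodically so Apache keeps the
    # connection alive instead of timing the CGI process out
    print "." if ++$count % 1000 == 0;
}
closedir $dh;

print "\nProcessed $count files.\n</pre></body></html>\n";
```

The essential pieces are `$| = 1` and the periodic `print`; without the flush, output sits in a buffer and Apache may still conclude the script has gone quiet.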
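The background-launch alternative can be sketched as a fork-and-detach, assuming a hypothetical `worker.pl` that does the real work. The child must close its standard handles, because Apache holds the request open until the CGI's STDOUT is closed by every process that inherited it.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'setsid';

# Respond to the browser immediately; the heavy lifting happens elsewhere.
print "Content-type: text/html\n\n";
print "<html><body>Job started; check back later for results.</body></html>\n";

defined( my $pid = fork ) or die "fork failed: $!";

if ( $pid == 0 ) {
    # Child: detach from Apache so the request can complete.
    close STDIN;
    close STDOUT;
    close STDERR;
    setsid();    # start a new session, free of the parent's terminal

    # Hand off to the long-running worker (hypothetical path).
    exec '/path/to/worker.pl'
        or die "exec failed: $!";
}

# Parent: falls through and exits, ending the CGI request promptly.
```

Since the worker no longer runs under the CGI timeout, it can take hours; any results would need to be written somewhere the next request can find them, such as a file or database.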