PerlMonks  

Re: Efficient processing of large directory

by jdtoronto (Prior)
on Oct 02, 2003 at 17:28 UTC [id://295979]


in reply to Efficient processing of large directory

I understand your problem; I routinely work with directories of around 45,000 text files.

The timeouts can also come from Apache itself. I am not sure of the exact mechanism, but when I process large directories I send something to the client regularly to 'keep it awake' and keep output flowing - and thus stop Apache from timing the CGI process out. That way I have single CGI scripts that sometimes run for 12 or 13 hours and work nicely.
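A minimal sketch of that keep-alive idea, assuming unbuffered output and a hypothetical process_file() routine standing in for the real per-file work:

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;    # unbuffer STDOUT so each print reaches Apache immediately

print "Content-type: text/html\n\n";
print "<html><body><pre>\n";

opendir my $dh, '/path/to/big/dir' or die "opendir: $!";
my $count = 0;
while ( my $file = readdir $dh ) {
    next if $file =~ /^\.\.?\z/;
    process_file($file);    # hypothetical per-file work
    # emit a byte of progress every 500 files so Apache sees
    # regular output and does not time the request out
    print '.' if ++$count % 500 == 0;
}
closedir $dh;

print "\nDone: $count files\n</pre></body></html>\n";
```

The essential parts are `$| = 1` and the periodic print; everything else is just a plain directory walk.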

The other alternative is to have the CGI script launch a separate script that does all the work. That way the timeout is no longer an issue: the worker can run quietly in the background while the CGI script returns immediately.
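One way to sketch that hand-off, assuming a hypothetical worker.pl and a POSIX-ish system (the fork-and-detach details vary by platform):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

print "Content-type: text/plain\n\n";
print "Job started; check back later for results.\n";

# Fork a child so Apache can finish the request immediately.
defined( my $pid = fork() ) or die "fork: $!";
if ( $pid == 0 ) {
    # Child: detach from the web server's session and filehandles,
    # then replace ourselves with the long-running worker.
    setsid();
    open STDIN,  '<',  '/dev/null'       or die "reopen STDIN: $!";
    open STDOUT, '>',  '/tmp/worker.log' or die "reopen STDOUT: $!";
    open STDERR, '>&', \*STDOUT          or die "reopen STDERR: $!";
    exec '/usr/local/bin/worker.pl'      # hypothetical worker script
        or die "exec: $!";
}
# Parent: fall through and exit, completing the HTTP response at once.
```

Closing the standard handles matters: if the child keeps Apache's STDOUT open, the server may hold the connection until the worker finishes, defeating the point.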

Hope that helps...

jdtoronto

