Re: Memory usage and perl

by janx (Monk)
on Mar 15, 2003 at 00:44 UTC


in reply to Memory usage and perl

A really cheesy way of limiting the damage would be the following (if your algorithm allows it, of course):

  • Monitor the memory usage of your program and note the point at which it gets critical. I once had this problem with the 2 gig limit on allocatable memory. Stupid Linux ;-)
  • Then rewrite your program to exit with a distinctive exit code just before it reaches the point (a certain number of processed data rows, say) after which things get tricky. A sketch of this follows the list.
  • Wrap your program in a little shell script which checks the exit code and calls your program again if applicable, with the right parameters to tell it to skip the already processed data. A sample wrapper appears a bit further down.
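
A minimal sketch of the first two steps might look like the following. To be clear, the munch_data.pl name, the --skip flag, the checkpoint file, the threshold, and exit code 42 are all invented here for illustration, and reading /proc/self/status is Linux-specific:

    #!/usr/bin/perl
    # munch_data.pl -- sketch of the bail-out-and-restart trick.
    use strict;
    use warnings;

    # Rows already handled by a previous run (passed by the wrapper).
    my $skip = 0;
    $skip = $1 if @ARGV and $ARGV[0] =~ /^--skip=(\d+)$/;

    my $limit_kb = 1_500_000;    # bail out well before the 2 gig ceiling

    # Linux-specific: read our own virtual memory size from /proc.
    sub vm_size_kb {
        open my $fh, '<', '/proc/self/status' or return 0;
        while (<$fh>) {
            return $1 if /^VmSize:\s+(\d+)\s+kB/;
        }
        return 0;
    }

    my $row = 0;
    while (my $line = <STDIN>) {
        $row++;
        next if $row <= $skip;            # already done by a previous run

        # ... your real per-row work goes here ...

        if ($row % 1000 == 0 and vm_size_kb() > $limit_kb) {
            open my $cp, '>', 'checkpoint' or die "checkpoint: $!";
            print $cp "$row\n";           # remember how far we got
            close $cp;
            exit 42;                      # distinctive code: "restart me"
        }
    }
    exit 0;                               # all rows done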

By doing so you limit the maximum memory usage of any one invocation of your script, since each restart begins with a fresh process.
Of course, this only works if your algorithm/task at hand allows the work to be divided up this way.
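
The wrapper itself can be tiny. Again just a sketch, assuming the munch_data.pl script, the checkpoint file, and exit code 42 invented above:

    #!/bin/sh
    # Re-run the script for as long as it asks to be restarted
    # (exit code 42), feeding the checkpoint back in as --skip.
    skip=0
    while :; do
        ./munch_data.pl --skip=$skip < big_input.dat
        status=$?
        [ $status -ne 42 ] && exit $status  # 0 = finished, else real error
        skip=`cat checkpoint`               # resume past the processed rows
    done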

This has been working pretty well for me. Of course, then you can always distribute the work over a load of machines, if you feel the urge. ;-)

janx
