
Re: Memory usage and perl

by janx (Monk)
on Mar 15, 2003 at 00:44 UTC ( #243221=note )

in reply to Memory usage and perl

A really cheesy way of limiting the damage would be (if your algorithm allows you to do that, of course):

  • Monitor the memory usage of your program and note the point at which it gets critical. I once ran into this with the 2 gig limit on allocatable memory. Stupid Linux ;-)
  • Then rewrite your program to exit with a specific exit code once it reaches that point (a certain number of processed data rows, say), just before things get tricky.
  • Wrap your program in a little shell script that checks the exit code and calls your program again if applicable, passing the right parameters to tell it to skip the already-processed data.

By doing so you limit the maximum memory usage of a single invocation of your script.
Of course this only works if your algorithm/task allows the work to be divided up this way.
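A minimal sketch of that wrapper loop, with a few assumptions on my part: the worker is simulated by a shell function standing in for your Perl script, the exit code 42 means "restart me", and progress is tracked in a `rows_done.txt` state file (all hypothetical names):

```shell
#!/bin/sh
# Sketch of the restart-wrapper idea. worker() stands in for something
# like: perl yourscript.pl --skip "$1"
# Assumed convention: exit 0 = all done, exit 42 = hit the memory
# limit, restart me; anything else = real error.

STATE_FILE=rows_done.txt
echo 0 > "$STATE_FILE"

worker() {
    # Pretend each invocation processes 1000 rows before the
    # memory ceiling forces it to bail out.
    done_rows=$(( $1 + 1000 ))
    echo "$done_rows" > "$STATE_FILE"
    [ "$done_rows" -ge 3000 ] && return 0   # finished everything
    return 42                               # more work left, restart
}

SKIP=0
while :; do
    worker "$SKIP"
    STATUS=$?
    if [ "$STATUS" -eq 0 ]; then
        echo "all done after $(cat "$STATE_FILE") rows"
        break
    elif [ "$STATUS" -eq 42 ]; then
        # Pick up where the last invocation left off.
        SKIP=$(cat "$STATE_FILE")
        echo "restarting, skipping $SKIP rows"
    else
        echo "worker failed with status $STATUS" >&2
        exit "$STATUS"
    fi
done
rm -f "$STATE_FILE"
```

Since each run starts with a fresh process, all memory allocated by the previous invocation is returned to the OS for free.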

This has been working pretty well for me. Of course, then you can always distribute the work over a load of machines, if you feel the urge. ;-)

