in reply to Memory usage and perl
A really cheesy way of limiting the damage would be (if your algorithm allows you to do that, of course):
- Monitor the memory usage of your program and note the point at which it gets critical. I once ran into this with the 2 GB limit on allocatable memory per process. Stupid Linux ;-)
- Then rewrite your program to exit with a distinctive exit code just before it reaches the point (a certain number of processed data rows, say) after which things get tricky; a minimal Perl sketch follows this list.
- Wrap your program in a little shell script which checks the exit code and, if applicable, calls your program again with the right parameters so that it skips the already processed data (see the wrapper sketch further down).
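
To make the second step concrete, here is a minimal Perl sketch. Everything specific in it is made up for illustration: the --skip option, the 100_000-row chunk size, and exit code 42 as the "restart me" signal are conventions of this example, not anything standard.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Getopt::Long;

    # --skip, the chunk size, and exit code 42 are conventions invented
    # for this sketch, not a standard interface.
    my $skip = 0;
    GetOptions('skip=i' => \$skip) or die "usage: $0 [--skip N] datafile\n";
    my $file = shift or die "usage: $0 [--skip N] datafile\n";

    my $max_rows = 100_000;   # the checkpoint you found by monitoring memory
    my $row      = 0;

    open my $fh, '<', $file or die "can't open $file: $!";
    while (my $line = <$fh>) {
        $row++;
        next if $row <= $skip;    # fast-forward past already processed rows

        # ... process $line, build up your data structures, etc. ...

        # Bail out before memory gets critical; 42 tells the wrapper
        # "not done yet, restart me with a bigger --skip".
        exit 42 if $row - $skip >= $max_rows;
    }
    exit 0;   # all rows processed, normal completion

Restarting in a fresh process is what actually frees the memory, by the way; a running perl generally doesn't hand freed memory back to the operating system.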
By doing so you limit the maximum memory usage of one invocation of your script.
Of course this only works if your algorithm/task at hand allows the work to be divided up like this.
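
And a matching wrapper for the third step, as a plain /bin/sh sketch; process.pl is a hypothetical name, and the chunk size and exit code have to match whatever your program actually uses.

    #!/bin/sh
    # Rerun process.pl (hypothetical name) as long as it exits with 42,
    # bumping --skip by one chunk each time so no row is handled twice.
    skip=0
    chunk=100000              # must match $max_rows in the program

    while :; do
        perl process.pl --skip "$skip" data.txt
        rc=$?
        if [ "$rc" -eq 42 ]; then
            skip=$((skip + chunk))   # more data left; relaunch a fresh process
        elif [ "$rc" -eq 0 ]; then
            break                    # finished cleanly
        else
            echo "process.pl failed with exit code $rc" >&2
            exit "$rc"
        fi
    done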
This has been working pretty well for me. And once the work is divided up like this, you can always distribute it over a load of machines, if you feel the urge. ;-)
janx