note
janx
A really cheesy way of limiting the damage would be (if your algorithm allows you to do that, of course):<p>
<ul>
<li>Monitor the memory usage of your program and note the point at which it gets critical. I once had this problem with the 2 GB limit on allocatable memory. Stupid Linux ;-)
<li>Then rewrite your program so that it exits with a specific exit code as soon as it reaches the point (a certain number of processed data rows, say) after which things get tricky.
<li>Wrap your program in a little shell script that checks the exit code and, if applicable, calls your program again, of course with the right parameters to tell it to skip the already processed data (see the sketch below this list).
</ul>
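<p>A minimal sketch of such a wrapper (the program name <code>process_data</code>, its <code>--skip</code>/<code>--max-rows</code> switches and the exit code 75 are made up for illustration; adapt them to whatever your program actually understands):
<code>
#!/bin/sh
# Hypothetical wrapper: process_data is assumed to exit with code 75
# ("more work left") after handling CHUNK rows, and with 0 when it is done.
CHUNK=100000
OFFSET=0

while : ; do
    ./process_data --skip "$OFFSET" --max-rows "$CHUNK"
    STATUS=$?
    if [ "$STATUS" -eq 75 ]; then
        # Not finished yet: restart, skipping the rows already processed.
        OFFSET=$((OFFSET + CHUNK))
    elif [ "$STATUS" -eq 0 ]; then
        echo "All data processed."
        break
    else
        echo "process_data failed with status $STATUS" >&2
        exit "$STATUS"
    fi
done
</code>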
<p>By doing so you limit the maximum memory usage of a single invocation of your script.<br>
Of course this only works if your algorithm/task at hand allows dividing up the work like this.
<p>
This has been working pretty well for me. Of course, then you can always distribute the work over a load of machines, if you feel the urge. ;-)
<p>