I have a little script that is running my computer out of memory, and I was wondering if there might be a way around this problem.
Someone a few days ago mentioned something called forking.
Whatever you do, forking is probably not going to fix an "out of memory" problem. Why? Because at the moment you fork, you make an exact clone of your process. If your original process is using, say, 221MB of memory, then the child process will also be using 221MB.
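To illustrate why forking doesn't help, here is a minimal sketch (in Python for illustration, assuming a POSIX system, since `os.fork` is not available on Windows). The child starts with its own copy of the parent's data, so freeing memory in the child does nothing for the parent. On modern kernels the pages are actually shared copy-on-write, but that only delays the duplication; it doesn't shrink the overall footprint.

```python
import os

# A stand-in for the script's big data structure.
big = list(range(1_000_000))

pid = os.fork()
if pid == 0:
    # Child: it sees a full copy of `big`. Emptying it here frees
    # memory only in the child's address space, not in the parent's.
    big.clear()
    os._exit(0)

# Parent: wait for the child, then confirm our copy is untouched.
os.waitpid(pid, 0)
print(len(big))  # → 1000000
```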
Is your script running in circles, and only running out of memory after a long time? Then a better strategy is to regularly reclaim memory inside the big loop: either restart the program (launch a fresh copy of itself and then exit), or undef your big data structures once you are done with them.
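The "wipe your memory in the big loop" idea can be sketched like this (again in Python for illustration; `process` and the flush step are hypothetical stand-ins for whatever the real script does). The point is that the accumulating structure is emptied periodically, so memory use stays bounded instead of growing with every iteration.

```python
def process(batch):
    # Stand-in for the real per-iteration work: square each item.
    return [x * x for x in batch]

results = []
for i in range(10):
    results.extend(process(range(100)))
    # Periodically flush and drop the accumulated data so the
    # structure never grows without bound.
    if len(results) >= 300:
        # ... write `results` out here (to disk, a database, etc.) ...
        results.clear()

print(len(results))  # → 100 (only the iterations since the last flush)
```

The full-restart variant replaces the running process with a fresh copy of itself, e.g. `os.execv(sys.executable, [sys.executable] + sys.argv)` on POSIX, which gives you a brand-new address space at the cost of losing all in-memory state.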