http://qs321.pair.com?node_id=324346


in reply to File::Find memory leak

There's no such thing as a "garbage collection" module. Perl does its own garbage collection using reference counting, and if something is getting lost there's not much you can do about it (aside from fixing the leaky code).

If you can't find and fix the leak, you'll probably have to fork() a sub-process to do whatever leaks, pass the results up to the parent via a pipe or a temp file, and then exit() the child. When the child exits, any memory it used is reclaimed by the operating system. I've used this technique before with leaky Perl modules. Give it a try and post again if you have trouble.
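The fork-and-pipe technique above can be sketched roughly like this. The sub name run_in_child and the sample task are made up for illustration; the point is that anything the task leaks dies with the child process:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run a (possibly leaky) task in a forked child, collecting its
# line-oriented results through a pipe. When the child exits, the OS
# reclaims all of its memory, leak or no leak.
sub run_in_child {
    my ($task) = @_;

    pipe(my $reader, my $writer) or die "pipe failed: $!";

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {                  # child: do the leaky work
        close $reader;
        my @results = $task->();      # any memory leaked in here...
        print {$writer} "$_\n" for @results;
        close $writer;
        exit 0;                       # ...is freed when the child exits
    }

    close $writer;                    # parent: read results from the pipe
    chomp(my @results = <$reader>);
    close $reader;
    waitpid($pid, 0);                 # reap the child to avoid zombies
    return @results;
}

# Hypothetical usage: the task here just returns two file names.
my @files = run_in_child(sub { return ('a.txt', 'b.txt') });
print "$_\n" for @files;
```

The same shape works with a temp file in place of the pipe; the pipe just avoids cleanup.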

-sam

PS: The above suggestion assumes you're working on a Unix system. I imagine things are different in Windows-land, where fork() is emulated with threads and exit() probably doesn't free memory.

Replies are listed 'Best First'.
Re: Re: File::Find memory leak
by crabbdean (Pilgrim) on Jan 27, 2004 at 04:21 UTC
    Thanks Sam, that was exactly my thinking. Great minds! If the fork doesn't work, a simpler possible alternative is to write a main script that does all the logging, plus a second script containing the "File::Find" code that is called each time a user's directory is traversed. That way the memory File::Find uses is freed every time the second script exits. I'll let you know the results.
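    A rough sketch of that two-script idea, assuming a helper script (here called find_user.pl, a made-up name) that runs File::Find over a single directory and prints what it finds:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Parent script: walks the list of user directories and runs the
# File::Find work in a separate helper script per directory, so the
# helper's memory is returned to the OS after each directory.
# "find_user.pl" and the /home/* glob are assumptions for illustration.

my @user_dirs = glob('/home/*');

for my $dir (@user_dirs) {
    # Backticks run the helper in its own process and capture its
    # output; when the helper exits, everything it allocated is freed.
    my @results = `perl find_user.pl $dir`;
    chomp @results;
    # ... do the logging with @results here ...
}
```

    This trades one process per directory for a flat memory profile in the long-running parent.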

    The "perltodo" manual page says some garbage-collection work remains to be done in a future Perl.

    Thanks
    Dean