Re: File::Find memory leak

by BrowserUk (Patriarch)
on Jan 27, 2004 at 06:11 UTC ( #324360=note )

in reply to File::Find memory leak

Using 5.8.2 (AS808) on XP, and processing a little over 200_000 files, I see a growth pattern of around 22k per iteration, or maybe 10 bytes per file.
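For anyone wanting to reproduce the measurement, a minimal sketch of the kind of loop involved, assuming a hypothetical test directory and using the XP-era `tasklist` command as a very rough memory probe (both are placeholders, not the code I actually ran):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $dir = 'C:/some/large/tree';    # hypothetical test directory

sub proc_size {
    # Crude probe: ask Windows for this process's tasklist line,
    # which ends in the working-set size.  Adjust for your platform.
    my ($line) = grep { /perl/i } `tasklist /FI "PID eq $$" /NH`;
    chomp $line if defined $line;
    return defined $line ? $line : 'size unknown';
}

for my $iter ( 1 .. 10 ) {
    my $count = 0;
    find( sub { $count++ }, $dir );    # same scan every pass
    print "iter $iter: $count files; ", proc_size(), "\n";
}
```

Watching the reported size across identical passes is what shows the ~22k-per-iteration growth.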

If I fork each iteration of the search, the growth actually increases slightly, to about 31k per iteration over the same 205,428 files.
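The fork-per-iteration variant looks roughly like this sketch (directory again a placeholder). Note that on Win32, Perl's fork is emulated with interpreter threads sharing one process heap, which is consistent with the growth not going away:

```perl
use strict;
use warnings;
use File::Find;

for my $iter ( 1 .. 10 ) {
    defined( my $pid = fork() ) or die "fork failed: $!";
    if ( $pid == 0 ) {                 # child: do one full scan
        my $count = 0;
        find( sub { $count++ }, 'C:/some/large/tree' );
        print "pass $iter: $count files\n";
        exit 0;                        # child's allocations die with it
    }
    waitpid( $pid, 0 );                # parent: wait, then repeat
}
```

On a real (POSIX) fork the child's heap would be reclaimed by the OS on exit; under the Win32 emulation the pseudo-process shares the parent's heap, so fragmentation can still accumulate.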

Doing a crude comparison of heap dumps taken before and after an iteration, it appears the leakage isn't due to something not being freed, but to fragmentation of the heap: larger entities are freed and their space partly re-used for smaller things, so the heap has to grow again the next time one of the larger entities needs to be allocated.

Note: The comparison was very crude...with something like 12,000 individual blocks on the heap, it had to be :)

Having the script exec itself after each iteration does stop the growth, but whether that is practical will depend upon the nature and design of your program.
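The exec-yourself workaround can be sketched like this: the process replaces itself between passes, so a completely fresh heap starts each iteration, and the pass counter is carried in @ARGV (directory and stop condition are placeholders):

```perl
use strict;
use warnings;
use File::Find;

my $iter = @ARGV ? shift : 0;
exit 0 if $iter >= 10;               # arbitrary stop condition

my $count = 0;
find( sub { $count++ }, 'C:/some/large/tree' );   # hypothetical dir
print "pass $iter: $count files\n";

# Re-exec this same script ($0) under the same perl ($^X),
# discarding the fragmented heap entirely.
exec( $^X, $0, $iter + 1 ) or die "exec failed: $!";
```

The obvious cost is that nothing survives between passes except what you pass on the command line (or persist to disk), which is why whether this is practical depends on the program's design.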

Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
Timing (and a little luck) are everything!
