in reply to Tracking Memory Leaks
Perl's memory usage should flat-line ... if you programmed it right. Some things to check:
- Are you continuously adding elements to a hash or array? For example, if you keep doing something like $foo[$i++] = $bar;, you keep telling Perl to increase the size of @foo.
- Do you have any circular references? They are a cause of true memory leaks.
- Are you scoping things as much as possible? Do you have any large global hashes/arrays that you could scope a little smaller?
- Are you constantly loading new classes/modules? I was on a project for an OO application that would dynamically load the classes it needed as it needed them. Every time that happened, the memory usage would increase, but only by a fixed amount.
- Are you fooling around with deleting from %INC and @INC? I'm not sure what that would do, but it probably won't actually free the memory from the initial require.
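On the circular-reference point above: Scalar::Util's weaken() (the mechanism behind WeakRef) breaks a cycle by making one link not count toward the reference count. A minimal sketch, using a made-up Node class with a destructor counter to show that the objects really do get freed:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

package Node;
my $live = 0;                       # how many Node objects currently exist
sub new     { $live++; bless {}, shift }
sub DESTROY { $live-- }
sub live    { $live }

package main;
{
    my $parent = Node->new;
    my $child  = Node->new;
    $parent->{child}  = $child;     # parent -> child
    $child->{parent}  = $parent;    # child -> parent: a reference cycle
    weaken( $child->{parent} );     # weak link: doesn't hold a refcount
}
# Without weaken() the two objects would keep each other alive forever;
# with it, both are destroyed when the block exits.
print Node::live(), "\n";           # prints 0
```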
Re: Re: Tracking Memory Leaks
by Hrunting (Pilgrim) on Aug 15, 2001 at 01:24 UTC
Heh, well, we do all those things, but they should be controlled.
- Yes, in various places, but they should all be limited in one way or another (either they go out of scope at various points or they only have a limited number of elements in them).
- We do have one module that uses circular references, but since it uses WeakRef, that shouldn't be a problem.
- Yes, scoping on the project is very tight.
- This is the thing that I'm worried about. We're constantly loading new classes, dynamically created from database information. Now, theoretically, we have a finite number of classes, so the memory usage from loading them should itself be finite (and that limit should be hit rather quickly). However, I'm not positive this is the case. Anyone know much about the finer points of this?
- No.
Thanks for the help.
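On the dynamic-class worry above: one way to keep memory bounded is to make sure each generated class is compiled exactly once, no matter how often it is requested. A sketch, assuming a hypothetical load_class() that receives generated Perl source (the class name and source here are made up):

```perl
use strict;
use warnings;

my %compiled;   # class name => 1 once its source has been eval'd

# Compile a dynamically generated class at most once. Re-eval'ing the
# same source for the same class would compile a fresh set of subs
# each time, and that memory would never be reclaimed.
sub load_class {
    my ($class, $source) = @_;
    return if $compiled{$class};    # already compiled: just reuse it
    eval $source;
    die $@ if $@;
    $compiled{$class} = 1;
}

load_class('My::Row', 'package My::Row; sub table { "row" } 1;');
load_class('My::Row', undef);   # no-op: nothing is recompiled
print My::Row->table, "\n";     # prints "row"
```

With this guard in place, memory from class loading tops out once every distinct class has been seen, matching the "finite limit, hit quickly" expectation in the post.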
How does the interpreter determine whether a lexical is no longer "used"? Do you have to explicitly set it to undef? That doesn't seem like much of a "feature". So my memory usage won't really top out until all my functions have been called and all my variables used, is that correct?
I didn't know that about AUTOLOAD(). What do you mean by 'allocate additional memory every time'? Does that mean that every time I effectively call AUTOLOAD() (in my case, once for each undeclared function, as I use AUTOLOAD() to then declare the function), the interpreter allocates a chunk of memory? Or just that every time I load a module with an AUTOLOAD() in it, it will allocate a larger chunk of memory than it would for a regular module?
I know I miss out on compile-time optimizations for the system, but do those optimizations involve the re-use of resources, as opposed to the allocation of resources (which I know they involve)? I would expect that resource re-use would be a function of the running system, not the compile-time optimizations. If perl allocates memory for eval'd statements and AUTOLOAD()ed subroutines and then doesn't re-use it, that sounds like a pretty serious issue.
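On the "explicitly set it to undef" question above: a sub's lexicals normally hold their storage until the sub exits (and Perl keeps the freed memory for reuse rather than returning it to the OS). You can drop a large lexical early with undef. A small sketch, with an invented process_file():

```perl
use strict;
use warnings;

sub process_file {
    my @lines = ('x' x 1024) x 10_000;  # large working set (~10 MB)
    my $count = scalar @lines;
    # Done with the data: drop it now instead of waiting for the sub
    # to exit, so Perl can reuse the memory right away.
    undef @lines;
    return $count;
}

print process_file(), "\n";   # prints 10000
```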
We're constantly loading new classes, dynamically created from database information. Now, theoretically, we have a finite number of classes, so the memory usage from loading them should itself be finite (and that limit should be hit rather quickly). However, I'm not positive this is the case. Anyone know much about the finer points of this?
Are you cleaning up the symbol tables after you're done using these new classes, or are they being re-used? We found in Class::Prototyped that each new class takes up around 1.5-2K of memory; more (of course) with methods. You can clean up the symbol table when you're done with code like this (assumes $package does not contain '::'):
no strict 'refs';
foreach my $key ( keys %{"$package\::"} ) {
    delete ${"$package\::"}{$key};
}
# this only works because we're not a multi-level package:
delete( $main::{"$package\::"} );
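For multi-level package names, the core Symbol module's delete_package() handles the nested stashes for you, so you don't need the single-level caveat above. A sketch with a throwaway package name:

```perl
use strict;
use warnings;
use Symbol qw(delete_package);

# Create a nested package on the fly, then remove it entirely.
eval 'package My::Temp::Class; sub hello { "hi" } 1;' or die $@;
print My::Temp::Class->hello, "\n";                            # prints "hi"

delete_package('My::Temp::Class');   # wipes the whole symbol table
print My::Temp::Class->can('hello') ? "still there\n" : "gone\n";
```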