
Memory usage and perl

by Anonymous Monk
on Mar 14, 2003 at 13:44 UTC

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

How can I free Perl data from memory? I had understood that commands like delete, shift and undef do this, but it doesn't seem to be so. Test code below (it uses about 170 MB of memory); I watched memory usage with gtop. What happens to memory after it has been released inside Perl (shift/undef/delete etc.)? Can it be reused by other apps? Perl itself might be able to reuse it, but I haven't tested enough to say anything more certain.

#!/usr/bin/perl
#
# Testing memory usage...

use warnings;
use strict;

my (%hash, @array);
my ($i, $j) = (0, 0);
$| = 1;

while ($i < 2000) {
    while ($j < 1000) {
        push (@{$array[$i]}, $j);
        push (@{$hash{$i+1}}, $j);
        $j++;
    }
    $i++;
    $j = 0;
    print "\r$i";
}
print "\nLoaded. Sleeping little.\n";
sleep 15;

print "Copying for undef.\n";
my %new_hash = %hash;

print "Deleting hash.\n";
$i = 0;
foreach (sort {$a <=> $b} keys %hash) {
    delete $hash{$_};
    $i++;
    print "\r$i";
}
print "\nDone, sleeping\n";
sleep 10;

print "Shifting array.\n";
$i = 0;
foreach (@array) {
    shift @array;
    $i++;
    print "\r$i";
}
print "\nDone. Sleeping.\n";
sleep 10;

print "Undefing new_hash.\n";
undef %new_hash;
print "Sleeping.\n";
sleep 10;

print "Exit.\n";
exit 0;

Replies are listed 'Best First'.
Re: Memory usage and perl
by broquaint (Abbot) on Mar 14, 2003 at 13:52 UTC
    What happens to memory after it has been released inside Perl (shift/undef/delete etc.)?
    It is 'put back' into the memory pool perl uses when allocating memory. So when you undef something, the memory that it used becomes available again to be re-allocated.
    Can it be reused by other apps?
    No, only once the process has exited will the memory be available to other apps. This is an OS issue, not a perl issue. See the recent Memory leak when using hash 'references' for more info on the topic.
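
    A minimal way to watch this from inside a script (a sketch only, assuming Linux where /proc/self/status is readable) is to print the process's VmSize around the allocations; the undef barely shrinks it, but a second allocation of the same size barely grows it either, because perl reuses its own pool:

    #!/usr/bin/perl
    # Sketch: report this process's virtual size (VmSize) before and after
    # freeing a large structure. Assumes Linux and /proc/self/status.
    use strict;
    use warnings;

    sub vmsize {
        open my $fh, '<', '/proc/self/status' or return 'n/a';
        while (<$fh>) { return "$1 kB" if /^VmSize:\s+(\d+)/ }
        return 'n/a';
    }

    print "start:         ", vmsize(), "\n";

    my @big = (1 .. 1_000_000);
    print "after alloc:   ", vmsize(), "\n";

    undef @big;                        # back to perl's pool, not to the OS
    print "after undef:   ", vmsize(), "\n";

    @big = (1 .. 1_000_000);           # reuses the pool, so little extra growth
    print "after realloc: ", vmsize(), "\n";
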
    HTH

    _________
    broquaint

      To add to this:
      No, only once the process has exited will the memory be available to other apps.
      This is correct, but needs to be qualified. Physical memory (actual RAM) taken up by variables (or rather, by the memory pages holding them) that are no longer used can be reclaimed: the pages containing the unused variables get swapped out to disk if some other process needs physical ('in core') memory. But the variables do stay in swap, and are only removed from there when the Perl process exits.

      CU
      Robartes-

Re: Memory usage and perl
by Hena (Friar) on Mar 14, 2003 at 15:16 UTC
    I posted this, but forgot to login. So...

    Anyway, I checked the other thread; note that I'm using Red Hat with the perl-5.6.1-34.99.6 RPM, so a different OS.

    There was also mention of a garbage collector and a memory pool for perl. So there is no way of clearing this "pool" from within Perl? Meaning that if I have a Perl script running for a long time and at some point it takes up a lot of memory, it will keep that memory until the perl process itself exits?

      What broquaint and robartes are trying to say is that the "memory pools" used by Perl are allocated in virtual memory while only what is known as the working set of memory pages is actually mapped into physical memory. Let me explain this a bit better...

      In *nix and in general, in any virtual memory-enabled OS I know, processes are allocated a chunk of virtual memory. This refers to memory that is known to belong to this process (and perhaps shared by others as well, but this is not relevant now).

      The pages of memory that compose this virtual memory need to be stored somewhere, and there are two places to do so: your computer's RAM (also called 'in core' memory, because old memory technologies used magnetic cores to store ones and zeros) and a secondary device such as a disk (memory stored there is said to be paged out or swapped).

      To accomplish this feat, the OS allocates some chunk of disk for this purpose. Some do that on demand, some as a pre-allocation, but this is not too important for this matter. The point is that seldom-used memory ends up on disk, not in your core memory. So you can have a machine with, say, 128 MB of RAM and a 2 GB swap partition. Your Perl process could reach 600 MB of virtual size yet have a much smaller working set (say, 60 MB), and that working set is the only chunk of its address space that ends up mapped into core memory.

      None of the OSes I know give the process the capability of returning unused memory, so as broquaint said, this is an OS issue and not a Perl issue.

      Of course, all this is from the OS's perspective. Perl's memory management has a more detailed view of what's going on: it simply marks chunks of memory as used or unused, reusing the previously unused blocks to keep from requesting more from the OS. This is a common strategy that works well in 99% of the practical scenarios you will find.

      I commonly run Perl tasks that can take up to 900 MB of virtual space, yet span a working set of a few tens of megabytes. If you add up the virtual sizes of a couple of them, you far exceed the real memory of the machine where they run, so as you see, this is not a real problem. However, if your program lacks a property called locality of reference, this whole scheme will work against you. Your only solution then, short of rethinking your algorithm, is adding more real memory to your machine.
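
      To see the difference between virtual size and working set on a Linux box, a rough sketch like this (it assumes /proc/self/status is readable) prints both VmSize and VmRSS; under memory pressure the rarely touched pages get paged out and the resident figure drops well below the virtual one:

      #!/usr/bin/perl
      # Sketch: compare the process's virtual size (VmSize) with its
      # resident working set (VmRSS). Assumes Linux and /proc/self/status.
      use strict;
      use warnings;

      sub mem_field {
          my ($field) = @_;
          open my $fh, '<', '/proc/self/status' or return 'n/a';
          while (<$fh>) { return "$1 kB" if /^\Q$field\E:\s+(\d+)/ }
          return 'n/a';
      }

      # Allocate a lot of data, but keep touching only a tiny slice of it.
      my @big = map { [ (0) x 100 ] } 1 .. 100_000;
      my $sum = 0;
      $sum += $big[ $_ % 1000 ][0] for 1 .. 1_000_000;

      print "VmSize (virtual):  ", mem_field('VmSize'), "\n";
      print "VmRSS  (resident): ", mem_field('VmRSS'),  "\n";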

      Hope this long clarification is of some help...

      Best regards

      -lem, but some call me fokat

        fokat++ ... but just to nitpick: mmap and certain mallocs (under certain conditions) will return memory to the system. Check out the Unix Programming FAQ, section 1.12.

        -derby

Re: Memory usage and perl
by pg (Canon) on Mar 14, 2003 at 18:43 UTC
    Although the allocated memory will not be freed back to the system before the program exits (unless something like mmap is used), it will be recycled within the same program.

    The real concern, in this context, is not the amount of memory your program uses, but whether that amount seems reasonable to you. If yes, fine; if not, there is a memory leak.

    How aggressive the memory recycling is also matters a great deal here. If it is not aggressive enough, the program will tend to allocate more memory than needed, because from time to time it will fail to notice that there is already-allocated but freed (or should-be-freed) memory available for reuse.
Re: Memory usage and perl
by pg (Canon) on Mar 14, 2003 at 19:06 UTC
    A more meaningful test is to put a big loop around your code and let it run, say, 1000 times. Monitor the memory usage; ideally you should not see any radical jump in memory usage from the second iteration onwards, because the memory allocated in the first iteration should be reused instead of allocating more.
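
    A minimal sketch of such a test, reusing the allocation pattern from the original script (watch the process from outside with gtop or top; the size should level off after the first pass if freed memory is being reused):

    #!/usr/bin/perl
    # Allocate and free the same large structure over and over; the
    # process size should stop growing after the first pass.
    use strict;
    use warnings;

    $| = 1;
    for my $pass (1 .. 1000) {
        my %hash;
        for my $i (0 .. 1999) {
            push @{ $hash{$i} }, 0 .. 999;
        }
        %hash = ();                # freed back to perl's pool, then reused
        print "\rpass $pass";
    }
    print "\ndone\n";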

Re: Memory usage and perl
by janx (Monk) on Mar 15, 2003 at 00:44 UTC
    A really cheesy way of limiting the damage would be (if your algorithm allows you to do that, of course):

    • Monitor the memory usage of your program and note the point where it gets critical. I once hit this problem with the 2 GB limit on allocatable memory. Stupid Linux ;-)
    • Then try to rewrite your program to exit with a certain exit code once the point (processed data rows or so) after which things get tricky has been reached.
    • Wrap your program in a little shell script which checks the exit code and calls your program again if applicable, of course with the right parameters to tell it to skip the already processed data (a rough sketch follows below).

    By doing so you limit the maximum memory usage of a single invocation of your script.
    Of course this only works if your algorithm/task at hand allows dividing the work up like this.
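
    Here is a rough sketch of such a wrapper, done in Perl rather than shell; the worker script name, the exit code 99 meaning "not finished yet", and the --skip option are made up for this illustration:

    #!/usr/bin/perl
    # Hypothetical driver: re-run worker.pl until it reports it is done.
    # Exit code 99 ("more data left, restart me") and the --skip=N option
    # are inventions for this sketch; adapt them to your own program.
    use strict;
    use warnings;

    my $skip = 0;
    while (1) {
        my $status = system('perl', 'worker.pl', "--skip=$skip");
        die "could not run worker.pl: $!\n" if $status == -1;

        my $code = $status >> 8;
        last if $code == 0;               # worker finished everything
        die "worker failed with exit code $code\n" unless $code == 99;

        $skip += 100_000;                 # assume each run handles 100_000 rows
    }
    print "all chunks processed\n";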

    This has been working pretty well for me. Of course, then you can always distribute the work over a load of machines, if you feel the urge. ;-)

    janx

Re: Memory usage and perl. (Cleaning a Package)
by gmpassos (Priest) on Mar 15, 2003 at 19:18 UTC
    Well, a good way to keep your memory clean is to use strict and declare every variable with my. Try to organize your code into subs too, since the variables used in each sub will be cleaned up automatically by Perl. In other words, try not to use global variables.

    For those who want something better, like cleaning everything inside a package, including the memory used to declare subs, try the code below, which I have adapted to make it self-contained for this node. The code comes from the module HPL::Safe, which runs a script inside a Safe compartment and cleans up all the memory after each run. Note that a package cleaned with this can't be reused, and you will still get some memory leaks!

    #!/usr/bin/perl

    clean_pack('main::FOO');

    ##############
    # CLEAN_PACK #
    ##############

    sub clean_pack {
      my ( $pack_name ) = @_ ;

      ## Clean the variables first, so that objects get DESTROYed:
      &undef_pack($pack_name, 1) ;

      ## Collect the sub-packages, then the package itself:
      my @packs = ( scan_packs($pack_name) , $pack_name ) ;

      ## %NO_CLEAN is a leftover from HPL::Safe; here it is empty, and since a
      ## reference to it is always true, the base globs are left in place.
      foreach my $packname ( reverse sort @packs ) {
        &undef_pack($packname, \%NO_CLEAN) ;
      }

      return( 1 ) ;
    }

    ##############
    # UNDEF_PACK #
    ##############

    sub undef_pack {
      my ( $packname , $keep_base ) = @_ ;

      $packname .= '::' unless $packname =~ /::$/ ;

      no strict "refs" ;
      my $package = *{$packname}{HASH} ;
      return unless defined $package ;

      ## Undefine every symbol except nested packages and special names:
      foreach my $symb ( keys %$package ) {
        if ( $symb !~ /::$/ && $symb !~ /^(?:\@|_|-|\d|\]|\^[VO]?)$/ ) {
          undef *{$packname . $symb} ;
        }
      }

      undef *{$packname} if !$keep_base ;
    }

    ##############
    # SCAN_PACKS #
    ##############

    sub scan_packs {
      my ( $package ) = @_ ;

      my %packs = %{ _symdump($package) } ;

      ## Collect the fully qualified names of all nested packages:
      my @result ;
      foreach my $pack ( keys %packs ) {
        push @result, keys %{ $packs{$pack}{PACKAGES} || {} } ;
      }

      return( @result ) ;
    }

    ############
    # _SYMDUMP #
    ############

    sub _symdump {
      my ( @packages ) = @_ ;
      my ( $key , $val , $pack , @todo ) ;
      my $result = {} ;

      foreach $pack ( @packages ) {
        no strict ;
        while ( ($key, $val) = each( %{*{"$pack\::"}} ) ) {
          local(*ENTRY) = $val ;

          #### PACKAGE ####
          if ( defined $val && defined *ENTRY{HASH} && $key =~ /::$/
               && $key ne "main::" && $key ne "<none>::" ) {
            my ($p) = $pack ne "main" ? "$pack\::" : "" ;
            ($p .= $key) =~ s/::$// ;
            $result->{$pack}{PACKAGES}{$p}++ ;
            push @todo, $p ;
          }
        }
      }

      return @todo ? { %$result , %{ _symdump(@todo) } } : $result ;
    }

    #######
    # END #
    #######
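
    A small, hypothetical usage example, appended after the subroutines above: populate a throwaway package FOO and then wipe it (the sub is installed at run time, so it exists only after the clean_pack() call at the top of the script has already run).

    # Hypothetical usage: FOO and its contents exist only for this demo.
    $FOO::data  = "something big";
    *FOO::hello = sub { print "hello from FOO\n" };

    print "before: ", ( defined &FOO::hello ? "FOO::hello exists" : "FOO::hello gone" ), "\n";
    clean_pack('main::FOO');
    print "after:  ", ( defined &FOO::hello ? "FOO::hello exists" : "FOO::hello gone" ), "\n";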

    Graciliano M. P.
    "The creativity is the expression of the liberty".

Re: Memory usage and perl
by Courage (Parson) on Mar 17, 2003 at 09:42 UTC
    Whether or not memory is returned to the OS depends on how your perl is compiled and is very platform-dependent.

    Perl on WinCE (and on Win32, AFAIK) returns it to the OS unless you compiled it with perl's own memory allocator.
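
    You can check which allocator your perl was built with via its build configuration, either with `perl -V:usemymalloc` on the command line or from code:

    # 'y' means perl was built with its own malloc, 'n' means it uses the
    # system's; how freed memory is handled differs between the two.
    use Config;
    print "usemymalloc = $Config{usemymalloc}\n";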

    Courage, the Cowardly Dog
