Poorly written script

by Baffled (Acolyte)
on Feb 11, 2008 at 04:09 UTC ( [id://667311] )

Baffled has asked for the wisdom of the Perl Monks concerning the following question:

I'm a newbie with scripting, and I learn fast (sorta), but I'm far from understanding enough to track down memory leaks; in fact I'm finding it hard to get much information on the subject at all, at least until I found your amazing site. I'm hoping that some of you Perl gurus will take a quick moment to look over this very short snippet of code and see if anything stands out as a memory leak. According to my hosting service:

"This script is periodically consuming all 2GB of your RAM. This causes the system to then use hard drive space as additional RAM, commonly known as SWAP space. This is then consumed as well. Once this occurs, there is no RAM left on the box to handle existing processes. The CPU tries to handle the load, overheating and then logging events to the system log before either crashing the system or shutting down to protect itself."

The following is the code in question:
#!/usr/bin/perl

use Carp;
use vars qw(%config %category %form %super);

require "/var/www/vhosts/mysite.com/cgi-bin/categories.cgi";

$config{'basepath'} = '/var/www/vhosts/mysite.com/cgi-bin/';
$config{'bluedir'}  = 'register';

my $key;
my $numusers = 1;

opendir THEDIR, "$config{'basepath'}$config{'bluedir'}";
my @allfiles = grep -T, map "$config{'basepath'}$config{'bluedir'}/$_", readdir THEDIR;
closedir THEDIR;
$numusers = $numusers + @allfiles;

my $totalfiles = 100;

foreach $key (sort keys %category) {
    opendir THEDIR, "$config{'basepath'}$key";
    my @allfiles = grep -T, map "$config{'basepath'}$key/$_", readdir THEDIR;
    closedir THEDIR;
    $totalfiles = $totalfiles + @allfiles;
}

####################################

print "Content-type: text/html\n\n";

#print "<center><font face=arial size=2><b>There are currently<font color=FF0000> $numusers</font> registered users and<font color=FF0000> $totalfiles</font><font color=000000> open auctions.</font></b></center>\n";

print <<"EOF";
<div align=center>
<center>
<table border=0 cellpadding=0 cellspacing=0 width=100\%>
  <tr>
    <td width=100\% valign=middle align=center height=15></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center height=20><font face=Arial size=2><b>There are currently:</b></font></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center height=15></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center><img src=/images/users.gif> &nbsp;&nbsp;<font face=Arial size=2><b><font color=#FF0000>$numusers</font> registered users.</b></font></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center height=0></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center><img src=/images/auci.gif> &nbsp;&nbsp;<font face=Arial size=2><b><font color=#FF0000>$totalfiles</font> open auctions.</b></font></td>
  </tr>
  <tr>
    <td width=100\% valign=middle align=center height=15></td>
  </tr>
</table>
</center>
</div>
EOF

1;

Replies are listed 'Best First'.
Re: Poorly written script
by Thelonius (Priest) on Feb 11, 2008 at 05:17 UTC
    It's hard to see how it could use quite so much memory, but there are a couple of obvious ways to reduce what it does use.

    First, you are building an array to hold all the names of the text files in each directory, when all you need is the count. So you should just count in a loop. I notice that the readdir documentation in perlfunc only shows an example of readdir in list context, but for this purpose, a loop is better. If there is some huge directory with 100,000 files, that could use a lot of memory, but that seems unlikely.

    Second, although this is probably a minor point, where you say foreach $key (sort keys %category), the sort is causing an array of all the keys to be built. Since this is unnecessary for this use, you can just leave the sort out.

    #!/usr/bin/perl

    use Carp;
    use strict;
    use vars qw(%config %category %form %super);

    require "/var/www/vhosts/mysite.com/cgi-bin/categories.cgi";

    $config{'basepath'} = '/var/www/vhosts/mysite.com/cgi-bin/';
    $config{'bluedir'}  = 'register';

    sub count_text_files {
        my ($dirname) = @_;
        my $result = 0;
        if (!opendir DIR, $dirname) {
            carp "Cannot open $dirname: $!\n";
            return 0;
        }
        while (defined(my $file = readdir DIR)) {
            if (-T "$dirname/$file") {
                ++$result;
            }
        }
        closedir DIR;    # closedir, not close, for a directory handle
        return $result;
    }

    my $key;
    my $numusers = 1;
    $numusers += count_text_files("$config{'basepath'}$config{'bluedir'}");

    my $totalfiles = 100;
    foreach $key (keys %category) {
        $totalfiles += count_text_files("$config{'basepath'}$key");
    }
      Thank you so much Thelonius, you're totally awesome. Your code worked perfectly. I wasn't sure if I was supposed to replace the $dirname with something or not; I tried it as-is first and it worked on the first shot. You really came to my rescue in a way I never thought of, and that problem is solved. I'm torn about asking another question, since I don't want to be greedy or a mooch, but being the novice that I am, is there any chance I could post another snippet of code for you to review? It's an upload.cgi that has also been deemed a major drain on memory, especially when more than one user hits it, like on Friday nights. What gets me is that I've been using this same script for several years since I went to the dedicated server without a hiccup, but I guess nowadays we have many more users and are much busier, and all that makes the system do much more processing.
        Well, just post it to Seekers. I'm going to bed soon, but somebody'll probably reply.
Re: Poorly written script
by andreas1234567 (Vicar) on Feb 11, 2008 at 07:26 UTC
Re: Poorly written script
by ysth (Canon) on Feb 11, 2008 at 04:34 UTC
    Is the grep -T needed? It has to open each and every user or auction file and read enough of each one to guess whether it is a text or binary file. If you can determine this from the filename alone, you would be much better off.
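
    For illustration only, here is a minimal sketch of that idea (not from the original thread): if the user and auction files can be recognized purely by name, say by a hypothetical .dat suffix, a directory can be counted without opening a single file.

        sub count_files_by_name {
            my ($dirname) = @_;
            my $count = 0;
            opendir my $dh, $dirname or return 0;
            while (defined(my $file = readdir $dh)) {
                # name test only; unlike -T, no file is ever opened
                ++$count if $file =~ /\.dat\z/;
            }
            closedir $dh;
            return $count;
        }

    Whether a name test is safe depends entirely on how the registration and auction files are created, which the original post does not show.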
Re: Poorly written script
by plobsing (Friar) on Feb 11, 2008 at 05:12 UTC
    Twice you create a list @allfiles only to get its length. You could get the same functionality by iterating through the directory instead of using the grep -T, map ... pipeline. This would take the space of one integer (for the running count) instead of an entire array.
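
    As a minimal sketch of that suggestion (essentially the inline form of what Thelonius wrapped in a sub), the first @allfiles block from the original post could become a running count:

        my $count = 0;
        opendir THEDIR, "$config{'basepath'}$config{'bluedir'}" or die "opendir: $!";
        while (defined(my $file = readdir THEDIR)) {
            # same -T text-file test as the original, but no list is built
            ++$count if -T "$config{'basepath'}$config{'bluedir'}/$file";
        }
        closedir THEDIR;
        $numusers += $count;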
