Re: Re: Checking Perl script on server

by waswas-fng (Curate)
on Jul 30, 2003 at 13:55 UTC


in reply to Re: Checking Perl script on server
in thread Checking Perl script on server

There are also mpstat (to view stats on a per-CPU basis) and iostat (to see I/O activity). You can see where the system is doing more work by looking at usr/sys/idle/wt: usr is userland stuff, sys is any kernel-related activity (semaphores, context switching, memory allocation, etc.), wt is I/O wait (physical I/O blocking), and idle is the amount of time the CPU is sitting idle. As Abigail pointed out, a process using 95 or 100% of a CPU is not always bad, but you may want to renice it if you have other processes on your server that usually use that CPU and are being hit by your Perl script running. Also take a moment to verify the size of your process once it is loaded into memory. Does it build huge arrays/hashtables? Are you forcing the machine to swap, and causing slowdown for the other services on the machine?
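
If you want a quick way to check that last point, something along these lines should work on most systems (just a sketch -- the ps column names are what Linux/Solaris style ps uses and may need tweaking on your OS):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # pass the PID of the script you are watching; defaults to this process
    my $pid = shift @ARGV || $$;

    # RSS = resident set size, VSZ = virtual size (both in KB on most systems)
    my ($rss, $vsz) = split ' ', `ps -o rss= -o vsz= -p $pid`;
    printf "pid %d: %d KB resident, %d KB virtual\n", $pid, $rss, $vsz;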

-Waswas

Replies are listed 'Best First'.
Re: Re: Re: Checking Perl script on server
by Anonymous Monk on Jul 30, 2003 at 15:02 UTC
    Thanks to both of you for the info. Yes, I am building huge arrays while finding and changing data recursively. Please advise whether this way uses a lot of memory:
    sub mySub {
        if ( $_ =~ /\.html/ ) {
            my $name = $File::Find::name;
            open( F, $name ) || warn "Can't open File $name: $!\n";
            while ( $line = <F> ) {
                for $hit ( $line =~ /matchdata/gi ) {
                    push @files, $name;
                }
            }
            close F;
        }
    }
    find( \&mySub, $dir );
    foreach (@files) {
        open( LDATA, "$_" ) || warn "File does not open: $!\n";
        @data = (<LDATA>);
        close(LDATA);
        open( LDATA, ">$_" ) || warn "File Write problem $_: $!\n";
        foreach (@data) {
            s/OLD/NEW/gi;
            print LDATA $_;
        }
        close(LDATA);
    }
      I can't speak directly to efficiency as it's hardly my specialty, but I would suggest a change to this chunk:
      while ($line = <F>) {
          for $hit ($line =~ /matchdata/gi) {
              push @files, $name;
          }
      }
      close F;
      Logically it seems like you want to check to see if the file you're currently examining has data that needs to be updated (you're saving the filename in @files for later processing) -- but you're pushing the filename onto your list once for each match in the file! Once you find a match in a particular file you should (based on my understanding of your goal) (1) push the filename onto your to-be-processed stack, and then (2) move on to the next file. So:
      while ($line = <F>) {
          if ($line =~ /matchdata/i) {
              push @files, $name;
              last;
          }
      }
      close F;
      Note that I dropped the global flag for the match operator, too -- you just want to know if there's something there, anywhere, to be fixed later, and then move on.

      Naturally, if I misunderstand your goals this could be way off base :)

      Couple of things:
      1. If /matchdata/ has any capturing (), then $name is pushed onto @files once for every capturing group in every match, not just once per match (there is a short example of this after the code below).
      2. When you do @data = (<LDATA>) you read the entire file into memory. A better way to do this would be:
      foreach (@files) {
          open(LDATA, "$_")      || warn "File does not open: $!\n";
          open(TMP,   ">$_.tmp") || warn "File Write problem $_.tmp: $!\n";
          while (<LDATA>) {
              s/OLD/NEW/gi;
              print TMP $_;
          }
          close(LDATA);
          close(TMP);
          rename("$_.tmp", "$_")
              or warn "Could not rename '$_.tmp' to '$_': $!\n";
      }
      Another side advantage of this is that if, for some reason, your script dies halfway through, you still have your old file rather than a half-written new file. hth.
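
      To illustrate point 1, here is a tiny sketch (the string and patterns are made up purely for the example) of how capturing parens change what a list-context //g match returns:

      my $line    = "foo foo";
      my @matches = ($line =~ /foo/g);     # ("foo", "foo") -- one element per match
      my @caps    = ($line =~ /(f)(oo)/g); # ("f", "oo", "f", "oo") -- one per capture, per match

      # so with capturing parens in /matchdata/, the push @files, $name in your
      # loop runs once per capture per match, not once per matching line.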

      $will->code for @food or $$;

        Another side effect of the whole-file slurp in your original code is that perl will allocate enough memory to hold the entire file, and because of the way most OSes handle releasing memory back to the system, your memory usage for that portion of code stays >= the size of the largest file you act on until your perl script stops executing. Depending on your OS, this may be an issue where the system does not realize which memory is actively being used and forces other applications to swap out if memory is constrained.
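
        If you want to see this for yourself, here is a rough sketch (pass it any large file you have handy; the ps invocation is an assumption and its output format may differ a bit between OSes):

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $big_file = shift @ARGV or die "usage: $0 <large file>\n";

        print "RSS before slurp: ", `ps -o rss= -p $$`;
        {
            open my $fh, '<', $big_file or die "Can't open $big_file: $!\n";
            my @data = <$fh>;    # the whole file is held in memory at once here
            close $fh;
        }                         # @data is released back to perl, but not to the OS
        print "RSS after slurp:  ", `ps -o rss= -p $$`;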

        -Waswas
