PerlMonks  

Perl::Magick maxes my system's processor usage

by MrCromeDome (Deacon)
on May 08, 2003 at 18:44 UTC ( [id://256634] )

MrCromeDome has asked for the wisdom of the Perl Monks concerning the following question:

I've posted this same question to an Image::Magick forum, but figured if someone here had some insight, I'd appreciate it that much more.

I'm writing an image conversion script in Perl. It takes single-page Group 3 TIFF images and writes them into a multi-page Group 4 TIFF file. Along the way, it creates new indexing information in our database so we can find the new image (as opposed to the old ones). The problem is that when Perl attempts to write the image to disk, my processor utilization hits 100%. When writing is complete, CPU usage drops and processing picks up until the next image write.

I have extracted enough of my code to illustrate my point:
# Create a new image handler
my $img = Image::Magick->new;

# Read database, set some values here

# Read the image
$success = $sth_multi_select->execute($image_id);
print LOG "Can't get page information for document $document: ",
    $dbh_source->errstr, "\n" unless defined $success;
next unless $success;

$sth_multi_select->bind_columns(
    \$page_number, \$old_filename, \$old_directory, \$old_volume);

while ($sth_multi_select->fetch()) {
    my $path = "$volume_list{$old_volume}\\"
             . ($old_directory eq "" ? "" : "$old_directory\\")
             . "$old_filename";
    $success = $img->read($path);
    print LOG "Can't read document $document: ", $success, "\n" if $success;
}

# Write index entries to database here

# Write out the new image - GETS SLOW HERE!
my $path = "$dest_path$filename";
print "Before\n";
$success = $img->write($path);
print "After\n";
print LOG "Can't write document $document: ", $success, "\n" if $success;

undef $img;
As my program executes, it appears to hang for a REALLY long time after printing "Before". I check the task manager at this point and see my processor usage at 100%. As soon as "After" prints, I check the task manager again and my CPU usage is back to an acceptable level. No errors are triggered, and the image I expect is created (with the proper number of pages). This behavior occurs even for one- and two-page documents, where the resulting image is only about 30-50k or so.

I am using the latest ActiveState Perl 5.8.0 and ImageMagick-5.5.6-Q8-windows (Q8 boasts lower memory and CPU requirements than Q16) on Windows NT Server 4.0 SP 6.

Any insight is appreciated. Thank you!
MrCromeDome

Replies are listed 'Best First'.
Re: Perl::Magick maxes my system's processor usage
by BrowserUk (Patriarch) on May 08, 2003 at 19:58 UTC

    You'll likely get some alternate opinions on this, especially from the SysAdmin types amongst us, but the fact that the algorithm used makes full use of your CPU should be seen as a good thing, not a bad one. Assuming the algorithm is optimal, if the code didn't utilise 100% of your CPU, that would indicate that there was a bottleneck somewhere, and it would therefore take longer, in elapsed time, to complete.

    That's a gross simplification of a complex issue, but to my way of thinking there is little point in having a powerful processor and then not utilising it to its fullest when the opportunity allows. Of course, if the processor is shared, and the utilisation by one process is to the detriment of another more or equally urgent task, it could be seen as a 'problem', but that is what priorities are for.

    If you need or desire to lower the usage of that one task, to favour others, you might look at using Win32::Process and the SetPriorityClass() call to achieve it. You should be able to use Win32::Process::Open() to obtain an OS-native handle to the current task from its pid, and adjust the priority from within the same script, although I haven't actually tried this.
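    Something along these lines might do it (an untested sketch; it assumes $$ holds the native process id under ActivePerl, and IDLE_PRIORITY_CLASS is one of the constants Win32::Process exports):

        use Win32::Process;

        # Open a handle to the currently running script and drop its
        # priority so other work on the shared box isn't starved.
        my $proc;
        Win32::Process::Open($proc, $$, 0)
            or die "Couldn't open own process\n";
        $proc->SetPriorityClass(IDLE_PRIORITY_CLASS)
            or warn "Couldn't lower priority class\n";

    The script will still soak up every spare cycle, but the scheduler will hand those cycles to anything else that wants them first.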


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller
      This concerned me because this is such a trivial task: concatenate a few smallish images into a slightly larger one. And even for small images, it is taking several seconds (5-7) per image. Using PowerBuilder and another imaging toolkit, I can do this much quicker, but it leaks memory like a sieve. I was hoping with Perl I could avoid the memory leak. Of course, now I'm sucking up processor ;) I have to share this server with others, and my using too much processor is starving everyone else.

      I considered using Imager or ImageMagick, but Imager chokes on multi-page TIFFs. ImageMagick seemed to work well in a brief test I ran, and what I read in the docs encouraged me:

      Note that the QuantumDepth=8 version (Q8) consumes half the memory and about 30% less CPU than the QuantumDepth=16 version (Q16), but provides less color resolution. A Q8 version is fine for processing typical photos. If you are dealing with scientific or medical images or deal with images that have limited contrast, then the Q16 version is recommended. It is also possible to build a Q32 version which has enough resolution to deal with the latest reconnaissance images. Please let us know if there is any demand for the Q32 versions.

      Given the nature of the task at hand, I thought the Q8 version would work well. It works, but not well :P

      ++ for the excellent suggestion - I will have to look into that. I hope it works as well as it sounds :)

      Thanks a bunch!
      MrCromeDome

        I can't really comment, as I gave up trying to get Image::Magick going on my machine after 3 attempts. I rarely have the need to automate image stuff, and have what I consider to be superior tools available for doing this sort of thing manually.

        (If you've never seen the performance and tightness of the Xara graphics tools, particularly Xara X, take a look; they have free demos available for download. Note: just a very satisfied user. No commercial or personal relationships with the company.)

        Anyway, my speculation would be that at some point the image matrices in Image::Magick are stored and manipulated using Perl arrays. Possibly they are expanded on demand rather than preallocated, and it is the repeated, incremental growth of these arrays that is responsible for the memory consumption (TIFFs can be very memory hungry), and consequently the CPU consumption.

        Manipulating large but relatively static chunks of binary image data is one of the few applications where Perl's arrays aren't really of benefit.


        Examine what is said, not who speaks.
        "Efficiency is intelligent laziness." -David Dunham
        "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller
        This concerned me because this is such a trivial task: concatenate a few smallish images into a slightly larger one. And even for small images, it is taking several seconds (5-7) per image.

        I'd say it's about time to get rid of that 80486 if that disturbs you so much. :)

        Leonid Mamtchenkov aka TVSET

Re: Perl::Magick maxes my system's processor usage
by MrCromeDome (Deacon) on Jul 03, 2003 at 17:28 UTC
    Just checking through my nodes. Found an answer to this some time ago, and thought I would post a response for anyone who stumbles into the same situation:

    The bottleneck would appear to be the TIFF image compression. If you disable compression in the following manner:

    $success = $img->write(filename=>$path, compression=>'None');
    the program screams. Of course, as the purpose of my script is to save space, this isn't optimal ;)
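    If you want to keep the space savings and just make the trade-off explicit, the compression type can also be requested by name. A hedged sketch (Group4 is one of the compression values the Image::Magick docs list, but I haven't timed it separately from the default):

        # Explicitly request CCITT Group 4 compression; swap in 'None'
        # to trade disk space for write speed.
        $success = $img->write(
            filename    => $path,
            compression => 'Group4',
        );
        print LOG "Can't write document $document: ", $success, "\n" if $success;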

    The compression functionality may work faster on a *nix platform, which is unfortunately not an option for what I'm doing here. As such, I have done no testing to either prove or disprove this, so you may wish to take this with a grain (if not a pound) of salt if attempting this on *nix.

    Good luck!
    MrCromeDome
