Re: Perl::Magick maxes my systems processor usage

by BrowserUk (Patriarch)
on May 08, 2003 at 19:58 UTC


in reply to Perl::Magick maxes my systems processor usage

You'll likely get some alternate opinions on this, especially from the SysAdmin types amongst us, but the fact that the algorithm used makes full use of your CPU should be seen as a good thing, not a bad one. Assuming the algorithm is optimal, if the code didn't utilise 100% of your CPU, that would indicate that there was a bottleneck somewhere, and the task would therefore take longer, in elapsed time, to complete.

That's a gross simplification of a complex issue, but to my way of thinking there is little point in having a powerful processor and then not utilising it to its fullest when the opportunity allows. Of course, if the processor is shared, and one process's utilisation is to the detriment of another, equally or more urgent task, it could be seen as a 'problem'; but that is what priorities are for.

If you need or desire to lower the usage of that one task to favour others, you might look at using Win32::Process and the SetPriorityClass() call to achieve it. You should be able to use Win32::Process::Open() to obtain an OS-native handle to the current task from its pid, and adjust the priority from within the same script, although I haven't actually tried this.
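Untested, but a minimal sketch of what I have in mind might look like this ($$ is the current script's pid; IDLE_PRIORITY_CLASS is one of the priority constants Win32::Process exports by default):

    use strict;
    use warnings;
    use Win32;
    use Win32::Process;

    # Obtain a process object for the current script via its own pid ($$),
    # then drop its priority so other tasks get first call on the CPU.
    Win32::Process::Open( my $proc, $$, 0 )
        or die 'Open failed: ' . Win32::FormatMessage( Win32::GetLastError() );

    $proc->SetPriorityClass( IDLE_PRIORITY_CLASS )
        or die 'SetPriorityClass failed';

On Windows 2000 and later, BELOW_NORMAL_PRIORITY_CLASS is a gentler alternative to IDLE_PRIORITY_CLASS.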


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller

Replies are listed 'Best First'.
Re: Re: Perl::Magick maxes my systems processor usage
by MrCromeDome (Deacon) on May 08, 2003 at 20:22 UTC
    This concerned me because this is such a trivial task: concatenating a few smallish images into a slightly larger one. And even for small images, it is taking several seconds (5-7) per image. Using PowerBuilder and another imaging toolkit I can do this much quicker, but it leaks memory like a sieve. I was hoping that with Perl I could avoid the memory leak. Of course, now I'm sucking up processor ;) I have to share this server with others, and my using too much processor is starving everyone else.

    I considered using Imager or ImageMagick, but Imager chokes on multi-page TIFFs. ImageMagick seemed to work well in a brief test I ran, and what I read in the docs encouraged me:

    Note that the QuantumDepth=8 version (Q8) consumes half the memory and about 30% less CPU than the QuantumDepth=16 version (Q16), but provides less color resolution. A Q8 version is fine for processing typical photos. If you are dealing with scientific or medical images or deal with images that have limited contrast, then the Q16 version is recommended. It is also possible to build a Q32 version which has enough resolution to deal with the latest reconnaissance images. Please let us know if there is any demand for the Q32 versions.

    Given the nature of the task at hand, I thought the Q8 version would work well. It works, but not well :P
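    For what it's worth, the concatenation step itself boils down to something like this (the filenames here are placeholders for the real multi-page TIFFs):

        use strict;
        use warnings;
        use Image::Magick;

        my $img = Image::Magick->new;

        # Read() pulls every page of a multi-page TIFF into the sequence.
        my $err = $img->Read( 'page1.tif', 'page2.tif' );
        warn "$err" if "$err";

        # Append() joins the whole sequence into a single image; stack => 'true'
        # stacks top-to-bottom, stack => 'false' lays the images out left-to-right.
        my $combined = $img->Append( stack => 'true' );

        $err = $combined->Write( 'combined.tif' );
        warn "$err" if "$err";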

    ++ for the excellent suggestion - I will have to look into that. I hope it works as well as it sounds :)

    Thanks a bunch!
    MrCromeDome

      I can't really comment, as I gave up trying to get Image::Magick going on my machine after 3 attempts. I rarely have the need to automate image stuff, and have what I consider to be superior tools available for doing this sort of thing manually.

      (If you've never seen the performance and tightness of the Xara graphics tools, particularly Xara X, take a look; they have free demos available for download. Note: just a very satisfied user, no commercial or personal relationship with the company.)

      Anyway, my speculation would be that at some point the image matrices in Image::Magick are stored and manipulated using Perl arrays. Possibly they are expanded on demand rather than preallocated, and it is the repeated, incremental increase in the size of these arrays that is responsible for the memory consumption (TIFFs can be very memory hungry), and consequently the CPU consumption.

      Manipulating large but relatively static chunks of binary image data is one of the few applications where Perl's arrays aren't really of benefit.
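      If that speculation holds, presizing the arrays would help. A tiny, purely illustrative benchmark of the difference (this says nothing about Image::Magick's actual internals, which I haven't read):

          use strict;
          use warnings;
          use Benchmark qw(timethese);

          timethese( 50, {
              # grow the array one push at a time
              grow => sub {
                  my @a;
                  push @a, 0 for 1 .. 100_000;
              },
              # presize the array first, then fill the slots
              prealloc => sub {
                  my @a;
                  $#a = 99_999;
                  $a[$_] = 0 for 0 .. 99_999;
              },
          } );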


      Examine what is said, not who speaks.
      "Efficiency is intelligent laziness." -David Dunham
      "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller
      This concerned me because this is such a trivial task: concatenate a few smallish images into a slightly larger one. And even for small images, it is taking several seconds (5-7) per image.

      I'd say it's about time to get rid of that 80486 if that disturbs you so much. :)

      Leonid Mamtchenkov aka TVSET
