bofh_of_oz has asked for the wisdom of the Perl Monks concerning the following question:

Here is a project I'm working on and the mess it got me into:

The Idea:
I need to create a script that monitors performance on Windows and Linux servers and logs the data to a MySQL database for later reporting. (I will convert it to a .exe with perl2exe to avoid having to install Perl on all the boxes.)

My Thoughts:
Since disk utilization does not change very often, I would check it once a day and send the data to the DB - no problem here...

Memory monitoring - once every minute seems reasonable, and sending one short update query to MySQL every minute should not be that bad...

CPU - I guess I need to get usage data at least every second or even more often...

The functionality for performance monitoring is either present in currently available modules or can be obtained via system calls. Net::MySQL is good enough as a DB interface.
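On the Linux side, for example, the raw numbers can be read straight out of /proc. A rough sketch (the MemTotal/MemFree field names assume a typical /proc/meminfo layout):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch: read memory and load figures from /proc on Linux.
    sub read_meminfo {
        my %mem;
        open my $fh, '<', '/proc/meminfo' or die "meminfo: $!";
        while (<$fh>) {
            $mem{$1} = $2 if /^(\w+):\s+(\d+)\s+kB/;
        }
        close $fh;
        return \%mem;
    }

    sub read_loadavg {
        open my $fh, '<', '/proc/loadavg' or die "loadavg: $!";
        my ($one, $five, $fifteen) = split ' ', scalar <$fh>;
        close $fh;
        return ($one, $five, $fifteen);
    }

    my $mem = read_meminfo();
    my ($load1) = read_loadavg();
    printf "free: %d kB of %d kB, 1-min load: %s\n",
        $mem->{MemFree}, $mem->{MemTotal}, $load1;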

The Problems:
1. For memory and especially CPU monitoring, I obviously need to have the script/program running all the time. What I couldn't find out is how well a Perl script performs as a memory-resident program. Any memory leaks, garbage collection problems etc.? I tried to look at writing a multithreaded program, but the documentation warned that the multi-threading implementation changes frequently and could become obsolete...

2. How can I implement timers in the script? I mean, not count-down ones, but rather something like a scheduler, so that I can run all three monitoring jobs (CPU, memory, HDD) from one script at different intervals.
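Something like the loop below is what I'm picturing - a sketch only, with check_cpu/check_mem/check_disk standing in for the real collection code:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Single-process scheduler sketch: each job carries its own interval
    # and the timestamp of its last run.
    my %jobs = (
        cpu  => { interval => 1,     last => 0, code => \&check_cpu  },
        mem  => { interval => 60,    last => 0, code => \&check_mem  },
        disk => { interval => 86400, last => 0, code => \&check_disk },
    );

    while (1) {
        my $now = time;
        for my $job (values %jobs) {
            if ($now - $job->{last} >= $job->{interval}) {
                $job->{code}->();
                $job->{last} = $now;
            }
        }
        sleep 1;    # coarse one-second tick; Time::HiRes gives finer resolution
    }

    sub check_cpu  { }    # placeholder
    sub check_mem  { }    # placeholder
    sub check_disk { }    # placeholder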

3. When I monitor CPU performance or other frequently changing data, what would be the best way to send that info to the SQL database? Executing an update query every second doesn't strike me as very efficient, so I'm thinking of accumulating the SQL queries for a while and executing a batch update, say, once a minute or even less frequently. But that brings me back to the first question, in the sense that I don't know how well Perl frees the memory used to store the queued queries after I've executed them.
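Roughly this kind of batching is what I mean - a sketch using DBI/DBD::mysql for illustration (the same idea should apply with Net::MySQL), with an assumed table layout perf(host, metric, value, taken_at):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Batching sketch: buffer samples in memory and flush them as one
    # multi-row INSERT once a minute, then empty the buffer.
    my $dbh = DBI->connect('dbi:mysql:database=perfdb;host=dbserver',
                           'user', 'password', { RaiseError => 1 });

    my @buffer;
    my $last_flush = time;

    sub record {
        my ($metric, $value) = @_;
        push @buffer, [ 'myhost', $metric, $value, time ];
        flush() if time - $last_flush >= 60;
    }

    sub flush {
        return unless @buffer;
        my $rows = join ',', ('(?,?,?,FROM_UNIXTIME(?))') x @buffer;
        $dbh->do("INSERT INTO perf (host, metric, value, taken_at) VALUES $rows",
                 undef, map { @$_ } @buffer);
        @buffer     = ();      # releases the queued rows
        $last_flush = time;
    }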

Any thoughts and pointers are appreciated.
Many thanks,
Eugene


Replies are listed 'Best First'.
Re: Logging system performance into a DB from Perl TSR: a few questions pulled together
by NetWallah (Canon) on Apr 20, 2005 at 16:08 UTC
    There are several pre-invented wheels you could use for these purposes - I suggest you consider those before building your own:

    • Windows performance monitor already has a mechanism for periodically gathering performance info.
    • MRTG can graph the info - and there are contributed mechanisms to import Win PerfMon data
    • The underlying collection mechanism is SNMP, which can get data from both Linux and Windows (see the query sketch below)
    On Windows, you will need to add the SNMP service (Add/Remove Programs -> Windows Components)
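    If you go the SNMP route, querying it from Perl is straightforward with Net::SNMP. A rough sketch - the hostname and community string are placeholders, and the OID shown is the Host Resources hrProcessorLoad table:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Net::SNMP;

        # Sketch: walk the hrProcessorLoad table (per-CPU load, HOST-RESOURCES-MIB).
        my ($session, $error) = Net::SNMP->session(
            -hostname  => 'server.example.com',    # placeholder
            -community => 'public',                # placeholder
            -version   => 'snmpv2c',
        );
        die "SNMP session error: $error\n" unless defined $session;

        my $hrProcessorLoad = '1.3.6.1.2.1.25.3.3.1.2';
        my $result = $session->get_table(-baseoid => $hrProcessorLoad);
        die "SNMP query error: " . $session->error . "\n" unless defined $result;

        print "$_ = $result->{$_}%\n" for sort keys %$result;
        $session->close();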

         "There are only two truly infinite things. The universe and stupidity, and I'm not too sure about the universe"- Albert Einstein

Re: Logging system performance into a DB from Perl TSR: a few questions pulled together
by tcf03 (Deacon) on Apr 20, 2005 at 15:42 UTC
    For memory stuff look at a vmstat threshold script. For the CPU stuff I'd use sar. Also look at NMON (it's for AIX, not Linux or Windows, but conceptually it does the trick), or perhaps something using RRDtool and MRTG. There are plenty of tools out there - why reinvent the wheel?
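    For instance, grabbing the CPU idle column out of vmstat is just a pipe-open away - a rough sketch, where the column position assumes a Linux procps vmstat layout:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Sketch: take the second sample from "vmstat 1 2" (the first line of
        # numbers is the average since boot) and pull the "id" (idle) column.
        open my $vm, '-|', 'vmstat', '1', '2' or die "vmstat: $!";
        my @lines = <$vm>;
        close $vm;

        my @fields = split ' ', $lines[-1];
        my $idle   = $fields[14];    # "id" column in the procps layout
        printf "CPU busy: %d%%\n", 100 - $idle;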

    Good luck

    Ted

    These are just ideas, don't crucify them.