in reply to Memory management with long running scripts
First, Perl 5.8.8 is a good version to be stuck with. I have had several production systems running 5.8.8 for years without difficulty.
Since you say that the "long running perl scripts (daemons)" run for at least days, why not add code to the script to restart itself after 24 hours have elapsed? That immediately takes the pressure off you.
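A minimal sketch of the "restart yourself after 24 hours" idea, assuming a simple single-process daemon loop. `should_restart` and `do_one_unit_of_work` are hypothetical names, not from the original node; the re-exec trick (`exec $^X, $0, @ARGV`) is one common way to get a fresh process image:

```perl
use strict;
use warnings;

# Hypothetical helper: true once the process has been alive
# for at least $max_life seconds.
sub should_restart {
    my ( $started, $max_life ) = @_;
    return ( time() - $started ) >= $max_life;
}

my $started  = time();
my $max_life = 24 * 60 * 60;    # restart once a day

# In the daemon's main loop you would do something like:
#   while (1) {
#       do_one_unit_of_work();    # hypothetical: your daemon's real work
#       if ( should_restart( $started, $max_life ) ) {
#           # Replace ourselves with a fresh copy: new process, fresh memory.
#           exec( $^X, $0, @ARGV ) or die "exec failed: $!";
#       }
#   }
```

The `exec` replaces the running interpreter in place, so any memory the old process leaked is returned to the OS without the daemon's pid file or init script ever seeing a gap.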
Another approach is to restart at a specific time (like 3:22AM). Pick a time when you have the least usage. We use crontab to run this at 3:22 each day:

    touch "/var/RestartPerlDaemons"

and this at 3:32 each day:

    rm "/var/RestartPerlDaemons"

During the 10 minutes between, we do some cleanup, but if you don't need to do that then just do the remove at 3:23. Obviously the Perl scripts have to check for the existence of the file and shut down, and then not restart until the file is removed. Don't check on every cycle; use time to check every 10 seconds or so (saves on stats).
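The flag-file check described above can be sketched like this, assuming the daemon's main loop runs many times per second. `time_to_stop` and `do_work` are hypothetical names; the flag path and the 10-second throttle come from the text:

```perl
use strict;
use warnings;

my $flag       = '/var/RestartPerlDaemons';
my $last_check = 0;

# True when the flag file exists; stats the filesystem at most
# once every 10 seconds, no matter how often it is called.
sub time_to_stop {
    my $now = time();
    return 0 if $now - $last_check < 10;    # throttle: saves on stats
    $last_check = $now;
    return -e $flag;                        # stop once cron touches the flag
}

# Main loop sketch:
#   while (1) {
#       do_work();                # hypothetical: one unit of daemon work
#       last if time_to_stop();   # shut down cleanly; restart after cron
#   }                             # removes the flag at 3:32
```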
Memory leaks are among the most difficult problems to isolate. Others have given some good ideas to find the leaks, but you sound very frustrated by the situation.
In a *nix forked environment, a common mitigation is to have the children exit after some specified time, with the parent forking a new child to replace each one. To give you some idea of the variation: on AIX our children exit after 8 hours; on some Linux systems it ranges from 2 to 12 hours. But restarting a clean child takes only seconds.
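A sketch of that fork-and-respawn pattern, assuming a single worker child. `spawn_child` and `child_main` are hypothetical names; the 8-hour lifetime matches the AIX figure mentioned above:

```perl
use strict;
use warnings;

# Fork a child that works for at most $life seconds, then exits cleanly.
# Returns the child's pid to the parent.
sub spawn_child {
    my ($life) = @_;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {                        # child
        my $born = time();
        while ( time() - $born < $life ) {
            # child_main();                   # hypothetical unit of work
            sleep 1;
        }
        exit 0;                               # exit; leaked memory goes back to the OS
    }
    return $pid;                              # parent
}

# Parent supervisor sketch:
#   my $pid = spawn_child( 8 * 60 * 60 );     # e.g. 8 hours, as on AIX
#   while (1) {
#       my $dead = waitpid( -1, 0 );          # block until the child exits
#       $pid = spawn_child( 8 * 60 * 60 ) if $dead == $pid;
#   }
```

The parent holds almost no state, so only the short-lived children accumulate any leaks.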
Perl depends on the system libraries, and if they leak, Perl is going to leak. Since you can't change your system, you need to minimize the problem's impact on you.
Good Luck!
"Well done is better than well said." - Benjamin Franklin
Replies are listed 'Best First'.
Re^2: Memory management with long running scripts
by jamesrleu (Novice) on Aug 08, 2012 at 13:28 UTC
by flexvault (Monsignor) on Aug 08, 2012 at 15:20 UTC