r.joseph has asked for the wisdom of the Perl Monks concerning the following question:

Is there any danger in having a Perl script running in the background all the time in an infinite loop, as long as you know that there will be no memory leaks or anything?

Replies are listed 'Best First'.
Re: Infinite perl script
by Fastolfe (Vicar) on Jan 05, 2001 at 08:56 UTC
    God I hope not, since I'm doing that an awful lot.

    Seriously, I can't imagine why this would be a problem. What would make you uneasy about it? Just don't make your script run in a tight infinite loop, working all the time and chewing up CPU 24/7. I have stuff running 24/7, but it's either sleeping or waiting on select (or something similar) for the vast majority of that time.
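
    (For illustration only, a minimal sketch of the "waiting on select" idea, here just watching STDIN via IO::Select with a made-up 60-second timeout, so the process uses essentially no CPU while idle.)

    #!/usr/bin/perl
    # Hypothetical sketch: block in select (via IO::Select) instead of spinning.
    use strict;
    use warnings;
    use IO::Select;

    my $sel = IO::Select->new(\*STDIN);

    while (1) {
        my @ready = $sel->can_read(60);   # blocks up to 60 seconds, no busy-waiting
        if (@ready) {
            my $line = <STDIN>;
            last unless defined $line;    # EOF: shut down
            print "got: $line";
        }
        else {
            # timed out: do any periodic housekeeping here
        }
    }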

      Right, sounds good... how long do you have them sleep for, usually? Because I just have a while(1) { } loop that waits for a certain file to be deleted, and when it is, it does something. That shouldn't chew up too much CPU time, right?
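
      (To make that concrete, a minimal sketch of such a polling loop; the file name and the 10-second interval are made up, not from the original post.)

      #!/usr/bin/perl
      # Hypothetical sketch: poll for a file's disappearance, sleeping between checks.
      use strict;
      use warnings;

      my $trigger = '/tmp/still-running.flag';   # made-up path

      while (1) {
          unless (-e $trigger) {    # file is gone: do the work
              do_the_thing();
              last;
          }
          sleep 10;                 # don't spin the CPU between checks
      }

      sub do_the_thing { print "file was deleted, doing the work now\n" }
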
        Just guessing, but you may want to rethink a bit: how quickly do you need to respond to the file being deleted? Cron (if you've got it) might be a better way if 15 or 30 minutes can go by. Can you get the deleting process to fire off your script? A non-stop script isn't really a problem, but, er, "elegance" (or something) makes an "is this file gone yet?" every-10-seconds script seem a bit ugly. Unless your response time really needs to be within a 'sleep xx' range, get somebody else to do the work. IMHO
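
        (For example, if a 15-minute response time is acceptable, a crontab entry along these lines would do the check with no resident process at all; the script path is made up.)

        # minute hour dom month dow  command -- check four times an hour instead of polling
        0,15,30,45 * * * * /home/rjoseph/check_file.pl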


        Try running perl -e 'while(1) { 1 }' sometime and see what that does to your CPU. :-)

        As for how long to sleep, it depends on what you're doing: sometimes sleep(1) is enough; other times something a little (or a lot) longer is preferable.

        Definitely something for you to experiment with, but the question you need to ask is how long can you afford to postpone another iteration of the loop?

        Also, if you're feeling particularly nutty, you can use...

        while ( sleep(10) ) { ... }

        ... instead of the more traditional (?) ...

        while ( 1 ) { ...; sleep(10); }

        Although the former ends up behaving as if sleep(10) were the first thing called inside the loop.
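
        (A throwaway demonstration of that ordering, if you want to see it: the body only runs after the first sleep. The 2-second interval and the iteration cap are made up.)

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $i = 0;
        while ( sleep(2) ) {                 # sleep returns ~2 (true), and only then does the body run
            print scalar(localtime), " - iteration ", ++$i, "\n";
            last if $i >= 3;                 # stop the demo after a few passes
        }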

            --k.


        I hope you are not trusting the file-system to be atomic, because it is not. If you are using lock-files, you should look at flock. I have a reasonably straightforward example of how to use it at Simple Locking.
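
        (For reference, a bare-bones flock sketch; the lock-file path is made up, and the node linked above has a fuller example.)

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(:flock);

        open(my $lock, '>', '/tmp/myscript.lock') or die "open lock file: $!";
        flock($lock, LOCK_EX) or die "flock: $!";    # wait for an exclusive lock

        # ... work that must not run in two processes at once ...

        flock($lock, LOCK_UN);                       # also released automatically on close/exit
        close($lock);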

        It can. Remember you're checking if that 1 is still 1 on every loop. How long does a file usually stick around before being deleted?

        Update: And of course the file test itself costs something on every pass. Depending on how your disk caching behaves, your results may vary.

        My first big project had files being written across Samba mounts and then -e tests as well. CPU time wasn't much of a factor, but the network could get clogged.
        Getting the various Perl programs talking directly to each other was much faster. So my experience might have nothing to do with your situation.
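
        (If that route interests you, here is a minimal sketch of two Perl programs talking over a Unix-domain socket instead of polling a file; the socket path and the message are made up.)

        #!/usr/bin/perl
        # Listener: blocks in accept() instead of polling the file-system.
        use strict;
        use warnings;
        use Socket qw(SOCK_STREAM);
        use IO::Socket::UNIX;

        my $path = '/tmp/notify.sock';
        unlink $path;
        my $server = IO::Socket::UNIX->new(
            Type   => SOCK_STREAM,
            Local  => $path,
            Listen => 1,
        ) or die "listen: $!";

        my $client = $server->accept;        # sits here using no CPU until contacted
        my $msg    = <$client>;
        print "notified: $msg";

        # Notifier (the process that used to delete the file) would do something like:
        #   my $sock = IO::Socket::UNIX->new(Type => SOCK_STREAM, Peer => '/tmp/notify.sock')
        #       or die "connect: $!";
        #   print $sock "done\n";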

        How long to sleep depends on the granularity of what you need. If the file gets deleted and you need to do something immediately, then a small sleep. If you just generally need to know, perhaps a sleep 300 or sleep 600 (5 or 10 minutes) would be appropriate. If you need to do something by the next day, sleep for an hour or more; who cares.

        However, if your delay is big, a better answer might be to use a scheduler to run your check program instead of idling in the background. On a *nix box use cron; I'm not sure what the equivalent is on a Win32 box. This isn't as worthwhile with a small delay because of the overhead of loading Perl each time.

        While this may not use much CPU, since you are doing I/O it is also using disk resources. I'd set the delay as long as you can stand.

        =Blue
        ...you might be eaten by a grue...

        Howdy -

        This is my first attempt at an answer to a question, so go easy on me if I screw it up, folks :P

        I have a chunk of code that runs a df command on a specified directory and will loop every x seconds, as you choose. You can find it here for your reference (the part you need is in the second script, ds.pl).

        For quickness, the code I use to loop it is:

        until (! $opt{time}) {              # loop every specified number of seconds
            check_disk_space($opt{dir});    # run my df subroutine on the specified directory
            sleep($opt{time});              # wait the specified time, then begin again
        }
        What this basically does is use the Getopt::Long module to accept and process command-line parameters entered by the user, and then use the until statement to run the loop according to the parameters entered.
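
        (A minimal, self-contained sketch of that option handling, assuming the -dir and -time options described above; the df call is just a stand-in for the real subroutine in ds.pl.)

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Getopt::Long;

        my %opt;
        GetOptions(\%opt, 'dir=s', 'time=i')
            or die "usage: $0 -dir=/some/path -time=seconds\n";

        until (! $opt{time}) {               # loop as long as an interval was given
            check_disk_space($opt{dir});     # check free space on the requested directory
            sleep $opt{time};                # wait the specified time, then go again
        }

        sub check_disk_space {
            my ($dir) = @_;
            system('df', '-k', $dir);        # stand-in for the real df subroutine
        }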

        So what you could do in theory is put a line in your cron to start it (rjoseph.pl -dir=/usr/bin/perl -time=120), which would then run the check every two minutes on the /usr/bin/perl directory.

        As I said, this is my first *serious* answer, so if I have rambled, I apologise; but I sincerely hope this helps, if only on the sleep thing. I hope you like Getopt, though, 'cause it's been totally handy for me.

        Cheers, jim
        if ($mr_leisure) { bow; }
Re: Infinite perl script
by jeroenes (Priest) on Jan 05, 2001 at 16:06 UTC