PerlMonks  

Running multiple jobs with dependencies

by Anonymous Monk
on Apr 24, 2004 at 13:45 UTC

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm trying to transcribe a make script into Perl. The script is very simple...

1) run a job
2) if the job completes properly, it will do a "touch file.name"
3) if the "file.name" exists, execute the next job

I think NFS is killing me, though--the file may get created, but the Perl program checks for it so quickly that it doesn't see "file.name" in time and exits.

Here is what's not working:

    for ($c = 1; $c < 17; $c++) {
        if (fork() == 0) {
            exec("CAD program with $c options...details not important");
            exit;
        }
        wait;
        system("sleep 20");    # so NFS gets updated...?
        if (-e "cmdos/$n[$c].done") {
            $complete = 1;     # in other words: do nothing
        }
        else {
            die("Dependency file $n[$c].done did not get created--exiting!\n");
        }
    }
The dependency file *did* get created, but it took a few milliseconds to show up. How can I prevent the abnormal termination? I don't think "sleep" is working the way I expect.
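One way to avoid a single fixed sleep is to poll for the dependency file with a bounded retry loop; here is a minimal sketch (the subroutine name, file name, and timeout are illustrative, not from the thread):

```perl
# A minimal sketch: poll for the dependency file with a bounded retry
# loop instead of one fixed 20-second sleep.  Names are illustrative.
sub wait_for_file {
    my ($file, $timeout) = @_;
    for (1 .. $timeout) {
        return 1 if -e $file;    # file showed up
        sleep 1;                 # Perl's built-in sleep, one second at a time
    }
    return -e $file ? 1 : 0;     # one last check after the final sleep
}
```

This returns as soon as the file appears instead of always paying the full delay, and only dies (or returns 0) when the timeout genuinely expires.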

Thank You,
-Fiddler42

Replies are listed 'Best First'.
Re: Running multiple jobs with dependencies
by waswas-fng (Curate) on Apr 24, 2004 at 14:14 UTC
    If "cmdos/$n[$c].done" gets created by the exec above, and it is created within 20 seconds, and you are waiting for it anyway, why not forget fork() and exec() and just use system()? The added advantage is that you can tell the exit status of the CAD program.
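    A sketch of that approach (the subroutine is illustrative, and the commands passed in stand in for the real CAD invocations):

```perl
# Run each job synchronously with system() and check its exit status in $?;
# the command strings are placeholders for the real CAD invocations.
sub run_jobs {
    my @cmds = @_;
    for my $i (0 .. $#cmds) {
        system($cmds[$i]);
        if ($? != 0) {
            my $exit = $? >> 8;     # high byte holds the program's exit code
            die "Job ", $i + 1, " exited with status $exit\n";
        }
    }
    return scalar @cmds;            # number of jobs that completed
}
```

    Because system() does not return until the job finishes, there is no race against NFS at all: the next job starts only after the previous one has exited successfully.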


    -Waswas
Re: Running multiple jobs with dependencies
by Fletch (Bishop) on Apr 24, 2004 at 14:27 UTC

    Not that the system suggestion is a bad idea, but if you're going to call wait anyway, why not just have the child return a zero/non-zero exit status and look at $? rather than using the file?
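    For example (a sketch; the subroutine name is illustrative and the command is a stand-in for the CAD program):

```perl
# Fork a child that exec()s the job; the parent waits and reads $?,
# so no .done marker file is needed.  The command is a stand-in.
sub run_in_child {
    my ($cmd) = @_;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec($cmd);
        exit 127;                 # exec only returns if it failed
    }
    waitpid($pid, 0);
    return $? >> 8;               # the child's exit status
}
```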

Re: Running multiple jobs with dependencies
by Happy-the-monk (Canon) on Apr 24, 2004 at 13:51 UTC

    Perl has its own sleep function, see perldoc -f sleep.

Re: Running multiple jobs with dependencies
by dave_the_m (Monsignor) on Apr 25, 2004 at 16:26 UTC
    NFS should be self-consistent on a single client, i.e. all processes on the same client should have the same view of a file on an NFS server. Different clients may have different views, but that shouldn't be an issue here. The only complicating factor is that timestamps may be in the past/future, since the clock on the server may be out of sync with that of the client.

    Hence I suspect that your issue may have its cause elsewhere.

Re: Running multiple jobs with dependencies
by Ryszard (Priest) on Apr 25, 2004 at 13:48 UTC
    Is there anything preventing you from using the local FS? try /tmp or some such?

    OTOH, why not set up a controller? There are any number of ways it can be done: a script that calls everything in turn, or something that listens on a socket (and will receive an "Am I good to go?" / "I'm finished" message).

    There are other ways to preserve state: a database, a lock file (which is what you're using), signals, etc.
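    A local-filesystem marker written under an exclusive lock, for instance (a sketch; the directory, job name, and subroutine are all illustrative):

```perl
use Fcntl qw(:flock);

# Write a "done" marker on the local filesystem under an exclusive lock;
# a local path (e.g. under /tmp) sidesteps NFS visibility delays.
# The directory and job name passed in are illustrative.
sub mark_done {
    my ($dir, $job) = @_;
    my $path = "$dir/$job.done";
    open my $fh, '>', $path or die "open $path: $!";
    flock($fh, LOCK_EX)     or die "flock $path: $!";
    print {$fh} "done at ", scalar localtime, "\n";
    close $fh;                      # closing the handle releases the lock
    return $path;
}
```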

Re: Running multiple jobs with dependencies
by rupesh (Hermit) on Apr 26, 2004 at 12:02 UTC
    Use $|++; at the beginning of your code.

      Better yet, use $| = 1; and eschew obfuscation.


      If anyone needs me I'll be in the Angry Dome.

Node Type: perlquestion [id://347846]
Approved by Happy-the-monk
Front-paged by broquaint