Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

What are ways to do a cron job in NT? What I want to do is watch an FTP folder and, when a new file is inserted, run a Perl application on that file. Any help is appreciated. Thanks.

Replies are listed 'Best First'.
Re: Cron Job??
by Corion (Patriarch) on Apr 03, 2001 at 23:40 UTC

    Even though this is a bit off-topic here...

    There is the at.exe program for the at service. Just type at --help or at /? at the command prompt to get a description of how to run at.

    There is no such thing as crontab under NT; you'll have to log in as (or otherwise change to) the user you want your at job to run as.

    Update: arturo alerted me to the fact that the previous paragraph is quite unclear: it is possible to have jobs run as (almost) any user, but to set up that job you must log in as that user (or use the su.exe utility from the NT Resource Kit).

      Of course, you could write a service to do this rather than using Scheduler (AT). You can do this in Perl if you use Win32::Daemon. This also gets around the user permissions problem above. It's quite straightforward to do, and the module comes with a couple of template examples. Running as a service also has the advantage that it can be set to start automatically on bootup (and, in Win2K, restart if it fails).

      OTOH, you can script the AT command with Win32::AdminMisc. Use Win32::AdminMisc::ScheduleAdd(). The script that is run by AT can then use Win32::AdminMisc::LogonAsUser() to impersonate another user.

Re: Cron Job??
by Albannach (Monsignor) on Apr 04, 2001 at 00:04 UTC
    While at.exe is on your box already, you might also want to try Cygwin, for which there is a cron port.

    It also strikes me that unless your ftp directory sees very little use, a task that runs only periodically will either have to repeat often (say every minute, with the corresponding startup costs) or leave users with poor response time (assuming, of course, that the task produces output someone wants in a timely fashion, which may not be true). Perhaps you would be better off with a task that runs all the time (with the corresponding loss of resources, i.e. about 10MB RAM when I do this sort of thing on my NT box) and sleeps for 15 sec. or so between checks of the directory.
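    The always-running approach above can be sketched in a few lines of plain Perl: remember which names you've already seen, and hand anything new to a handler. The directory path and the processing command below are placeholders, not anything from the original post.

```perl
use strict;
use warnings;

# Track filenames we've already handed off, across scans.
my %seen;

# Return any regular files in $dir that we haven't seen before.
sub new_files {
    my ($dir) = @_;
    opendir my $dh, $dir or die "Can't open $dir: $!";
    my @new = grep { -f "$dir/$_" && !$seen{$_}++ } readdir $dh;
    closedir $dh;
    return @new;
}

# The main loop (commented out so the subroutine stands alone):
# while (1) {
#     for my $file (new_files("c:/ftp/incoming")) {
#         system("perl", "process.pl", "c:/ftp/incoming/$file");
#     }
#     sleep 15;
# }
```

    Each scan costs one readdir rather than a process startup, which is the trade Albannach describes against running at every minute.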

    I was also going to recommend Roth Consulting but I see $code or die already has, so I'll just second that!

    --
    I'd like to be able to assign to an luser

Re: Cron Job??
by kschwab (Vicar) on Apr 04, 2001 at 00:18 UTC
    Once you have figured out how to schedule the job, the next problem you'll encounter is how to avoid a race condition. Basically: how can the script tell that the file is "done" being ftp'ed in?

    The best solution is to have the ftp client rename the file when it's done uploading.
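    From the processing script's side, that rename convention amounts to ignoring any name that still carries the uploader's temporary suffix. A minimal sketch, assuming the client uploads to "name.part" and renames to "name" when finished (the ".part" suffix is an assumption; use whatever your FTP client supports):

```perl
use strict;
use warnings;

# Return files in $dir that are safe to process: anything still
# ending in the in-progress suffix ".part" is skipped (assumed
# convention; the uploader renames the file when it's done).
sub ready_files {
    my ($dir) = @_;
    opendir my $dh, $dir or die "Can't open $dir: $!";
    my @ready = grep { -f "$dir/$_" && $_ !~ /\.part\z/ } readdir $dh;
    closedir $dh;
    return @ready;
}
```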

    There's more info over on this node.

(Guildenstern) Re: Cron Job??
by Guildenstern (Deacon) on Apr 04, 2001 at 01:03 UTC
    If your "FTP folder" is actually on the NT box, Win32::ChangeNotify would be a good solution. You can basically set your script to wait until a change in the directory (i.e. adding files) happens, then run whatever code you need to. Less messy than playing with NT's dog-awful at command.

    Update: I knew I had some code that used it somewhere. Here's a simple example:
    use Win32::ChangeNotify;

    my $dir = "c:/some/dir/name";
    $notify = Win32::ChangeNotify->new($dir, 0, FILE_NAME);
    while (1) {
        $notify->wait or warn "Problem waiting: $!\n";
        # Will now wait to execute following code
        # until a file event happens in $dir
        # ...stuff ...
        $notify->reset;
    }
    $notify->close;

    I put the while (1) for brevity. In my actual code, I test to see if the script should be "finished" by checking for existence of a semi-lockfile and do while (!$done).

    Guildenstern
    Negaterd character class uber alles!
Re: Cron Job??
by diskcrash (Hermit) on Apr 04, 2001 at 00:58 UTC
    kschwab makes a very good point. How do you know when the transfer is done? I used the following to see if the file was modified at least ten seconds ago (Hopefully avoiding a race condition):

    $getinfile = $indir . $infile;
    ($device, $inode, $mode, $nlink, $uid, $gid, $rdev, $size,
     $atime, $mtime, $ctime, $blksize, $blocks) = stat($getinfile);
    my ($timenow) = time;
    $delta = $timenow - $mtime;
    if ($delta < 10) {    #***file too young, wait a few seconds
        next;             #***jump to the end of the loop
    }

    I haven't actually seen a race condition but, like kschwab, I worry about one. I would welcome advice on any more coherent means of assuring a file was closed.
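    The same mtime check wrapped up as a subroutine, so it can be reused per file in a scan loop. The ten-second threshold is the one from the snippet above; tune it for your link speed (a slow upload can pause longer than any fixed threshold, which is why this is a heuristic, not a guarantee):

```perl
use strict;
use warnings;

# True if $file exists and hasn't been modified for at least
# $min_age seconds (default 10). A heuristic "transfer settled"
# test, not a proof that the uploader has closed the file.
sub is_settled {
    my ($file, $min_age) = @_;
    $min_age = 10 unless defined $min_age;
    my $mtime = (stat $file)[9];
    return 0 unless defined $mtime;    # vanished or unreadable
    return (time - $mtime) >= $min_age;
}
```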

    -Diskcrash

Re: Cron Job??
by lhoward (Vicar) on Apr 04, 2001 at 01:08 UTC
    I don't know if this will work with your FTP server, but under WUFtpd on Unix I solved the problem by tailing the xfer log. The transfer message isn't written to the log until the transfer is complete, so you avoid any race conditions. You also save the overhead of polling directories when they aren't changing.
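    A rough sketch of what the log-tailing script would do with each line: split out the fields and act only on completed transfers. The field positions follow the wu-ftpd xferlog layout as I understand it (five date tokens, then transfer-time, remote-host, size, filename, type, action-flag, direction, and so on, ending in a completion status); verify against your own server's log before relying on them, and note that whitespace-split parsing assumes filenames without spaces.

```perl
use strict;
use warnings;

# Parse one wu-ftpd xferlog line into the fields we care about.
# Returns a hashref, or nothing if the line is too short to be a
# transfer record. Field indices are assumptions from the xferlog
# format, not something from the original post.
sub parse_xfer {
    my ($line) = @_;
    my @f = split ' ', $line;
    return unless @f >= 14;
    return {
        size      => $f[7],
        file      => $f[8],
        direction => $f[11],    # 'i' = incoming (an upload)
        status    => $f[-1],    # 'c' = complete, 'i' = incomplete
    };
}
```

    The tailer would then fire the processing script only when direction is 'i' and status is 'c'.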
Re: Cron Job??
by diskcrash (Hermit) on Apr 04, 2001 at 00:48 UTC

    I had the identical problem and solved it with a very low tech / low rent approach. My "ftp listener" script sleeps for a minute and then checks the directory for new files. It is installed using instsrv and srvany. It has the KISS mentality and works very reliably with very low overhead. Avoid chdir and use full path (dev and dir) references to all directories. It's running on NT 4.0 SP6a.
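    For reference, the instsrv/srvany route looks roughly like this. It is a configuration sketch, not something from the post: the Resource Kit and Perl paths, the service name, and the script name are all examples, and srvany reads what to launch from string values under the service's Parameters registry key.

```shell
rem Register a service named FtpListener that runs srvany.exe
rem (Resource Kit path is an example; adjust for your box)
instsrv FtpListener C:\reskit\srvany.exe

rem Then, in regedit, under
rem   HKLM\SYSTEM\CurrentControlSet\Services\FtpListener\Parameters
rem create string values telling srvany what to run:
rem   Application   = C:\perl\bin\perl.exe
rem   AppParameters = C:\scripts\ftp_listener.pl

net start FtpListener
```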

    -Diskcrash (prince of crummy, low style code..)

      Could you post your "ftp listener" script? Thanks.