I need to send files to a process whenever new files become available in a set of directories. New files can arrive anywhere from once a second to once a minute. Each file must be passed to the process (pqinsert) exactly once, and no files may be missed (both conditions must be satisfied). I wrote a Perl script that continuously polls the directories at a given interval, passes the file names to the external process, then moves the processed files to an archive sub-directory.
Can you tell me if the code below is the most efficient way to solve this problem, and if it isn't, give suggestions for improvement? Thank you so much!
```perl
#!/usr/bin/perl
use strict;
use warnings;
use diagnostics;
use File::Copy;

my $path       = "/mnt/ldmdata/";
my @site_array = ("karx", "kdlh", "kfsd", "kmpx", "kmvx", "kwbc");
my $poll_time  = 20;    # seconds between polls of all specified directories

for (;;) {
    foreach my $site (@site_array) {
        my $file_dir    = $path . $site;
        my $archive_dir = "$file_dir/archive";
        mkdir $archive_dir, 0755 unless -d $archive_dir;

        opendir( my $dh, $file_dir ) or die "Cannot open $file_dir: $!";
        # Keep plain files only; this skips ".", "..", and the archive sub-dir.
        my @files = grep { -f "$file_dir/$_" } readdir($dh);
        closedir($dh);

        foreach my $file (@files) {
            # pqinsert is an external program, so invoke it via system().
            system( "pqinsert", "$file_dir/$file" ) == 0
                or warn "pqinsert failed for $file_dir/$file: $!";
            move( "$file_dir/$file", "$archive_dir/$file" )
                or warn "move failed for $file: $!";
        }
    }
    sleep $poll_time;
}
```
UPDATE: I wanted to share a package I found which is built on "inotify" to perform monitoring/action tasks like this. It looks pretty robust:
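For anyone who wants to see the event-driven approach in Perl directly, here is a minimal sketch using the `Linux::Inotify2` CPAN module (which may or may not be the package referred to above, and must be installed separately; the `pqinsert` invocation and error handling are my own assumptions, not part of the original script):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
use Linux::Inotify2;    # event-driven alternative to polling (Linux only)

my $path       = "/mnt/ldmdata/";
my @site_array = ("karx", "kdlh", "kfsd", "kmpx", "kmvx", "kwbc");

my $inotify = Linux::Inotify2->new
    or die "Cannot create inotify object: $!";

foreach my $site (@site_array) {
    my $file_dir    = $path . $site;
    my $archive_dir = "$file_dir/archive";
    mkdir $archive_dir, 0755 unless -d $archive_dir;

    # IN_CLOSE_WRITE fires only after a writer closes the file, so
    # partially written files are not picked up; IN_MOVED_TO catches
    # files renamed into the directory.
    $inotify->watch(
        $file_dir,
        IN_CLOSE_WRITE | IN_MOVED_TO,
        sub {
            my $event = shift;
            my $file  = $event->fullname;
            return unless -f $file;    # ignore directory events
            system( "pqinsert", $file ) == 0
                or warn "pqinsert failed for $file: $!";
            move( $file, "$archive_dir/" . $event->name )
                or warn "move failed for $file: $!";
        }
    );
}

# Block on the kernel for new events instead of sleeping and re-scanning.
1 while $inotify->poll;
```

This removes the polling latency and the repeated directory scans: the kernel wakes the script only when a file actually appears, which satisfies both the "exactly once" and "no files missed" requirements as long as the watches stay active (a startup scan of any files that arrived while the script was down would still be prudent).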