Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Thank you all for the past responses, but all the solutions have failed, partly because I did not explain clearly enough. This is a tough one. The problem:

A file shows up in /main and needs to move to /dir1. Another file shows up in /main and moves to /dir2, and likewise for /dir3 through /dir6; then we need to start again at /dir1. A round-robin approach to file distribution.

Now here are the seemingly impossible parts: say there are two files in /main, and the last time the program ran it had moved a file to /dir2; it now needs to put one file in /dir3 and the other in /dir4. (The date/time stamp of the files is no issue, as this will be a cron job, or it will sleep and run every 5 minutes.) Say this program runs again and there is a file in /main: it now needs to start at /dir5. And as if this isn't bad enough, if there is no file there, it needs to remember the last /dir# it was at and pick up from there! Aaargh!

I'm thinking the program can read a file with all the /dir#'s into an array, shift them somehow, and write the list back out to the file; then do a substr to take out the valid /dir# and format an executable sh file with a 'mv /main @<<<<<<' type line. Assuming this is not impossible, thanks for any code help I can get!
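
A rough sketch of what I have in mind, in case it helps (the state-file path /var/tmp/nextdirs and the directory names are only placeholders, and it moves the files directly with File::Copy rather than writing out an sh file):

    use strict;
    use warnings;
    use File::Copy qw(move);

    my $state = '/var/tmp/nextdirs';    # placeholder location for the saved rotation

    # Read the directory list back in; seed it on the very first run.
    my @dirs;
    if (open my $in, '<', $state) {
        chomp(@dirs = <$in>);
        close $in;
    }
    @dirs = map { "/dir$_" } 1 .. 6 unless @dirs;

    # Move each file in /main to the directory at the front of the
    # rotation, then send that directory to the back of the queue.
    opendir my $dh, '/main' or die "opendir /main: $!";
    for my $file (grep { -f "/main/$_" } readdir $dh) {
        my $target = shift @dirs;
        move("/main/$file", "$target/$file") or die "move $file: $!";
        push @dirs, $target;
    }
    closedir $dh;

    # Write the rotation back out; with no new files the list is untouched,
    # so the next run remembers where this one left off.
    open my $out, '>', $state or die "write $state: $!";
    print $out "$_\n" for @dirs;
    close $out;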

Replies are listed 'Best First'.
Re: sequential file handling (yet again)
by dws (Chancellor) on Jun 03, 2002 at 19:26 UTC
    Thank you all for the past responses, but all solutions have failed. Partly because I did not explain clearly enough.

    Or perhaps there's a different reason.

    Before we launch into round n+1, please tell us why the various solutions that have been proposed do not work for you. It would seem straightforward for a novice Perl programmer to extend them to deal with the requirements you've laid out. Which have you tried? How did they fail?

    That said, I'm still concerned that you may be taking a bad approach to this problem, for reasons I laid out in a previous response. I've dealt with systems that assigned support "tickets" to agents, and strict round-robin doesn't work well at all.

Re: sequential file handling (yet again)
by particle (Vicar) on Jun 03, 2002 at 19:35 UTC
    why did my response at Re: sequential file handling (again) fail? why did the other responses on your many threads fail?

    you've been going in circles here asking the same question (albeit slightly refined) over and over, airblaine. this is tiresome. please post some code, and stop asking us to invent yet another solution for you.

    that said, i'll give you some advice: first break your problem down into smaller problems, and solve each separately. then put them together to form the whole. suggestions for smaller problems include (a sketch follows the list):

    move a file
    select a directory
    store state information
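
    a minimal sketch of that decomposition, one sub per subproblem (the sub names and the state-file layout are assumptions, not a finished solution):

    use strict;
    use warnings;
    use File::Copy qw(move);

    # store state information: read the saved index, defaulting to 0
    sub load_state {
        my ($file) = @_;
        open my $fh, '<', $file or return 0;
        my $i = <$fh>;
        defined $i or $i = 0;
        chomp $i;
        return $i;
    }

    # store state information: write the index back out
    sub save_state {
        my ($file, $i) = @_;
        open my $fh, '>', $file or die "can't write $file: $!";
        print $fh "$i\n";
        close $fh;
    }

    # select a directory: cycle through the list by index
    sub next_dir {
        my ($i, @dirs) = @_;
        return $dirs[$i % @dirs];
    }

    # move a file: a thin wrapper that dies loudly on failure
    sub move_file {
        my ($from, $to) = @_;
        move($from, $to) or die "move $from -> $to failed: $!";
    }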

    ~Particle *accelerates*

Re: sequential file handling (yet again)
by Abigail-II (Bishop) on Jun 04, 2002 at 09:47 UTC
    #!/usr/bin/perl

    use strict;
    use warnings qw /all/;

    # Program picks up files in one directory, and moves them round robin
    # to a set of other directories.
    # It's required to keep track of the directory a file was moved to last.

    use Fcntl qw /:DEFAULT :flock :seek/;

    my $pickup  = "/tmp/main";
    my @targets = map {"/tmp/target/$_"} qw /dir1 dir2 dir3 dir4 dir5 dir6/;
    my $state   = "/tmp/state";

    # Open the state file for read/write, creating it as necessary.
    sysopen my $state_h => $state, O_RDWR | O_CREAT, 0666
        or die "Failed to sysopen $state: $!\n";

    # Lock the file exclusively. This has the additional effect that no
    # other invocation will try to move the same files we do.
    flock $state_h => LOCK_EX or die "Failed to flock $state: $!\n";

    # Read the state. If the file didn't exist, this will return undef.
    my $last_dir = <$state_h>;

    # If defined, chomp, and rotate the target directories until we find
    # the directory the last file was written to. If not defined, roll one
    # directory in the opposite direction.
    if (defined $last_dir) {
        chomp $last_dir;
        push @targets => shift @targets while $last_dir ne $targets[0];
    }
    else {
        unshift @targets => pop @targets;
    }

    # Open the pickup directory and read its content.
    opendir my $pickup_h => $pickup or die "Failed to opendir $pickup: $!\n";
    my @files = grep {$_ ne "." && $_ ne ".."} readdir $pickup_h;
    closedir $pickup_h;

    # For all the files to be moved, roll the target directories so we get
    # the next one, and then move the file to the destination. We're assuming
    # we work on the same file system and we can use 'rename'.
    foreach my $file (@files) {
        push @targets => shift @targets;
        rename "$pickup/$file" => "$targets[0]/$file"
            or die "Failed to rename $pickup/$file\n";
    }

    # Record the last directory we wrote into, and close the file;
    # closing the file will release the lock.
    seek $state_h => SEEK_SET, 0 or die "Failed to seek $state: $!\n";
    print $state_h $targets[0], "\n";
    close $state_h or die "Failed to close $state: $!\n";

    exit;
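
    Since the state file is flocked exclusively, overlapping invocations won't move the same files, so running this straight from cron is safe. A possible crontab entry for "every 5 minutes" using Vixie cron's step syntax, assuming the script is saved as /usr/local/bin/roundrobin (the path is an assumption):

    */5 * * * * /usr/bin/perl /usr/local/bin/roundrobin
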
Re: sequential file handling (yet again)
by Zaxo (Archbishop) on Jun 03, 2002 at 19:50 UTC

    If your program is the only one writing to these dirs, you can sort them by mtime, as you can for files that appear in /main:

    my @dirs = ('/dir1/', '/dir2/', '/dir3/', '/dir4/', '/dir5/', '/dir6/');

    # Oldest files first, so the round-robin order matches arrival order.
    my @newfiles = sort { -M $b <=> -M $a } grep { -f } glob("/main/*");

    # Repeat the directory cycle enough times to cover every new file.
    my @seq = (sort { -M $b <=> -M $a } @dirs) x int(1 + @newfiles / @dirs);

    for (@newfiles) {
        my $status = system '/bin/cp', '-f', $_, shift @seq;
        die 'Copy failed for ', $_, ': Child Error ', $?, $! if $status;
        unlink $_ unless $status;
    }
    If mtime of the inodes is not helpful, you'll need to keep some sort of persistent data file. I also would like to know why each of the other solutions failed.
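
    The persistent data could be as small as a single index on disk. A minimal sketch for picking one file's target, assuming the state lives in /var/tmp/rrindex (the path and the directory list are placeholders):

    my $statefile = '/var/tmp/rrindex';    # assumed location
    my @dirs = map { "/dir$_/" } 1 .. 6;

    # Read the saved index, defaulting to 0 on the first run.
    my $i = 0;
    if (open my $fh, '<', $statefile) {
        my $line = <$fh>;
        $i = $line if defined $line;
        chomp $i;
        close $fh;
    }

    my $target = $dirs[$i % @dirs];        # next directory in the cycle

    # Advance and persist the index for the next invocation.
    my $next = ($i + 1) % @dirs;
    open my $out, '>', $statefile or die "Can't write $statefile: $!";
    print $out "$next\n";
    close $out;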

    After Compline,
    Zaxo

Re: sequential file handling (yet again)
by insensate (Hermit) on Jun 03, 2002 at 19:45 UTC
    There is no reason this shouldn't work...please /msg me if you are having problems...
    Remember that win32 paths need to contain \\ instead of \... so c:\main would be c:\\main embedded in your script.
    This script doesn't sleep...I wrote it to test your condition quickly.
    -Jason
    use strict;
    use warnings;
    use File::Copy;

    my @dirs = qw(FullPathToDir1 FullPathToDir2);   # ...and so on, with trailing separators
    my $i = 0;

    while (1) {
        my $dir = "FullPathToMain";                 # include a trailing separator here too
        opendir(MAIN, $dir) or die "Can't open $dir: $!";
        while (defined(my $file = readdir MAIN)) {
            next if $file =~ /^\.\.?$/;             # skip . and ..
            my $path = $dir . $file;
            my $newdir = $dirs[$i];                 # note: $dirs[$i], not @dirs[$i]
            move($path, "$newdir$file") or die "move failed: $!";
            $i++;
            $i = 0 if $i > $#dirs;                  # wrap back to the first directory
        }
        closedir MAIN;
    }