Short answer: I'd use your first suggestion. You might also want to look up Net::FTP::AutoReconnect. (I found it while looking up Net::FTP for my answer below.)
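For completeness, Net::FTP::AutoReconnect presents the same interface as Net::FTP but transparently reconnects and replays the session state when a command fails mid-transfer. Here's an untested sketch of how it would be used; the host, credentials, directory, and file name are all placeholders, and it assumes the module is installed from CPAN:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP::AutoReconnect;

    # Placeholder connection details -- substitute your own.
    my $ftp = Net::FTP::AutoReconnect->new('ftp.example.com')
        or die "Can't connect: $@";

    $ftp->login('user', 'password') or die "Can't login: ",  $ftp->message;
    $ftp->cwd('/some/dir')          or die "Can't cwd: ",    $ftp->message;
    $ftp->get('somefile')           or die "Can't get: ",    $ftp->message;
    $ftp->quit;

If that does what its docs promise, you'd get the reconnect-and-retry behavior below without writing the loop yourself.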
Details: I had a similar problem a couple of years ago. I was using FTP to get files from our mainframe, which would time out the connection if it detected two minutes of idle time. Unfortunately, the tape robot sometimes took more than two minutes to get the file.
So I created a subroutine that would accept a list of files to get. If anything failed, I'd detect it, close the connection and open a new one. I'm going from memory (and the Net::FTP page), but it went something like:
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;

    # Connection details and the FTP handle are globals (yes, I know...)
    my $host;
    my $uid;
    my $pwd;
    my $ftph;

    # File to fetch => the mainframe directory it lives in.
    # An entry is undef'd once its file has been retrieved.
    my %FList = (
        'BCLR7650(-1)' => 'BCL.SDBA.TRANSFER',
        'BCLR7651(-1)' => 'BCL.SDBA.TRANSFER',
        'BCXD5001(-1)' => 'BCL.SDBA.OUTGOING',
    );

    GetFiles();

    sub Connect {
        $ftph = Net::FTP->new($host) or die "Can't connect!";
        $ftph->login($uid, $pwd)     or die "Can't login!";
    }

    sub GetFiles {
        my $cntFails = 1;
        while ($cntFails > 0) {
            $cntFails = 0;
            Connect();
            for my $FName (keys %FList) {
                # Skip anything we've already fetched
                next if !defined $FList{$FName};
                print "Getting $FName from $FList{$FName}\n";
                if ($ftph->cwd($FList{$FName})) {
                    if ($ftph->get($FName)) {
                        $FList{$FName} = undef;   # got it; don't retry
                    }
                    else {
                        ++$cntFails;
                    }
                }
                else {
                    ++$cntFails;
                }
            }
            $ftph->quit;
        }
    }

Yes, it used crappy global variables and such. (And the directory hierarchy on the mainframe *is* goofy, since it uses "." as the directory separator. Yechh!) But at least it got the job done. (Unless, of course, you spelled a directory or file name wrong: it was single-minded in its determination, and would try all night long to get the file.)
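The retry logic above doesn't depend on FTP at all, so the pattern is easy to lift out: keep a worklist, re-run every failed item on a fresh pass, and stop once a pass finishes clean. Here's a self-contained sketch where flaky_fetch is a hypothetical stand-in for the cwd/get pair that fails on the first pass (simulating the slow tape robot):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Worklist: a true value means "still needs fetching".
    my %pending = (a => 1, b => 1, c => 1);
    my $attempt = 0;

    # Stand-in for cwd/get: fails every item on the first pass,
    # succeeds on later passes.
    sub flaky_fetch {
        my ($name) = @_;
        return $attempt > 1;
    }

    my $fails = 1;
    while ($fails > 0) {
        $fails = 0;
        $attempt++;                       # a fresh "connection" per pass
        for my $name (sort keys %pending) {
            next unless $pending{$name};  # already fetched; skip
            if (flaky_fetch($name)) {
                $pending{$name} = 0;      # done; never retried
            }
            else {
                ++$fails;                 # retry on the next pass
            }
        }
    }
    print "done after $attempt passes\n";

Every item fails on pass one and succeeds on pass two, so this prints "done after 2 passes". Swapping flaky_fetch for real cwd/get calls (and reopening the connection at the top of the while loop) gives you the subroutine above.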
--roboticus
In reply to Re: Large FTP task
by roboticus
in thread Large FTP task
by Cody Pendant