in reply to Large FTP task

Cody Pendant:

Short answer: I'd use your first suggestion. You might also want to look up Net::FTP::AutoReconnect. (I found it when looking up Net::FTP for my answer below.)
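For what it's worth, here's a minimal sketch of how I'd expect it to be used, assuming it really is a drop-in replacement that takes the same arguments as Net::FTP->new (the host, credentials and file name below are just placeholders):

use strict;
use warnings;
use Net::FTP::AutoReconnect;

# Placeholders -- substitute your own host, credentials and file name.
my $ftp = Net::FTP::AutoReconnect->new('ftp.example.com', Timeout => 1800)
    or die "Can't connect";
$ftp->login('userid', 'password') or die "Can't login!";
$ftp->get('SOME.FILE') or warn "get failed: ", $ftp->message;
$ftp->quit;

The idea is that the wrapper is supposed to re-open the connection and log back in for you when the server drops it, so you don't have to write the retry loop yourself.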

Details: I had a similar problem a couple of years ago. I was using FTP to get files from our mainframe, and it would time out your connection if it detected two minutes of idle time. Unfortunately, sometimes the tape robot would take more than two minutes to get the file.

So I created a subroutine that would accept a list of files to get. If anything failed, I'd detect it, close the connection and open a new one. I'm going from memory (and the Net::FTP page), but it went something like:

#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

my ($host, $uid, $pwd);    # fill in the host name and login credentials
my $ftph;                  # shared Net::FTP handle

# Files to fetch, mapped to the mainframe "directory" holding each one.
# An entry is set to undef once its file has been retrieved.
my %FList = (
    'BCLR7650(-1)' => 'BCL.SDBA.TRANSFER',
    'BCLR7651(-1)' => 'BCL.SDBA.TRANSFER',
    'BCXD5001(-1)' => 'BCL.SDBA.OUTGOING',
);

GetFiles(%FList);    # note: the sub actually works on the global %FList

sub Connect {
    # (Re)open the connection and log in, reusing the shared handle.
    $ftph = Net::FTP->new($host) or die "Can't connect";
    $ftph->login($uid, $pwd)     or die "Can't login!";
}

sub GetFiles {
    my $cntFails = 1;
    while ($cntFails > 0) {
        $cntFails = 0;
        Connect();
        for my $FName (keys %FList) {
            next if !defined $FList{$FName};    # already fetched
            print "Getting $FName from $FList{$FName}\n";
            if ($ftph->cwd($FList{$FName})) {
                if ($ftph->get($FName)) {
                    $FList{$FName} = undef;     # mark as done
                }
                else {
                    ++$cntFails;
                }
            }
            else {
                ++$cntFails;
            }
        }
        $ftph->quit;
    }
}
It went something like that. Yes, it used crappy global variables and such. (And the directory hierarchy on the mainframe *is* goofy, as it uses "." as the directory separator. Yechh!) But at least it got the job done (unless, of course, you spelled a directory or file name wrong -- it was single-minded in its determination, and would keep trying all night long to get the file).

--roboticus

Replies are listed 'Best First'.
Re^2: Large FTP task
by Gauri (Initiate) on Jun 29, 2006 at 17:00 UTC
    Hi Roboticus, I have a similar problem. I'm trying to FTP GET certain files from mainframes to zLinux. I was unable to retrieve files on tape, so I tried using the following command: FTP->new($MVSADDR,Timeout=>1800,Debug=>1); It seems to work. However, if I try to FTP GET files larger than 100 MB -- located on either disk or tape -- a zero-byte file is downloaded to Linux! I tried the above solution of using a while loop, but I still get a zero-byte file and a message "250 Transfer completed successfully." I think that because of this 250 message, the code exits the while loop. How do I get around this problem? Can anyone help? Thanks, Regards, Gauri
      Gauri:

      I'm afraid I won't be much help there ... it just works on my system. I've retrieved files as large as 3GB without any problems other than finding the space to put the darned thing!

      But here are a couple of things I'd look at:

    • If you can read smaller files but not the 100MB ones, perhaps you should alter the timeout to see if that affects it. (There's a short sketch after this list.)
    • Have you tried turning on debug mode in Net::FTP to see if it has any clues for you?
    • A bit tedious--but you might consider monitoring the traffic with Ethereal.
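      To make the first two suggestions concrete, here's a minimal sketch (the host, credentials, file name, and timeout value are placeholders -- adjust them for your site):

      use strict;
      use warnings;
      use Net::FTP;

      # Placeholders -- substitute your own host, credentials and file name.
      my $ftp = Net::FTP->new('your.mvs.host',
                              Timeout => 3600,   # try a longer timeout for the big files
                              Debug   => 1)      # print the FTP conversation to STDERR
          or die "Can't connect";
      $ftp->login('userid', 'password') or die "Can't login!";
      $ftp->get('BIG.FILE') or warn "get failed: ", $ftp->message;
      $ftp->quit;

      The Debug output should show you exactly what the server says right before you end up with the zero-byte file, which is usually enough to point at the culprit.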

      --roboticus