vortmax has asked for the wisdom of the Perl Monks concerning the following question:

I have several CMTSes (Cable Modem Termination Systems) on remote sites that I back up via FTP on a daily basis. My script (below) pulls each server's IP from a database, checks whether the server is alive via SNMP, starts the FTP session, pulls the files, parses one of them into a database, then tars them in the appropriate folder. Sometimes when I run this script, it doesn't download the complete file before moving on, but the FTP process doesn't return any sort of error. When I notice this has happened, I run a second script that uses the same subs but only pulls in one site's worth of files... and it doesn't seem to have this issue. Any thoughts on how to prevent this, or at least catch it so I can attempt the download again? The subroutine in question is below. I omitted the rest of the script because of length but can post it too if needed.
    sub getFiles {
        ### ftp constants ###
        my $ip    = $_[0];
        my $dir   = '/ata00';
        my $user  = 'xxxxx';
        my $pass  = 'xxxxx';
        my $dbg   = 0;
        my $error = 0;
        if ($debug) {
            print "\tConnecting to $ip\n";
            $dbg = 1;
        }
        if (CMTScheck($ip)) { return 1; }  # if CMTS cannot be contacted via SNMP, give up
        my $ftp = Net::FTP->new($ip, Debug => $dbg) or $error = 1;  # initialize connection
        if ($error) {                      # if error, quit
            print "\tServer is not responding\n";
            return 1;
        }
        ##### Login #####
        $ftp->login($user, $pass) or $error = 1;
        if ($error) {
            print "Username and Password not accepted";
            $ftp->quit;
            return 1;
        }
        $ftp->binary;                      # set binary mode
        $ftp->cwd($dir);                   # change to /ata00 dir
        #### Retrieve files ####
        $ftp->get('dhcpd.con') or $error = 1;
        if ($error) {
            print "Could not retrieve dhcpd.con...... quitting";
            $ftp->quit;
            return 1;
        }
        $ftp->get('smsact.db') or $error = 1;
        if ($error) {
            print "Could not retrieve smsact.db...... quitting";
            $ftp->quit;
            return 1;
        }
        $ftp->quit;
        return 0;
    }

Replies are listed 'Best First'.
Re: Net::FTP incomplete downloads
by pc88mxer (Vicar) on Apr 23, 2008 at 19:01 UTC
    You could ask the remote server for the size of the file (using the size() method) before (and possibly after) you download it. If the sizes don't match, just redo the transfer.

    Btw, are your files actively in use when you are downloading them? That might explain why occasionally you get an incomplete download.
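      The size-check-and-retry idea above could be sketched roughly as follows. This is only a sketch, not tested against a CMTS: it assumes the server supports the SIZE command (which, as noted further down the thread, yours may not), and that `$ftp` is an already-connected, logged-in Net::FTP object in binary mode. The sub and file names are made up for illustration.

      ```perl
      use strict;
      use warnings;
      use Net::FTP;

      # Local byte count of a downloaded file (-s is Perl's file-size test).
      sub local_size { my ($path) = @_; return (-s $path) || 0; }

      # Hypothetical wrapper: retry a download until the local size
      # matches the server-reported size, up to $tries attempts.
      sub get_verified {
          my ($ftp, $file, $tries) = @_;
          $tries ||= 3;
          for my $attempt (1 .. $tries) {
              my $expected = $ftp->size($file);   # undef if SIZE unsupported
              $ftp->get($file) or next;           # transfer failed outright, retry
              return 1 unless defined $expected;  # can't verify; accept as-is
              return 1 if local_size($file) == $expected;
              warn "attempt $attempt: got ", local_size($file),
                   " of $expected bytes, retrying\n";
          }
          return 0;                               # all attempts came up short
      }
      ```

      You would then replace each bare `$ftp->get('dhcpd.con')` in getFiles with something like `get_verified($ftp, 'dhcpd.con') or do { ... }`.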

      I didn't even think about the size() method. I was thinking it would be nice to have an MD5 checksum, but that might do the trick. I'll have to code it up and test it.
      There is a possibility that the files are being accessed while I'm downloading, but it would be read only. Would another process attempting to read the file abort the ftp download?

      edit:
      I tried invoking the size method in the script, and was met with this error:
      Net::FTP=GLOB(0x8743d84)>>> HELP SIZE
      so I logged in to the server manually and tried the SIZE command, and apparently the server doesn't support it. It's a VxWorks (5.2.4) based system, if that helps.
        Perhaps you can retrieve a directory listing and get the file size from that. I'd try the "dir" method.
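          Something along these lines might work for pulling a size out of the listing. This assumes the server returns a Unix-style `ls -l` format; field positions vary by server, and a VxWorks FTP daemon may well format its listings differently, so check real output from your CMTS and adjust the pattern. The sub name is made up.

          ```perl
          use strict;
          use warnings;

          # Hypothetical helper: find $name in the lines returned by
          # $ftp->dir and extract the size field, assuming a layout like:
          #   -rw-r--r-- 1 owner group 12345 Apr 23 19:01 dhcpd.con
          sub size_from_listing {
              my ($name, @listing) = @_;
              for my $line (@listing) {
                  # size, month, day, time-or-year, then the exact filename
                  if ($line =~ /\s(\d+)\s+\w+\s+\d+\s+[\d:]+\s+\Q$name\E\s*$/) {
                      return $1;
                  }
              }
              return undef;   # not found, or unrecognized listing format
          }
          ```

          Usage would be something like `my $size = size_from_listing('dhcpd.con', $ftp->dir);` and then comparing `$size` against `-s 'dhcpd.con'` after the get.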