fredex42 has asked for the wisdom of the Perl Monks concerning the following question:

Hi all, I have a question relating to Net::FTP. I'm working on a system that will compress and upload very large video files over potentially unreliable connections.

The first part uses FFmpeg to compress the video into something manageable and the second part uses Net::FTP to transfer the file. It needs to run unattended for long periods, and hence to recover gracefully from any errors.
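(For context, the compression step is just a system() call along these lines; the codec settings here are illustrative rather than the real ones, and $source_path/$compressed_path are stand-ins:)

# Illustrative only: the actual encoder settings differ.
system('ffmpeg', '-i', $source_path,
       '-c:v', 'libx264', '-crf', '23',
       '-c:a', 'aac',
       $compressed_path) == 0
    or die "ffmpeg failed: $?";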

I use this loop to upload:
do {
    $ftp->restart($transferred);
    ++$attempts;
    if ($attempts > 2) {
        mesg('*', "Having trouble writing to server, on attempt $attempts...", $evqueue);
    }
    $written = $dataconn->write($buffer, $to_transfer, $config->{'timeout'});
    if ($attempts > $soft_retries) {
        mesg('*', "Connection appears to be dead. Attempting to re-start...", $evqueue);
        $ftp = undef;
        $ftp = ftp_transfer::connect($config, $evqueue) while (not defined $ftp);
        $ftp->binary;
        $ftp->restart($transferred);
    }
} while ($written < $to_transfer);
$transferred += $written;
having first obtained $dataconn from $ftp->stor().

This works for a normal upload, but when I tested it by dropping the wireless connection while the upload was in progress it would not resume; it simply said "thread failed to start, timeout at ftp_transfer.pm line 35".

Do I need to somehow subclass dataconn in order to be able to pick up after a timeout?
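For reference, this is the shape of recovery path I've been sketching (untested; ftp_transfer::connect is my own wrapper):

# Untested sketch of the recovery path I have in mind: after a reconnect,
# abandon the stale $dataconn and open a fresh one via stor().
sub reopen_upload {
    my ($config, $evqueue, $remote, $fh, $transferred) = @_;

    my $ftp;
    until (defined $ftp) {
        $ftp = ftp_transfer::connect($config, $evqueue);  # my own wrapper
    }
    $ftp->binary;
    $ftp->restart($transferred);        # set the REST marker for the next transfer
    seek($fh, $transferred, 0);         # realign the local file handle too

    my $dataconn = $ftp->stor($remote)  # new data connection picks up the marker
        or die 'stor failed: ' . $ftp->message;
    return ($ftp, $dataconn);
}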
Your help is much appreciated...

Replies are listed 'Best First'.
Re: Auto-resuming with Net::FTP
by afoken (Chancellor) on Aug 16, 2010 at 11:34 UTC
    [...] upload very large [...] files over potentially unreliable connections

    Consider using rsync instead of FTP. Wrapped in an ssh connection, this also avoids sending a password in plain text over the wire. Plus, you can use a public-key pair to entirely get rid of sending passwords.
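    If the server side can be made to run rsync, a minimal retry loop might look like this (paths, key and retry limits are placeholders; check the flags against your rsync version):

    #!/usr/bin/perl
    # Hedged sketch: keep restarting rsync-over-ssh until it succeeds.
    # --partial keeps a half-sent file on the server so the next run resumes it.
    use strict;
    use warnings;

    my $src  = '/data/encoded/video.mp4';              # placeholder paths
    my $dest = 'upload@media-server:/incoming/';

    my $tries = 0;
    while (1) {
        my $rc = system('rsync', '--partial', '--timeout=60',
                        '-e', 'ssh -i /home/upload/.ssh/id_rsa',
                        $src, $dest);
        last if $rc == 0;                              # rsync exits 0 on success
        die "giving up after $tries attempts\n" if ++$tries >= 100;
        sleep 30;                                      # back off before retrying
    }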

    Alexander

    --
    Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
      Thanks a lot for that. It's a very good point, though I may not be able to do it in practice: the system is for a large company where it is quite difficult to persuade people to change anything on servers or firewalls. I'll give it a go, but I think I may be stuck with FTP for the time being.
Re: Auto-resuming with Net::FTP
by zentara (Cardinal) on Aug 16, 2010 at 15:56 UTC
    [...] transfer the file. It needs to run unattended for long periods, and hence to recover gracefully from any errors.

    Use the C utility wput (a sibling of wget); it has good tenacity at resuming big transfers.
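    A rough, untested sketch of driving it from Perl (host, credentials and paths are made up; check wput's docs for its exact resume behavior and flags):

    #!/usr/bin/perl
    # Untested: let wput handle the resume logic, just restart it if it dies.
    use strict;
    use warnings;

    my $local = '/data/encoded/video.mp4';
    my $url   = 'ftp://user:secret@ftp.example.com/incoming/video.mp4';

    for my $attempt (1 .. 50) {
        last if system('wput', $local, $url) == 0;   # 0 means wput succeeded
        warn "wput failed (attempt $attempt), retrying...\n";
        sleep 60;
    }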


    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku
Re: Auto-resuming with Net::FTP
by Khen1950fx (Canon) on Aug 16, 2010 at 19:23 UTC
    Here's a working example using Net::FTP::AutoReconnect by sgifford. Substitute your info for what's given here.
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw[ time ];
    use Net::FTP::AutoReconnect;

    use constant HOST => 'ftp.perl.org';
    use constant DIR  => '/pub/CPAN/ports/win32/Standard/x86';
    use constant FILE => 'perl-5.6.0.tar.gz';

    our $SIZE ||= 4096;

    my $start = time;
    my $ftp   = Net::FTP::AutoReconnect->new(
        HOST,
        Debug      => 1,
        Passive    => 1,
        Bytes_read => $SIZE,
        BlockSize  => $SIZE,
    ) or die "Couldn't connect: $@\n";
    $ftp->login('anonymous');
    $ftp->cwd(DIR);
    $ftp->binary;
    $ftp->get(FILE);
    my $size = $ftp->size(FILE);
    $ftp->quit;
    printf "\nGot %d bytes at %.3f/second\n",
        $size, $size / ( time() - $start );
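    Since you're uploading rather than downloading, the same module should handle the other direction too; an untested variation (host, credentials and paths are placeholders):

    my $ftp = Net::FTP::AutoReconnect->new('ftp.example.com', Passive => 1)
        or die "Couldn't connect: $@\n";
    $ftp->login('user', 'secret') or die 'Login failed: ', $ftp->message;
    $ftp->binary;
    $ftp->put('/data/encoded/video.mp4', 'video.mp4')
        or die 'Upload failed: ', $ftp->message;
    $ftp->quit;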