anantc88 has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I am trying to download certain biological sequences via FTP links from an FTP site. I am working on Ubuntu 10.04. Can anyone suggest a way to do this?

Replies are listed 'Best First'.
Re: Downloading files from FTP links
by BrowserUk (Patriarch) on Sep 05, 2010 at 17:33 UTC

    If these are large files, I'd recommend wget because of its continue capability. It's damned annoying to download 2.9GB of a 3GB file and have to start again because of a network/wifi glitch.
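    For example, wget's `-c` flag resumes a partially-downloaded file rather than restarting it. A minimal sketch (the FTP URL and output directory below are placeholders, not from the original question):

    ```shell
    # Placeholder values -- substitute the real FTP link and a local directory.
    URL="ftp://ftp.example.org/pub/sequences/genome.fa.gz"
    OUT=/tmp/downloads

    # -c : continue a partially-downloaded file instead of starting over
    # -P : directory to save the downloaded file into
    CMD="wget -c -P $OUT $URL"
    echo "$CMD"

    # Run it for real with:
    # $CMD
    ```

    Re-running the same command after a dropped connection picks up where the transfer left off, which is the point of BrowserUk's suggestion.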

Re: Downloading files from FTP links
by Corion (Patriarch) on Sep 05, 2010 at 17:29 UTC
Re: Downloading files from FTP links
by Khen1950fx (Canon) on Sep 05, 2010 at 23:26 UTC
    You might want to try Net::FTP::Robust. It will download complete directories or single files, and if a download isn't completed, it will resume where it left off on the next try. For example, try this:
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP::Robust;

    my $dir   = '/pub/CPAN/';
    my $local = '/path/to/local_dir';

    my $ftp = Net::FTP::Robust->new(
        Host => 'ftp.cpan.org',
    );
    $ftp->get($dir, $local);
    $ftp->quit;