Jim Wang has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I am pretty new to Perl. I need to use FTP to get files from a remote site; the files on the FTP site keep being updated/added at irregular intervals. I want to use a Unix cron job that runs every 5 minutes to fetch data from the remote site. Is there a way in Perl to check for and build a list of all the files that were updated or newly added during the past 5 minutes? Note that I can only use FTP to get data from the remote site; SFTP is not allowed in my case. Thanks, Jim Wang

Replies are listed 'Best First'.
Re: ftp new files
by rjt (Curate) on Jul 15, 2013 at 05:36 UTC

    Have a look at mirror; I haven't used it for a while, but it always did what was advertised.

    Failing that, LWP (or LWP::Simple) makes it pretty easy to roll your own. If you have a relatively small/static list of files:

    use LWP::Simple;

    my $remote = 'ftp://ftp.example.com/path';
    my $local  = '/path/to/local/mirror';   # destination directory

    # All filenames relative to $remote
    my @files = qw(
        file1.tar.gz
        file2.tar.gz
        file3.zip
        subdir/another.tar.gz
        long/path/test.txt
    );

    mirror("$remote/$_", "$local/$_") for @files;

    If you need to generate listings of files recursively based on listings from the server, you will probably want to have a look at Net::FTP::Recursive's rls method.

    If you just need something that works with a minimum of fuss, and don't actually need a Perl solution, wget (available on Unix and Win32) will do the trick:

    wget --mirror ftp://ftp.example.com/path
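    Since the goal is a cron job every 5 minutes, the wget approach could be scheduled with a crontab entry along these lines (the paths and log file here are just examples, adjust to taste):

    ```shell
    # crontab -e: run the mirror every 5 minutes;
    # -P sets the local download directory, output goes to a log
    */5 * * * * wget --mirror --no-verbose -P /var/data/mirror ftp://ftp.example.com/path >> /var/log/ftpmirror.log 2>&1
    ```

    `--mirror` turns on recursion and timestamping, so unchanged files are not re-downloaded on subsequent runs.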
Re: ftp new files
by Utilitarian (Vicar) on Jul 15, 2013 at 13:10 UTC
    I find Net::FTP a very useful module. It has the mdtm method, which returns the modification time of a remote file. If you store the newest modification time seen on each run to disk, you can pick up where you left off on the next run.
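
    Once you have the mdtm timestamps, selecting the files changed since the last run is just a filter. A minimal sketch (the host name and file timestamps below are made up for illustration):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Given a hashref of { filename => mtime } (epoch seconds, as Net::FTP's
    # mdtm returns) and the high-water mark saved by the previous cron run,
    # return the files that are new or updated since then.
    sub changed_since {
        my ($mtimes, $last_run) = @_;
        return grep { $mtimes->{$_} > $last_run } sort keys %$mtimes;
    }

    # In the real script you would fill %mtimes from the server, e.g.:
    #   my $ftp = Net::FTP->new('ftp.example.com') or die $@;
    #   $ftp->login('user', 'pass') or die $ftp->message;
    #   $mtimes{$_} = $ftp->mdtm($_) for $ftp->ls;

    # Example with made-up timestamps, last run at epoch 1500:
    my %mtimes = (
        'old.txt'  => 1000,
        'new.txt'  => 2000,
        'newer.gz' => 3000,
    );
    my @changed = changed_since(\%mtimes, 1500);
    print "@changed\n";   # prints "new.txt newer.gz"
    ```

    After fetching the changed files, write the largest mtime you saw back to the state file so the next run starts from there.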

    print "Good ",qw(night morning afternoon evening)[(localtime)[2]/6]," fellow monks."