Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks!

I am trying to fetch all files in a directory using Net::FTP and I am getting the error below. I can't understand why; can someone tell me what I am not seeing here?

Error:

Software error: Can't fetch -rwxr-xr-x 1 1024 1018 120127 Feb 27 14:46 ABC_123476_BRBRBR.xml: Bad file descriptor


#!/usr/bin/perl -w
use Net::FTP;
use CGI qw(-oldstyle_urls :standard);
use CGI::Carp qw(fatalsToBrowser);
use strict;

my $q     = new CGI;
my $host  = "myhost";
my $dir   = "/home/docs/test/";
my $login = "xyxyxy";
my $pw    = "uyuyuy";

my $ftp = Net::FTP->new($host) or die "Cannot connect to $host: $@";
$ftp->login($login, $pw) or die "Cannot login to $host as $login: $@";
$ftp->cwd($dir) or die "Cannot change working directory: $@";
my @files = $ftp->dir or die "Cannot change working directory: $@";
foreach my $x_files (@files) {
    $ftp->get($x_files, $x_files) or die "Can't fetch $x_files: $!\n";
    print "<font size=1 color=red>This is what I have now:::$x_files</font><br>\n";
}
$ftp->quit;

Thanks!

Replies are listed 'Best First'.
Re: NET::FTP - Can't fetch
by jrsimmon (Hermit) on Feb 27, 2008 at 20:12 UTC
    You're using $ftp->dir to get the list of files when you should be using $ftp->ls. dir returns a long-list (ls -l) style directory listing, whereas ls simply returns a list of file names.

    If you do want the long-list format for some reason you haven't stated here, you will need to trim each line to include just the filename before fetching it.
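    For example, here is one way to trim a long-list line down to just the filename. This is a sketch, assuming the server emits standard nine-field ls -l style lines (as in the error message above); the split limit keeps any spaces embedded in the filename intact:

    ```perl
    use strict;
    use warnings;

    # One line of $ftp->dir output, taken from the error message above:
    my $line = '-rwxr-xr-x 1 1024 1018 120127 Feb 27 14:46 ABC_123476_BRBRBR.xml';

    # Split into at most 9 whitespace-separated fields; the 9th field is
    # the filename, and the limit of 9 stops splitting there, so a name
    # containing spaces is not chopped up.
    my $name = (split /\s+/, $line, 9)[8];
    print "$name\n";    # prints ABC_123476_BRBRBR.xml
    ```

    Parsing ls -l output is fragile across servers, though, which is why ls is the better choice when all you need is the names.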

    Your code, updated:
    #!/usr/bin/perl -w
    use Net::FTP;
    use CGI qw(-oldstyle_urls :standard);
    use CGI::Carp qw(fatalsToBrowser);
    use strict;

    my $q     = new CGI;
    my $host  = "myhost";
    my $dir   = "/home/docs/test/";
    my $login = "xyxyxy";
    my $pw    = "uyuyuy";

    my $ftp = Net::FTP->new($host) or die "Cannot connect to $host: $@";
    $ftp->login($login, $pw) or die "Cannot login to $host as $login: $@";
    $ftp->cwd($dir) or die "Cannot change working directory: $@";
    my @files = $ftp->ls or die "Cannot list current directory: $@";
    foreach my $x_files (@files) {
        $ftp->get($x_files) or die "Can't fetch $x_files: $!\n";
    }
    $ftp->quit;
      That worked, great, but the downloaded files end up in the same directory where I am running this script, right? Is there a way to send the files to another location, in a directory above the one this script runs from?
        This question is answered in another recent thread you might have seen. Saving FTP File.
        Sure. You just want to add an argument to your get statement with the new directory:
        #!/usr/bin/perl -w
        use Net::FTP;
        use CGI qw(-oldstyle_urls :standard);
        use CGI::Carp qw(fatalsToBrowser);
        use strict;

        my $q       = new CGI;
        my $host    = "myhost";
        my $dir     = "/home/docs/test/";
        my $new_dir = "/home/docs/new_test";
        my $login   = "xyxyxy";
        my $pw      = "uyuyuy";

        my $ftp = Net::FTP->new($host) or die "Cannot connect to $host: $@";
        $ftp->login($login, $pw) or die "Cannot login to $host as $login: $@";
        $ftp->cwd($dir) or die "Cannot change working directory: $@";
        my @files = $ftp->ls or die "Cannot list current directory: $@";
        foreach my $x_files (@files) {
            $ftp->get($x_files, "$new_dir/$x_files") or die "Can't fetch $x_files: $!\n";
        }
        $ftp->quit;
Re: NET::FTP - Can't fetch
by roboticus (Chancellor) on Feb 27, 2008 at 20:02 UTC
    Remove the garbage from the file name and it ought to work. The problem is that the FTP server is giving you not a list of files, but a list of files with attributes, sizes, dates & times. That's making a bad filename for your machine.

    Update: To clarify, based on the error message, you want to work with the file "ABC_123476_BRBRBR.xml", but it thinks you're trying to open the file named "-rwxr-xr-x 1 1024 1018 120127 Feb 27 14:46 ABC_123476_BRBRBR.xml".

    ...roboticus