Without a way to reproduce the timeouts you're seeing, it's hard to give advice that targets your exact problem, but I can offer some general suggestions.
* When you create your ftp object, make sure you set Debug => 1. There is also a Timeout option (in seconds) you can set.
my $ftp = Net::FTP->new("foo.bar.com", Debug => 1, Timeout => 120);
* A lot of people here don't like the CGI101 pages but their Net::FTP page is pretty good.
* When I tested timeout code against hosts that don't run an FTP server, the error I would get is: Can't call method "login" on an undefined value at ./ftp_timeout_test.pl line XX. And of course the code wouldn't continue, because Net::FTP->new returned undef. One possible workaround would be something like:
* Set a timeout of, say, 300 seconds when you instantiate the Net::FTP object, but set an alarm for 295 seconds just before that. Trap the alarm signal to give yourself a timeout. If the connection proceeds smoothly, unset the alarm.
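A minimal sketch of the alarm approach above. The `with_timeout` helper name is my own invention, not part of Net::FTP; in real use you would pass the Net::FTP calls in as the code ref.

```perl
use strict;
use warnings;

# Run a code ref with a wall-clock timeout.
# Returns (result, timed_out_flag). Helper name is mine, not Net::FTP API.
sub with_timeout {
    my ($seconds, $code) = @_;
    my ($result, $timed_out) = (undef, 0);
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };  # "\n" keeps the message exact
        alarm($seconds);
        $result = $code->();
        alarm(0);                                  # finished in time: cancel
    };
    alarm(0);                                      # also cancel on the error path
    if ($@) {
        die $@ unless $@ eq "alarm\n";             # rethrow unrelated errors
        $timed_out = 1;
    }
    return ($result, $timed_out);
}

# Example: a 1-second budget around a 3-second operation times out.
my ($res, $late) = with_timeout(1, sub { sleep 3; "done" });
print $late ? "timed out\n" : "got: $res\n";
```

For the FTP case you would wrap, for example, `sub { $ftp->dir($path) }` and pick an alarm a few seconds shorter than the Net::FTP Timeout, as described above.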
my $list = $ftp->dir();
unless ($list) { ...dir timed out or other error... }
for (@$list) {
# process each directory entry
}
Update: The timeout messages are emitted using carp, which is equivalent to warn; that is why you see them but cannot catch them with an eval block. Just search for 'Timeout' in Net/Cmd.pm.
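Because the messages go through the warn mechanism rather than die, an eval block won't see them, but a local __WARN__ handler will. A small sketch; the carp call here is just a stand-in for the one inside Net/Cmd.pm:

```perl
use strict;
use warnings;
use Carp;

my @warnings;
{
    # Capture anything emitted through the warn mechanism, carp included.
    local $SIG{__WARN__} = sub { push @warnings, $_[0] };
    carp "Net::FTP: Timeout";   # stand-in for the timeout carp in Net/Cmd.pm
}
print "caught: $warnings[0]";
```

You could localize the handler just around the `$ftp->dir()` call and inspect the captured text for 'Timeout' to tell timeouts apart from other warnings.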
sub get_file_list
{
    my %file_list;
    my $ftp = Net::FTP->new($hostname, Passive => $ftp_mode, Debug => 1, Timeout => 300);
    $ftp->login($username, $password) or return undef;
    my @remote_in_dir_list;
    foreach my $remote_dir (@remote_dirs)
    {
        # keep only the last field (the name) of entries that are directories
        @remote_in_dir_list =
            map { (split / /, $_)[-1] } grep { /^d/ } $ftp->dir($remote_dir);
        foreach my $remote_in_dir (@remote_in_dir_list)
        {
            $file_list{$remote_dir}{$remote_in_dir} =
                $ftp->dir("$remote_dir/$remote_in_dir");
            # keep only plain files ending in .zip, reduced to their names
            @{ $file_list{$remote_dir}{$remote_in_dir} } =
                map  { (split / /, $_)[-1] }
                grep { !/^d/ && /\.zip$/ }
                @{ $file_list{$remote_dir}{$remote_in_dir} };
        }
    }
    $ftp->quit();
    return \%file_list;
}
The general problem is not the $ftp->dir timeout but the download timeouts I get. I must handle them in a specific way and avoid mixing them up with other errors.
Regarding the alarm signal, it might be a good idea; could you please give me a simple example of how to implement it?
Looked at this all night, and my conclusion is that FTP is bulloxed, at least on my server. I first verified your code. I took Zippy, a Sharp SL-5500 PDA running Linux, and plugged it into a USB port. Zippy has a pftp server running on port 4242. I ran your code as-is and verified that it successfully pulled a directory list. Success.
I then opened four telnet sessions into the PDA and ran four copies of a simple screen scroller to keep the I/O busy. Reducing the FTP timeout value to one second, I was able to generate multiple timeouts while trying to fetch a simple directory structure.
From there things went from bad to worse. The problem, I concluded, is that once there is an I/O error, the client and the server become out of sync. On receiving an error, Net::FTP exits returning undef, and your script goes on its merry way, but the server is still busy trying to process the original GET request. If I issue a new GET request to the same directory, I get an error for the new socket and I see the old socket still trying to complete its request. I tried to figure out a way to reset the server, but my server doesn't handle RETR requests and ABOR didn't seem to behave as the manual indicated.
Here is what I was trying to do:
foreach my $remote_in_dir (@remote_in_dir_list)
{
    my $arrayref;
    # ftp returning an undef in scalar context is so not useful.
    do {
        $arrayref = $ftp->dir("$remote_dir/$remote_in_dir");
        # This kills more than the current operation.
        # Could be a local problem, need to try a different server.
        $ftp->_ABOR unless $arrayref;
    } until $arrayref;
    $file_list{$remote_dir}{$remote_in_dir} = $arrayref;
    .
    .
    .
}
Note that I assign the result of $ftp->dir() to a scalar. Assigning it to a hash is trouble. As in the original code, the assignment $hash{}=$ftp->dir() behaves poorly when faced with an I/O error. On error, $ftp->dir() returns undef. When assigned to a hash, this deletes the hash key, and any subsequent attempt to test against that key fails, as the key does not exist in the symbol table. You get an error reading "must supply package name for $hash". Much better to assign to a scalar $arrayref; then you can do a conditional on $arrayref, like do{}while($arrayref); or print if $arrayref;
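The pattern described above can be sketched like this; `fake_dir` is a hypothetical stand-in for a failing `$ftp->dir()` call, so the sketch runs without a server:

```perl
use strict;
use warnings;

# Stand-in for $ftp->dir() failing: returns undef in scalar context.
sub fake_dir { return }

my %file_list;
my $arrayref = fake_dir();            # assign to a scalar first...
if ($arrayref) {
    $file_list{somedir} = $arrayref;  # ...and only store a real listing
}
print exists $file_list{somedir} ? "stored\n" : "skipped\n";
```

Because the undef is caught in the scalar before it ever touches the hash, the hash only ever holds real listings, and existence tests on its keys stay meaningful.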
I'm really quite flummoxed at this point. I need to repeat this with a different FTP server, but I have already spent too much time on it, and building another server isn't what I wanted to do today. I hope some other monk here can add more in the way of illuminating this discussion.
No, I don't think alarms are a solution, as I think you will still have the problem of the server being out of sync with the client. Better to fix the code as above to correctly detect the $ftp->dir() failure, and then figure out a way to reset the session at the server.
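One way to "reset the session" without relying on ABOR is simply to throw the desynchronised connection away and reconnect. A hedged sketch: `dir_with_reconnect` and the `$connect` closure are my own names, not part of the Net::FTP API, and the closure form also lets the retry logic be exercised without a live server.

```perl
use strict;
use warnings;

# Retry a directory listing, reconnecting from scratch on each attempt.
# $connect is a closure returning a fresh session object, e.g.
# sub { Net::FTP->new($host, Timeout => 60) }  -- hypothetical usage.
sub dir_with_reconnect {
    my ($connect, $path, $max_tries) = @_;
    for my $try (1 .. $max_tries) {
        my $ftp = $connect->() or next;  # connecting itself may fail
        my $list = $ftp->dir($path);     # scalar context: array ref or undef
        $ftp->quit;                      # always discard the session
        return $list if $list;
    }
    return;                              # undef: every attempt failed
}
```

Quitting unconditionally keeps the sketch simple; in real use you would probably hold a successful session open for subsequent calls rather than reconnecting per listing.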
s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s
|-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,,
$|=1,select$,,$,,$,,1e-1;print;redo}
It would be easier to help if you posted an entire script, or a reduced test case that illustrates the problem. Barring that, I would think that a timeout would occur when you construct the ftp connection, not when you issue dir(). If you invoke the connection as described in Net::FTP you should get an error message when the connection fails. You can also include Debug=>1 to have more information printed to stderr.
my $ftp = Net::FTP->new($hostname,Debug=>1)
or die "Cannot connect to $hostname: $@";
Update: Although I might think that the timeout would occur at construction, I am apparently wrong. Sorry. With the updated information posted below and your code, it appears the problem is different from what I had initially thought.
The timeouts occur when $ftp->dir() is processed.
I see it in the debug output.