bowei_99 has asked for the wisdom of the Perl Monks concerning the following question:
I run my script, which errors out at line 54 in one of the module files:

```
[root@host6 dell-download]# ./get-files.pl > out.log 2>&1 &
[1] 10027
[root@host6 dell-download]# tail -f out.log
Net::FTP::Recursive=GLOB(0x1c109a8)>>> CDUP
Net::FTP::Recursive=GLOB(0x1c109a8)<<< 250 CDUP command successful.
Net::FTP::Recursive=GLOB(0x1c109a8)>>> CWD Red Hat Enterprise Linux 4.7
Net::FTP::Recursive=GLOB(0x1c109a8)<<< 250 CWD command successful.
Making dir: Red Hat Enterprise Linux 4.7
Calling rget in /Browse_For_Drivers/Servers, Storage & Networking/PowerEdge/PowerEdge R610/Network/HTML
Net::FTP::Recursive=GLOB(0x1c109a8)>>> PASV
Net::FTP::Recursive=GLOB(0x1c109a8)<<< 227 Entering Passive Mode (143,166,135,12,206,55)
Net::FTP::Recursive=GLOB(0x1c109a8)>>> LIST
Can't use an undefined value as a symbol reference at /usr/share/perl5/Net/FTP/dataconn.pm line 54.
^C
[1]+  Exit 255                ./get-files.pl > out.log 2>&1
```

I can ftp to the server manually and get a listing just fine:

```
ftp> cd "Browse_For_Drivers/Servers, Storage & Networking/PowerEdge/PowerEdge R610/Network/HTML"
250 CWD command successful.
ftp> ls
227 Entering Passive Mode (143,166,135,12,221,177)
125 Data connection already open; Transfer starting.
drwxrwxrwx   1 owner    group   0 Aug 29  2012 Linux
drwxrwxrwx   1 owner    group   0 Aug 29  2012 LINUX - OS
..<listings removed>..
drwxrwxrwx   1 owner    group   0 Aug 29  2012 Windows Server 2008 x86
226 Transfer complete.
```

I open /usr/share/perl5/Net/FTP/dataconn.pm, which shows (line numbers also shown):

```
48 sub _close {
49     my $data = shift;
50     my $ftp  = ${*$data}{'net_ftp_cmd'};
51
52     $data->SUPER::close();
53
54     delete ${*$ftp}{'net_ftp_dataconn'}
55         if exists ${*$ftp}{'net_ftp_dataconn'}
56         && $data == ${*$ftp}{'net_ftp_dataconn'};
57 }
```

This _close function is called here:

```
60 sub close {
61     my $data = shift;
62     my $ftp  = ${*$data}{'net_ftp_cmd'};
63
64     if (exists ${*$data}{'net_ftp_bytesread'} && !${*$data}{'net_ftp_eof'}) {
65         my $junk;
66         $data->read($junk, 1, 0);
67         return $data->abort unless ${*$data}{'net_ftp_eof'};
68     }
69
70     $data->_close;
```

I checked with our network admin, who said the inactivity timeout is one hour, i.e. if no data is sent in a one-hour period, the firewall will close the connection. I'd think there's probably something similar on the FTP server side as well. However, I can't see, from the debug output above, why a long period of inactivity would happen. As I mentioned, it seemed like it was about to do a directory listing, then got cut off.
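For context on that error message: "Can't use an undefined value as a symbol reference" at line 54 suggests `$ftp` (i.e. `${*$data}{'net_ftp_cmd'}`) is already undef by the time `_close` runs, and the die deep inside Net::FTP kills the whole script. One way to keep a long-running mirror script alive is to wrap the `rget` call in `eval` so the failure becomes something you can log and retry. This is only a sketch of the pattern, using a hypothetical `MockFTP` stand-in (not the real Net::FTP::Recursive) so it runs without a network:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for Net::FTP::Recursive, so the pattern can be
# demonstrated offline: its rget() dies the way dataconn.pm does here.
package MockFTP;
sub new  { return bless {}, shift }
sub rget { die "Can't use an undefined value as a symbol reference\n" }

package main;

# Wrap rget so a die inside Net::FTP becomes a logged failure instead
# of killing the whole script; the caller can then reconnect and retry.
sub safe_rget {
    my ($ftp, %opts) = @_;
    my $ok = eval { $ftp->rget(%opts); 1 };
    if (!$ok) {
        warn "rget failed: $@";
        return 0;
    }
    return 1;
}

my $status = safe_rget(MockFTP->new) ? "ok" : "failed";
print "$status\n";    # prints "failed" because MockFTP::rget dies
```

In the real script the retry branch would create a fresh Net::FTP::Recursive connection before calling rget again, since the old command connection is likely already gone.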
Below is the code in question. While I'm creating my own directory structure, I'm creating an FTP connection for each directory and changing directories each time, so I don't see why this should be causing the problem.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Carp;
use Data::Dumper;
use Net::hostent;
#use Net::Ping;
use Net::FTP::Recursive;
use Log::StdLog { level => 'info', file => $0 . "log" };
#use Log::StdLog { level => 'warn', file => $config{log}{file} . $0 . "log" };

my %params = (
    site      => "ftp.dell.com",
    basedir   => "Browse_For_Drivers/Servers, Storage & Networking/PowerEdge",
    modeldirs => [
        "PowerEdge R810",
        "PowerEdge R610",
        "PowerEdge R720",
        "PowerEdge R620",
        "PowerEdge M620",
        "PowerEdge M1000E",
    ],
);

print {*STDLOG} info => "Starting $0, creating dirs.";

for my $dir (@{ $params{modeldirs} }) {
    mkdir $dir;
    chdir $dir;
    FTPConnect(\%params, "$dir");
    chdir "..";
}

print {*STDLOG} info => "Finished $0.";

sub FTPConnect {
    my $ref_params = shift @_;
    my $dir        = shift @_;
    my $ftp = Net::FTP::Recursive->new($ref_params->{site}, Debug => 1, Timeout => 15);
    if ($ftp) {
        print {*STDLOG} debug => "OK: connected via FTP to " . $ref_params->{site};
        $ftp->login("anonymous", 'me@here.there');
        $ftp->binary;
        $ftp->cwd($ref_params->{basedir} . '/' . $dir);
        print {*STDLOG} info => "Starting download for dir $dir.\n";
        $ftp->rget(
            #FlattenTree => 1,
            MatchFiles => qr/\.txt$/,
        );
        print {*STDLOG} info => "Finished download for dir $dir.\n";
        $ftp->quit;
    }
    else {
        print {*STDLOG} warn => "ERROR: FTP for host $ref_params->{site}\n";
    }
    return;
}

sub GetFiles {
    return;
}
```

Any thoughts on why the connection is dying? It's happened a number of times already.
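Separately from the FTP question, one thing worth hardening in the loop above: `mkdir` and `chdir` are never checked, and the loop unwinds with a relative `chdir ".."`, so a single failed `chdir` silently shifts every later download into the wrong directory. A sketch of a safer loop, checking each step and returning to an absolute starting directory; `make_path`, `tempdir`, and the sample directory names are my additions here, and the `FTPConnect` call is elided:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Cwd        qw(getcwd);
use File::Path qw(make_path);
use File::Temp qw(tempdir);

# Run inside a throwaway directory so the sketch is self-contained.
my $sandbox = tempdir();
chdir $sandbox or die "cannot chdir to $sandbox: $!";
my $start = getcwd();

# Two sample names taken from the post's modeldirs list.
for my $dir ('PowerEdge R610', 'PowerEdge R620') {
    make_path($dir);                                 # no-op if it exists
    chdir $dir or die "cannot chdir to '$dir': $!";
    # ... FTPConnect(\%params, $dir) would run here ...
    chdir $start or die "cannot chdir back: $!";     # absolute, not ".."
}
print "cwd restored\n" if getcwd() eq $start;
```

Using an absolute `chdir $start` instead of `chdir ".."` means a failure inside one iteration can never leave later iterations stranded in the wrong place.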
-- Burvil
Replies are listed 'Best First'.

Re: Net::FTP::Recursive problems, prematurely closing connection
by roboticus (Chancellor) on Dec 16, 2013 at 22:22 UTC
by tangent (Parson) on Dec 16, 2013 at 22:54 UTC

Re: Net::FTP::Recursive problems, prematurely closing connection
by tangent (Parson) on Dec 17, 2013 at 00:41 UTC

Re: Net::FTP::Recursive problems, prematurely closing connection
by tangent (Parson) on Dec 16, 2013 at 23:03 UTC
by wazat (Monk) on Dec 16, 2013 at 23:44 UTC
by bowei_99 (Friar) on Dec 17, 2013 at 00:03 UTC