NoSignal has asked for the wisdom of the Perl Monks concerning the following question:

Dear monks, I have a two part program, which utilises LWP::UserAgent / HTTP::Request to download files from the 'net, and Net::FTP to upload those files. The source machine is my MacBook Pro (Leopard) and the destination machine is a Windows 2003 FTP server. The following is a looksee at the code I am attempting to use for my file transfer:
# Pattern match all .ipup files in local directory
my @files = glob("*.ipup");

# Declare $ftp
my $ftp;
$ftp = Net::FTP->new($ulservername, Debug => 1, Timeout => 120, Hash => \*STDERR)
    or die "Cannot connect to the FTP server called $ulservername.\n";
$ftp->login($username, $password)
    or die "Authentication on $ulservername failed\n";
$ftp->binary()
    or die "Could not set BINARY command on $ulservername\n";
$ftp->cwd($rdirectory)
    or die "Could not change working directory on $ulservername\n";

# Execute the FTP transfer
foreach my $file (@files) {
    print "Copying $file to FTP server...\n";
    $ftp->pasv()
        or die "Could not set PASV command on $ulservername\n";
    $ftp->put($file, $file)
        or die "Could not transfer $file to FTP server: $!\n";
}

# Send quit to FTP server
$ftp->quit()
    or die "Could not quit the FTP session. Que sera sera\n";
The thing that kills me is that, time after time, the $ftp->put() operation times out. I can use FTP from the shell and mput my data to the FTP server without any problems. What have I done wrong? Why does this simple code torment me so? I also checked my glob() by printing its output to the screen, and that looks okay. The foreach() also shows that it's uploading one file at a time. As an aside, the files range between 170MB and 180MB each. Your assistance would be greatly appreciated.

Update: Resolved. If eedjit over here had removed the second $file from the $ftp->put() call, the script would have worked ages ago.

Re: Net::FTP insanity
by roboticus (Chancellor) on Nov 22, 2007 at 10:55 UTC
    NoSignal:

    • Did the Debug output provide any clues?
    • Have you tried larger values for your Timeout parameter? (A quick sketch follows below.)
    • How long does it work properly before failing?
    • Does it fail on the same file each time?
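    For example, something along these lines (just a sketch; 600 is an arbitrary value, and code()/message() are the documented ways to see the server's last reply):

        use Net::FTP;

        my $ftp = Net::FTP->new($ulservername, Debug => 1, Timeout => 600)
            or die "Cannot connect to $ulservername: $@\n";
        # After any failed command, ask the server what it actually said:
        $ftp->put($file, $file)
            or die "put failed: ", $ftp->code, " ", $ftp->message;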
    ...roboticus
Re: Net::FTP insanity
by syphilis (Archbishop) on Nov 22, 2007 at 11:15 UTC
    Hi NoSignal,

    In addition to the questions raised by roboticus, do you need the $ftp->pasv() ? (What happens if you comment that line out ?)

    Cheers,
    Rob
Re: Net::FTP insanity
by NoSignal (Acolyte) on Nov 22, 2007 at 11:31 UTC
    Hi,

    Thanks for the responses.

    1. Changing the timeout just makes the call wait a longer or shorter time before it borks.
    2. Commenting PASV out does nothing.
    3. The debug just prints out the FTP conversation, and fails with my die() message.
    4. The script fails at put() each and every time.


    When I check the FTP server, I see that it creates the file for me, but it just sits at 0 KB until the session terminates. I've found some nice Net::FTP code hereabouts (with better error handling than mine), and will also try a solution I actually understand: watching packets on the wire to see what's getting cut on the network.

    I suspected a possible permission error, but read/write is set on the virtual folder, and a bash FTP works fine. That said, it may be time for a system() call to see if I get a similar error.
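    Something along these lines is what I have in mind for the cross-check (hypothetical sketch; curl ships with Leopard and -T does an FTP upload):

        # Push one file with curl instead of Net::FTP, as a sanity check
        my $url = "ftp://$username:$password\@$ulservername/$rdirectory/";
        my $rc  = system('curl', '-T', $files[0], $url);
        die "curl exited with ", $rc >> 8, "\n" if $rc != 0;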
      Just on the "Passive" aspect again (which may be a red herring) - what happens if you set Passive => 1 in the new() constructor ?

      Here's what perldoc Net::FTP has to say about that:
      Passive - If set to a non-zero value then all data transfers will be done using passive mode. This is not usually required except for some *dumb* servers, and some firewall configurations. This can also be set by the environment variable "FTP_PASSIVE".
      Update: You might want to try that (both with and) without the $ftp->pasv();
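      That is, something like this (sketch only; everything after new() stays as it is):

          my $ftp = Net::FTP->new($ulservername,
                                  Debug   => 1,
                                  Timeout => 120,
                                  Passive => 1,
                                  Hash    => \*STDERR)
              or die "Cannot connect to $ulservername: $@\n";
          # ...login/binary/cwd and the transfer loop unchanged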

      Cheers,
      Rob
        Hi Rob,

        I'll give that a go. I've changed the timeout on my app to 3600s, and have Wireshark streaming its way from the web. I want to see if the timeout change verifies my memory theory, or if I'm just full of nonsense. Waiting an hour for a verdict is a *bit* annoying :-)
      ...just a thought, but does Net::FTP attempt to copy to memory and then write to disk from RAM? Is that any different to the way an FTP command from the command line would do things? I'm thinking that it could be sending the file to memory on my laptop before uploading the file to the server, leading to the timeout, because the FTP server isn't seeing any activity after the file name has been put on disk.
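      For reference, my mental model of put() is roughly the following (a sketch built from the documented stor()/dataconn API, not the module's actual code; 10240 is the documented default BlockSize):

          # Stream the local file to the server in blocks; nothing here
          # should slurp the whole 180MB file into RAM.
          sub put_by_hand {
              my ($ftp, $local) = @_;
              open my $fh, '<', $local or die "Cannot open $local: $!\n";
              binmode $fh;
              my $data = $ftp->stor($local)       # Net::FTP::dataconn object
                  or die "STOR failed: ", $ftp->message;
              while (my $len = read $fh, my $buf, 10240) {
                  $data->write($buf, $len) or die "Data write failed\n";
              }
              $data->close or die "Could not close data connection\n";
              close $fh;
          }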
Re: Net::FTP insanity
by okram (Monk) on Nov 22, 2007 at 12:39 UTC
    From Net::FTP:
    put_unique ( LOCAL_FILE , REMOTE_FILE )
    Same as put but uses the STOU command.
    Have you tried using this one instead? Maybe the STOU command will serve you better than the plain STOR that put() uses?
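    For instance (a minimal sketch; per the docs, unique_name() returns the name the server actually chose):

        $ftp->put_unique($file)
            or die "STOU of $file failed: ", $ftp->message;
        print "Stored as ", $ftp->unique_name, "\n";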

      I am truly the world's dumbest human being.

      I had
      $ftp->put($file, $file) or die "Could not transfer $file to FTP server: $!\n";
      thinking that I needed to specify the remote file name. When I tried the put_unique() method, it failed with an "invalid number of parameters" error. My code now reads:
      $ftp->put($file) or die "Could not transfer $file to FTP server: $@\n";
      and it works.
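      One aside while I'm here (my reading of the docs, so treat it as an assumption): $! and $@ don't get set by Net::FTP command failures; the server's last reply lives in $ftp->message, so the die is probably better written as:

          $ftp->put($file)
              or die "Could not transfer $file: ", $ftp->message;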

      Maybe it is best that I don't breed...

      Update: The Hash => \*STDERR option is really annoying... :-)

      Thanks for all the help. I'll remember to RTFM and pay better attention to the problem next time.
        This thread was truly a joy to read. NoSignal, you RTFM'd all you could, tried a variety of solutions, and your writing carried a tone of respect. Don't beat yourself up over the "simplicity" of the problem; it happens.
        ++ You've got the right attitude to get on here!
        One thing, though: perhaps a change of name? I think "NoSignal" may give others the wrong impression of your ability :)