padawan_linuxero has asked for the wisdom of the Perl Monks concerning the following question:

Hello, dear monks.
I am about to go bonkers! My boss asked me about the new program that sends the file by FTP. The question was very simple, and a simple answer I did not have: what happens if the Internet goes down while a file is being sent? I said, "Well, you send it again!" That answer was not acceptable to my boss, so my question is this:
Is there a way for the program to keep trying to send the same file until the Internet comes back, so the transmission can continue and we don't need to resend it by hand?
(It is the same program that sends them; the only difference is that this program asks you for the file.) If someone can help me, I would really appreciate it. Thanks.

Replies are listed 'Best First'.
Re: How can I do an uninterrupted sending of a file
by samtregar (Abbot) on Apr 05, 2008 at 00:48 UTC
    What you're looking for is an FTP server and client combination that both support resuming transfers. For the client, the only one I'm readily familiar with is ncftp. It will auto-resume downloads and resume uploads with the -z flag. For the server, it appears that wuftpd and proftpd both support resuming, although you may have to do some configuration to get it working.

    Even that may not satisfy your boss, since you'll still have to re-initiate the transfer. There's really no helping that but you can code your application so it automatically attempts to resume for a given period of time before giving up.

    -sam
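    A hedged sketch of that auto-resume loop in Perl with Net::FTP: the client asks the server how much of the file it already has (SIZE), seeks past that, and appends the rest (APPE), retrying until the network comes back. The host, credentials, file name, and retry limits below are placeholders, and the server must actually support SIZE and APPE for this to work.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholders -- substitute your own host, credentials, and file.
my ($host, $user, $pass, $file) =
    ('ftp.example.com', 'someuser', 'secret', 'report.dat');

my $sent = 0;
for my $try (1 .. 20) {                       # give up after 20 attempts
    my $ftp = Net::FTP->new($host, Timeout => 30)
        or do { sleep 60; next };             # net down? wait and retry
    $ftp->login($user, $pass) or do { sleep 60; next };
    $ftp->binary;

    # How much does the server already have? (undef/0 if nothing yet)
    my $remote_size = $ftp->size($file) || 0;

    open my $fh, '<:raw', $file or die "open $file: $!";
    seek $fh, $remote_size, 0;                # skip what already arrived

    # APPE tacks the remaining bytes onto the partial remote copy.
    my $ok = $ftp->append($fh, $file);
    close $fh;
    if ($ok) { $ftp->quit; $sent = 1; last }
    sleep 60;
}
die "could not complete transfer\n" unless $sent;
```

    If the connection dies mid-APPE, the next pass re-reads SIZE and resumes from the new offset, so no bytes are sent twice.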

      CuteFTP (Windows & Mac) can also resume, and has a number of useful features such as directory synchronization. There's a mention of "Script and Macro Support" (which I haven't tried) in the Pro version (which I use).
      Another great free FTP client is FileZilla, which has an excellent drag-and-drop interface, supports resuming transfers, and runs on Windows, Mac, or Linux.
      Hi,
      OK, I realize that doing this over FTP is going to be difficult. How about using a socket, with a client program sending the files and a server program on the other end? I have done some work with IO::Socket::INET and IO::Socket. Using these, is there a way to make the transmission resume after the Internet goes down?
      Thanks
        Sure. If the designers of the FTP protocol can do it, why not you too? You could even read their specs to learn how they did it: RFC 959. I recommend you call your implementation "FTP-lite" for maximum irony.

        -sam
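        For a homemade socket version, one possible shape (the port, file names, and the one-line handshake protocol are all invented for illustration, not taken from any spec): the server's first reply is the byte count it already holds, and the client seeks past that before sending.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

# --- server side (run on the receiving machine) ---
sub run_server {
    my $listen = IO::Socket::INET->new(
        LocalPort => 9000, Listen => 5, Reuse => 1, Proto => 'tcp',
    ) or die "listen: $!";
    while (my $client = $listen->accept) {
        chomp(my $name = <$client>);        # handshake line 1: file name
        # NOTE: a real server must sanitise $name before opening it!
        my $have = -e $name ? -s $name : 0;
        print $client "$have\n";            # tell client where to resume
        open my $out, '>>:raw', $name or die "open $name: $!";
        my $buf;
        print {$out} $buf while read $client, $buf, 65536;
        close $out;
        close $client;
    }
}

# --- client side (run on the sending machine) ---
sub send_file {
    my ($host, $file) = @_;
    my $size = -s $file // die "no such file: $file";
    local $SIG{PIPE} = 'IGNORE';            # a dropped link must not kill us
    while (1) {
        my $sock = IO::Socket::INET->new(
            PeerAddr => $host, PeerPort => 9000, Proto => 'tcp',
        ) or do { sleep 30; next };         # net down: wait, reconnect
        print $sock "$file\n";
        chomp(my $have = <$sock>);          # server's current byte count
        if ($have >= $size) { close $sock; last }   # already complete
        open my $fh, '<:raw', $file or die "open $file: $!";
        seek $fh, $have, 0;                 # resume where server left off
        my $buf;
        print {$sock} $buf while read $fh, $buf, 65536;
        close $fh;
        close $sock;                        # reconnect to confirm size
    }
}
```

        If the link drops mid-send, the client loops, reconnects, asks for the new offset, and carries on. A checksum exchange would still be needed to catch corruption, as noted elsewhere in this thread.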

Re: How can I do an uninterrupted sending of a file
by starbolin (Hermit) on Apr 05, 2008 at 05:07 UTC

    The complete answer should take other considerations into account. What if the socket times out? What if the server goes down? What if, during any of the above, the file changes? What if the file arrives corrupted? What if Comcast sends you a counterfeit RST packet?

    RFC 959 only provides the API to be used for session management and the minimum handshaking for setting up a session. While it provides for a restart (REST) command, it leaves error recovery, session resynchronization, and broken-file reconstruction up to the application. These features may not be implemented the same way across different systems, which becomes a problem when using a custom script to pull files from a third-party FTP server.
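    On the "what if the file arrives corrupted?" point, one common application-level answer is to compare checksums after the transfer. A small helper with Digest::MD5 might look like this (the command-line usage is just an illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5;

# Return the MD5 hex digest of a file, read in binary mode.
sub md5_of_file {
    my ($path) = @_;
    open my $fh, '<:raw', $path or die "open $path: $!";
    my $md5 = Digest::MD5->new;
    $md5->addfile($fh);
    close $fh;
    return $md5->hexdigest;
}

# Compute the digest before sending; have the far side compute its
# own after receiving, and resend if the two strings differ.
my $digest = md5_of_file($ARGV[0] // $0);   # default: hash this script
print "digest: $digest\n";
```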


    s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s |-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,, $|=1,select$,,$,,$,,1e-1;print;redo}
Re: How can I do an uninterrupted sending of a file
by salva (Canon) on Apr 05, 2008 at 13:54 UTC