in reply to Re: Perl and SFTP
in thread Perl and SFTP
This would not work for SFTP which needs an API for all the operations.
That bites the big llama. The SFTP batch mode (using the -b command-line switch) will silently exit if an error occurs on any 'get' command in the batch script. If I don't use batch mode, then I have to do some bidirectional IPC (I must be able to read as well, or my code won't know *what* the heck is going on). I'm sure that's the source of some of the problems I'm having with opening bidirectional communication with the sftp process. I've read chapter 6 of the camel book, and I'm trying to streamline the IPC (or rather, make it actually function correctly). It sucks. I can't even rely on process return values to know about premature exits. I'd have to put the whole thing in a big verification loop: build a local and a destination file list, compare them, smartly modify the download list, lather, rinse, repeat.
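The usual Perl idiom for that kind of bidirectional IPC is IPC::Open2 (or IPC::Open3 if you also want stderr). Here's a minimal sketch of the pattern; it uses plain `cat` as a stand-in for the sftp child so it runs without a server, but the shape is the same when you swap in `sftp user@host`:

```perl
use strict;
use warnings;
use IPC::Open2;

# Spawn a child with both its stdin and stdout wired to us.
# 'cat' just echoes back whatever we send, standing in for sftp.
my $pid = open2( my $from_child, my $to_child, 'cat' );

# Send a command down the pipe, as you would send 'get remote.file'.
print $to_child "get remote.file\n";
close $to_child;    # signal EOF so the child can finish

# Read the child's response; with real sftp you would parse its prompt
# and output here to learn what actually happened.
my $reply = <$from_child>;
close $from_child;

# Reap the child and check its exit status explicitly,
# since (as noted above) you can't trust it to fail loudly.
waitpid( $pid, 0 );
my $status = $? >> 8;
```

Note that with a real sftp child you have to read its output as you go (sftp writes prompts and progress interleaved), or the pipe buffers fill and both sides deadlock; that's exactly the streamlining headache described above.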
I figured it would be similar to the ssh wrapper around a system binary. Even a Net::SFTP that used Net::SSH instead of Net::SSH::Perl would be just as good. I suppose I'm relegated to reading the code (it looks scary) to convince myself that I am, in fact, screwed.
The first thing to check is whether you are using a C library for the encryption.
OK, I'll check it out; I was suspicious of something like that. However, I'm not sure how I can even verify that. These are source builds of the Perl modules, and I'm not sure that I can manhandle Net::SFTP into using Crypt library functions that weren't passed to it from Net::SSH::Perl.
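One cheap way to check is just to probe which backends are installed, since Net::SSH::Perl picks up whatever XS-backed math and cipher modules it finds at runtime. The module names below are the ones commonly mentioned alongside Net::SSH::Perl (adjust to whatever your build actually uses); this only reports availability, it doesn't prove which one got loaded:

```perl
use strict;
use warnings;

# Probe for XS-backed (C) modules that Net::SSH::Perl can use.
# Module list is illustrative, not exhaustive.
my %avail;
for my $mod (qw(Math::GMP Math::Pari Crypt::DES Crypt::Blowfish)) {
    $avail{$mod} = eval "require $mod; 1" ? 1 : 0;
    printf "%-16s %s\n", $mod, $avail{$mod} ? 'available' : 'NOT installed';
}
```

If the big-number math is falling back to pure Perl, the key exchange and per-packet crypto will be painfully slow, which would fit the symptoms.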
Compression is already off, as the files themselves are already compressed, and I didn't think I'd get much benefit from compression of packet headers, etc. If I could get even a 50% increase in performance, then that would guarantee that the program finishes execution during the available time, although it still seems morally wrong to allow it 8 hours to download 4 gigs over a 100BaseT connection.
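For a sense of how wrong that is, the arithmetic is worth doing once (assuming "4 gigs" means 4 GiB and the full 8 hours):

```perl
use strict;
use warnings;

my $bytes = 4 * 1024**3;    # 4 GiB transferred
my $secs  = 8 * 3600;       # over 8 hours

# Effective throughput in megabits per second.
my $mbps = $bytes * 8 / $secs / 1e6;
printf "effective throughput: %.2f Mbit/s on a 100 Mbit/s link\n", $mbps;
```

That works out to roughly 1.2 Mbit/s, about 1% of the nominal link speed, so the bottleneck is almost certainly the per-packet crypto overhead in pure Perl rather than the network.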
Passing the files to an external sftp process may be faster, though. To keep the complexity down, consider doing it in two steps. First, use Net::SFTP to build the file lists. Then use sftp to transfer the files.
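A sketch of that two-step split, with the Net::SFTP listing and the actual sftp invocation left as comments since they need a live server (the file names and `$user`/`$host` below are placeholders):

```perl
use strict;
use warnings;

# Step 1 (hypothetical): get the remote file list via the module, e.g.
#   my $sftp   = Net::SFTP->new($host, user => $user);
#   my @files  = map { $_->{filename} } $sftp->ls($remote_dir);
# Hard-coded here so the sketch runs standalone:
my @files = ( 'a.dat', 'b.dat' );

# Step 2: write a batch script for the external sftp binary.
open my $fh, '>', 'batch.txt' or die "can't write batch.txt: $!";
print {$fh} "get $_\n" for @files;
print {$fh} "quit\n";
close $fh or die $!;

# Then hand the transfer to the (much faster) system binary:
#   system('sftp', '-b', 'batch.txt', "$user\@$host") == 0
#       or die "sftp batch failed: $?";
```

The batch mode's silent-exit-on-error behavior mentioned above still applies, so the module-side file list doubles as the checklist for verifying what actually arrived.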
Yeah, that's what I was talking about earlier. I'm building the file hashes with the perl module, but still passing the commands to the system ruins my debugging/verification. I guess I'm up for a rewrite, but it's a short program anyway. Soon to be a longer program.
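The verification loop mentioned above reduces to a set difference on name-plus-size. A sketch, with the remote sizes hard-coded where a real run would pull them from Net::SFTP's attributes and `-s` on the local files:

```perl
use strict;
use warnings;

# Hypothetical remote listing: filename => size in bytes
# (in reality from the module's ls()/stat attributes).
my %remote = ( 'a.dat' => 100, 'b.dat' => 200 );

# Local state: what we already have, and how big it is
# (in reality: $local{$f} = -s $f for each downloaded file).
my %local = ( 'a.dat' => 100 );

# Anything missing or size-mismatched goes back on the download list.
my @todo = sort grep {
    !exists $local{$_} or $local{$_} != $remote{$_}
} keys %remote;
# lather, rinse, repeat until @todo is empty (with a retry cap)
```

Re-running the loop until `@todo` is empty, with a bounded retry count, is what lets you stop trusting the sftp exit status entirely.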
... sigh ...
Thanks for the help, iburrell!