in reply to REST API with SFTP Pooling

Sharing SFTP connections between processes or threads is not a good idea: it is too complicated and too difficult to get right*.

The simplest approach would be to have five workers, each with a dedicated SFTP connection, listening for requests on the same socket/pipe/queue/whatever. When a new request arrives, the first worker able to catch it handles it.

In the part of the code that waits for new requests, a timeout can be set so that a dummy command is sent whenever nothing happens for a while.
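A rough sketch of that worker-pool idea, assuming a local TCP socket as the shared request channel; the host, user, port number, and 30-second timeout are all placeholders, and the request handling itself is elided:

```perl
use strict;
use warnings;
use IO::Select;
use IO::Socket::INET;
use Net::SFTP::Foreign;

my ($host, $user) = ('sftp.example.com', 'someuser');   # placeholders

# All workers share one listening socket; the kernel hands each
# incoming connection to whichever worker calls accept first.
my $listener = IO::Socket::INET->new(
    LocalPort => 5000, Listen => 10, ReuseAddr => 1,
) or die "can't listen: $!";

for (1 .. 5) {
    next if fork;    # parent keeps spawning workers

    # Each worker owns one dedicated SFTP connection.
    my $sftp = Net::SFTP::Foreign->new($host, user => $user);
    $sftp->error and die "can't connect: " . $sftp->error;

    my $sel = IO::Select->new($listener);
    while (1) {
        if ($sel->can_read(30)) {             # wait up to 30s for work
            my $client = $listener->accept or next;
            # ... read the request from $client, serve it via $sftp ...
            close $client;
        }
        else {
            $sftp->stat('.');                 # dummy command as keepalive
        }
    }
}

1 while wait() != -1;    # parent just waits on the workers
```

The `can_read` timeout is what drives the keepalive: when no request arrives within the interval, the worker issues a cheap `stat` so the server does not drop the idle session.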

*) well, unless the limit is not on SFTP sessions but on SSH connections. It is pretty easy to reuse SSH connections with something like Net::OpenSSH, and then run SFTP on top of it.

Re^2: REST API with SFTP Pooling
by Fletch (Bishop) on Nov 12, 2020 at 14:27 UTC

    My initial thoughts were similar. Were I to do something like this I'd use it as an excuse to play with Mojolicious and its worker queue Minion. I'd make a worker task which would manage the SFTP connection, and you could have your Mojo app implementing the REST API. The worker task would maybe enqueue a keepalive request to itself for whatever interval (and to be fancy if it gets real work before then maybe cancel the prior keepalive and requeue a new one).

    </slightly specific handwaving>

    The cake is a lie.

      Indeed, doing that with Minion seems pretty easy.

      In order to keep the SFTP connection alive, you can just set a timer (with Mojo::IOLoop::timer) to send the dummy commands.
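      For instance, a recurring timer along these lines; a sketch, assuming a 60-second interval, a placeholder host and user, and Net::SFTP::Foreign's `stat` as the dummy command (inside a Mojolicious app the IOLoop is already running, so only the timer is needed):

```perl
use strict;
use warnings;
use Mojo::IOLoop;
use Net::SFTP::Foreign;

my $sftp = Net::SFTP::Foreign->new('sftp.example.com',    # placeholder host
                                   user => 'someuser');   # placeholder user
$sftp->error and die "SFTP: " . $sftp->error;

# Every 60 seconds, issue a cheap request so the session stays alive.
Mojo::IOLoop->recurring(60 => sub { $sftp->stat('.') });

Mojo::IOLoop->start unless Mojo::IOLoop->is_running;
```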

Re^2: REST API with SFTP Pooling
by Sukhster (Novice) on Nov 14, 2020 at 05:44 UTC

    Hi Salva,

    Thanks for your response. SSH is not possible, only SFTP - and only one session per Account. Each Account has separate files and destinations, so you can't share one Account across ALL files.

    Therefore, I need to keep the session alive while servicing requests, i.e. have a shared object between requests.

      I didn't explain it clearly in my previous post.

      SSH is several things. Besides being a way to run a shell on a remote machine, it is also a transport protocol that can run several communication channels in parallel between the client and the server over a single SSH connection.

      When you use SFTP, first an SSH connection is established to the server, and then a channel running inside that connection is opened and attached to an SFTP server.

      In your particular setup, the remote SSH server is probably configured to accept only requests for SFTP channels.

      So the remote server can be limiting the number of incoming connections in two ways: (1) limiting the number of SFTP channels per user, or (2) limiting the number of SSH connections per user.

      If it happens to be (2), then you can open an SSH connection to the server and run several SFTP sessions in parallel over that single SSH connection.

      The interesting thing is that the OpenSSH ssh client has a mode of operation that makes it pretty easy to work that way: establishing a connection and then running sessions (including SFTP sessions) on top of it from other programs.

      The following script will tell you whether you can actually run several SFTP sessions in parallel:

      use strict;
      use warnings;
      use Net::OpenSSH;
      use Net::SFTP::Foreign;

      my $ssh = Net::OpenSSH->new($host, user => $user, password => $password, ...);
      $ssh->error and die "can't connect to remote host: " . $ssh->error;

      my $sftp1 = $ssh->sftp or die "can't open SFTP channel 1: " . $ssh->error;
      my $sftp2 = $ssh->sftp or die "can't open SFTP channel 2: " . $ssh->error;

      print "I am running two SFTP sessions in parallel\n";

        Hi Salva,

        I can't thank you enough for your help.

        Thanks for the further explanation, and code sample for testing

        I got "I am running two SFTP sessions in parallel" in the output from the code. So it looks like the server is allowing ONE SSH connection, but multiple SFTP sessions to be created.

        Therefore, how would I keep the SSH connection alive - what command could I run? Would keeping the SFTP sessions alive suffice, or should I periodically open a new SFTP connection? I am about to test the latter now.

        On a separate note, I have always used Net::SFTP::Foreign directly, and know how to set options like:

        use Net::SFTP::Foreign;

        my @sftp_opts = ();
        push @sftp_opts, "-o";
        push @sftp_opts, "KexDHMin=1024";
        push @sftp_opts, "-o";
        push @sftp_opts, "KexAlgorithms=diffie-hellman-group14-sha1";
        ...
        $sftp = Net::SFTP::Foreign->new(
            $config{'hostname'},
            user           => $config{'username'},
            port           => $config{'port'},
            stderr_discard => 1,
            autodie        => 0,
            key_path       => $config{'key'},
            more           => [ @sftp_opts ],
        );

        However, I cannot figure out how to set these for Net::OpenSSH. I tried the following to no avail:

        my @ssh_opts = ();
        push @ssh_opts, "-o";
        push @ssh_opts, "KexAlgorithms=diffie-hellman-group14-sha1";
        push @ssh_opts, "-o";
        push @ssh_opts, "KexDHMin=1024";

        my $ssh = Net::OpenSSH->new($config{'host'}, user => $config{'user'},
                                    port => $config{'port'}, key_path => $config{'key_path'},
                                    default_ssh_opts => [ @ssh_opts ]);
        # Returns: DH parameter offered by the server (1024 bits) is considered insecure.
        # You can lower the accepted minimum via the KexDHMin option.
        # DH_GEX group out of range: 2048 !< 1024 !< 8192

        my $ssh = Net::OpenSSH->new($config{'host'}, user => $config{'user'},
                                    port => $config{'port'}, key_path => $config{'key_path'},
                                    ssh_opts => [ @ssh_opts ]);
        # Returns: Invalid or bad combination of options ('ssh_opts')
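        A hedged sketch of another place to try those options: Net::OpenSSH's constructor accepts a master_opts parameter, which passes command-line options to the master ssh process - the one that actually performs the key exchange. Whether the KexDHMin name is understood by the local ssh binary is a separate question, so only KexAlgorithms is shown here; the host, user, and key path are placeholders:

```perl
use strict;
use warnings;
use Net::OpenSSH;

my %config = (host     => 'sftp.example.com',      # placeholders
              user     => 'someuser',
              port     => 22,
              key_path => '/path/to/key');

# master_opts applies to the master connection, where KEX happens;
# default_ssh_opts only affects the slave commands run over it.
my $ssh = Net::OpenSSH->new(
    $config{host},
    user        => $config{user},
    port        => $config{port},
    key_path    => $config{key_path},
    master_opts => [ -o => 'KexAlgorithms=diffie-hellman-group14-sha1' ],
);
$ssh->error and die "can't connect: " . $ssh->error;
```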