nysus has asked for the wisdom of the Perl Monks concerning the following question:

I have the following method:

sub get_file {
    my $self      = shift;
    my $file_path = shift;

    my $file = RemoteFile->new( { path => $file_path, ssh => $self->ssh } );
    my $content = $file->read_and_delete;

    # attempt to destroy the object
    $file = '';

    return $content;
}

The RemoteFile object relies on Net::OpenSSH, through a wrapper role I've written (MyOpenSSH), to make the connection needed to download a file:

package RemoteFile;

use Carp;
use Moose;
use Modern::Perl;
use File::Basename;
use File::Slurper qw(read_text write_text);
use Params::Validate;

with 'MyOpenSSH', 'MyLogger2', 'DownloadDir';

use namespace::autoclean;

The get_file function is called repeatedly. However, around the 250th call, the program crashes. I'm guessing some limit is being hit on the number of SSH connections needed by the RemoteFile objects; increasing MaxSessions on sshd didn't help. So I tried to resolve the problem by setting the $file scalar to an empty string, hoping to destroy the object and, with it, its associated Net::OpenSSH object, but that didn't work either. I'm not sure what else I can try to resolve this.
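For reference, the more idiomatic ways to drop the last reference are undef $file or simply letting the lexical go out of scope (assigning '' should also work, since it replaces the reference). A minimal sketch of the scoping approach, which assumes $file holds the only reference to the Net::OpenSSH connection, not a definitive fix:

```perl
sub get_file {
    my ( $self, $file_path ) = @_;

    my $content;
    {
        # Confine $file to an inner block so it is destroyed, along
        # with anything it holds the only reference to, as soon as
        # the block exits.
        my $file = RemoteFile->new( { path => $file_path, ssh => $self->ssh } );
        $content = $file->read_and_delete;
    }    # $file goes out of scope here

    return $content;
}
```

Note that if $self->ssh (or the wrapper role) also keeps a reference to the connection object, destroying $file alone will not close the connection.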

$PM = "Perl Monk's";
$MCF = "Most Clueless Friar Abbot Bishop Pontiff Deacon Curate";
$nysus = $PM . ' ' . $MCF;
Click here if you love Perl Monks

Replies are listed 'Best First'.
Re: Working around limit to number of connections via Net::OpenSSH
by Corion (Patriarch) on Mar 27, 2017 at 19:51 UTC

    250 sounds to me like the limit on the number of simultaneously open files for a process.

    Are you really, really sure that the problem is with the part of the code you're looking at? Is there maybe some other place where you open a filehandle and don't close it soon enough?

    Maybe you can write a small, self-contained program that reproduces the issue and doesn't need a remote ssh connection?
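    One quick way to check whether descriptors are leaking (on Linux, assuming a /proc filesystem; on a Mac you'd use lsof -p <pid> instead) is to watch the count grow as the program runs:

```shell
# Count the file descriptors currently open in a process.
# Replace "self" with the Perl program's PID to watch it from outside.
ls /proc/self/fd | wc -l
```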

      Thanks. The remote file is never opened by the RemoteFile object; it is read locally. Per your suggestion, and just to be safe, I commented out the part where the file gets downloaded and read, and just used the object to inspect the file's properties (perms, whether it exists, etc.), and it still failed. Also, the number of runs it takes before it fails depends on how many other Net::OpenSSH objects already exist before it fails. So I'm pretty certain it has something to do with some kind of connection limit.


      Thanks, Corion. It was a file limit after all. See below.


Re: Working around limit to number of connections via Net::OpenSSH
by FreeBeerReekingMonk (Deacon) on Mar 27, 2017 at 20:49 UTC
    This node suggests using undef too: 903706

    From the documentation: the OpenSSH multiplexing feature requires passing file handles through sockets

    So how about the number of open sockets just before it dies?

    find /proc/net/* -type f -exec wc -l {} \; |sort -n

    also try: ss -s and ss -a (or netstat, but I am sure you used that already)

    What do you get for the following: cat /proc/sys/fs/file-max and limit ?

    What about a forked process? Could you fork a child, let it work and die... and maybe that drops the hanging connection? (Of course, that adds complexity.)
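    A rough sketch of the fork idea; this assumes the RemoteFile work can run entirely in the child, so every descriptor the SSH connection opens dies with the child process:

```perl
use strict;
use warnings;

sub get_file_in_child {
    my ($file_path) = @_;

    pipe my $reader, my $writer or die "pipe: $!";
    my $pid = fork() // die "fork: $!";

    if ( $pid == 0 ) {    # child: do the SSH work, send the result back
        close $reader;
        # ... create the RemoteFile object and read the content here ...
        my $content = "...";    # placeholder for the real work
        print {$writer} $content;
        close $writer;
        exit 0;    # all of the child's file descriptors close here
    }

    close $writer;    # parent: collect the result and reap the child
    my $content = do { local $/; <$reader> };
    waitpid $pid, 0;
    return $content;
}
```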

      Woot! You guys are friggin' geniuses. Open file limit on Mac is 256. I bumped to 512 and all is well. Thanks!
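      For anyone else hitting this: it's the per-process soft limit that matters, and it can be raised for the current shell (up to the hard limit) before running the program:

```shell
ulimit -n        # show the current soft limit (256 by default on macOS)
ulimit -n 512    # raise it for this shell and its children
ulimit -n        # confirm the new value
```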


        Looking at the code excerpt you have posted, it seems unlikely that you need 256 files open simultaneously. Probably your code, one of the modules you are using, or the OpenSSH binary is leaking file descriptors in some way.

        Increasing the available file descriptors is just going to hide and delay the issue, not solve it.

        Net::OpenSSH has a debug mode you can enable by setting $Net::OpenSSH::debug = -1, and you can also pass the -vvv flag to the ssh client in order to find out where your program is failing.
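        Something like this, set before the connection is made; master_opts is the documented way to pass extra flags to the underlying ssh client (the host string here is just a placeholder):

```perl
use Net::OpenSSH;

$Net::OpenSSH::debug = -1;    # enable all of Net::OpenSSH's debug output

my $ssh = Net::OpenSSH->new(
    'user@host',
    master_opts => ['-vvv'],    # make the ssh client itself verbose too
);
$ssh->error and die "connection failed: " . $ssh->error;
```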

Re: Working around limit to number of connections via Net::OpenSSH
by Anonymous Monk on Mar 27, 2017 at 19:52 UTC
    "I'm not sure what else I can try to resolve this."

    You could hire a professional ...