nysus has asked for the wisdom of the Perl Monks concerning the following question:
I have the following method:
sub get_file {
    my $self      = shift;
    my $file_path = shift;

    my $file = RemoteFile->new({ path => $file_path, ssh => $self->ssh });

    my $content = $file->read_and_delete;

    # attempt to destroy the object
    $file = '';

    return $content;
}
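One way to check whether the `$file = ''` line actually destroys the object is a weak-reference probe: keep a second, weakened reference and see if it goes undef when the strong reference is dropped. This is a minimal, self-contained sketch; the `Tracked` class is a hypothetical stand-in for RemoteFile, not the real thing:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

package Tracked;                # hypothetical stand-in for RemoteFile
sub new { bless {}, shift }

package main;
my $file  = Tracked->new;
my $probe = $file;              # second reference to the same object
weaken($probe);                 # weakened: does not keep the object alive

$file = '';                     # drop the only strong reference

# if the object was really freed, the weak probe is now undef
my $status = defined $probe ? "still alive" : "destroyed";
print "$status\n";
```

If the probe reports "still alive" in the real program, something else (the ssh attribute, a role, a closure, or a circular reference) is still holding a strong reference to the RemoteFile object.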
The RemoteFile object relies on Net::OpenSSH, via a wrapper role I've written called MyOpenSSH, to make the connection needed to download a file:
package RemoteFile;

use Carp;
use Moose;
use Modern::Perl;
use File::Basename;
use File::Slurper qw(read_text write_text);
use Params::Validate;

with 'MyOpenSSH', 'MyLogger2', 'DownloadDir';

use namespace::autoclean;
The get_file function is called repeatedly. However, around the 250th call the program crashes; my guess is that some limit on the number of SSH connections opened by the RemoteFile objects is being hit. Increasing MaxSessions in sshd's configuration didn't help. So I tried setting the $file scalar to an empty string to destroy the object, and hopefully its associated Net::OpenSSH object along with it, but that didn't work either. I'm not sure what else to try.
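For what it's worth, assigning `''` to the scalar *should* work: Perl is reference-counted, so dropping the last reference fires the destructor immediately. A minimal sketch demonstrating this, using a hypothetical `Tracked` class with a plain `DESTROY` (Moose classes use `DEMOLISH`, but the reference-counting behavior is the same):

```perl
use strict;
use warnings;

package Tracked;                 # hypothetical stand-in for RemoteFile
my $destroyed = 0;
sub new     { bless {}, shift }
sub DESTROY { $destroyed++ }     # runs when the last reference is dropped
sub count   { $destroyed }

package main;
my $file = Tracked->new;
$file = '';                      # last reference gone: DESTROY fires here
print Tracked::count(), "\n";    # prints 1
```

So if destruction isn't happening in the real code, the likely culprit is an extra reference being held elsewhere (or a circular reference between the object and its ssh handle), not the assignment itself. Net::OpenSSH also provides a `disconnect` method that can be called explicitly, e.g. from a Moose `DEMOLISH`, rather than relying on destruction order.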
$PM = "Perl Monk's";
$MCF = "Most Clueless Friar Abbot Bishop Pontiff Deacon Curate";
$nysus = $PM . ' ' . $MCF;